A recent Linux kernel update that works around CVE-2017-1000366 (the "Stack Clash" vulnerability) is causing Apache Hadoop's secure DataNode (and NFS manager) to crash on startup. (See related discussion from Red Hat and Ubuntu.)
If your systems are running a variant of Apache Hadoop 3.x, you can take advantage of user functions to work around the Java Invocation API issue that causes jsvc to crash.
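For readers unfamiliar with the user-functions mechanism: the Hadoop launcher scripts source ${HADOOP_CONF_DIR}/hadoop-user-functions.sh *after* the stock hadoop-functions.sh, so any function you redefine there silently replaces the shipped version. Here is a minimal, self-contained sketch of that override behavior; the one-line stub stands in for the real hadoop_start_secure_daemon, and the temp file stands in for your hadoop-user-functions.sh:

```shell
#!/usr/bin/env bash
# Sketch of Hadoop's user-function override mechanism.
# The function body here is a stand-in stub; the real implementation
# lives in ${HADOOP_HOME}/libexec/hadoop-functions.sh.

# 1. The stock function, as shipped in hadoop-functions.sh:
function hadoop_start_secure_daemon {
  echo "stock implementation"
}

# 2. Simulate ${HADOOP_CONF_DIR}/hadoop-user-functions.sh, which
#    redefines the function with a site-local replacement:
userfuncs=$(mktemp)
cat > "${userfuncs}" <<'EOF'
function hadoop_start_secure_daemon {
  echo "user override"
}
EOF

# 3. The launcher scripts source the user file *after* the stock
#    definitions, so the later definition wins:
. "${userfuncs}"
rm -f "${userfuncs}"

hadoop_start_secure_daemon   # prints "user override"
```

Because the override is sourced last, you never have to patch the shipped hadoop-functions.sh, and upgrades won't clobber your fix.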
Step 1: If it doesn’t already exist, create ${HADOOP_CONF_DIR}/hadoop-user-functions.sh.
Step 2: Inside this file, add this function:
function hadoop_start_secure_daemon
{
  # this is used to launch a secure daemon in the *foreground*
  #
  local daemonname=$1
  local class=$2

  # pid file to create for our daemon
  local daemonpidfile=$3

  # where to send stdout. jsvc has bad habits so this *may* be &1
  # which means you send it to stdout!
  local daemonoutfile=$4

  # where to send stderr. same thing, except &2 = stderr
  local daemonerrfile=$5
  local privpidfile=$6
  shift 6

  hadoop_rotate_log "${daemonoutfile}"
  hadoop_rotate_log "${daemonerrfile}"

  # shellcheck disable=SC2153
  jsvc="${JSVC_HOME}/jsvc"
  if [[ ! -f "${jsvc}" ]]; then
    hadoop_error "JSVC_HOME is not set or set incorrectly. jsvc is required to run secure"
    hadoop_error "or privileged daemons. Please download and install jsvc from "
    hadoop_error "http://archive.apache.org/dist/commons/daemon/binaries/ "
    hadoop_error "and set JSVC_HOME to the directory containing the jsvc binary."
    exit 1
  fi

  # note that shellcheck will throw a
  # bogus for-our-use-case 2086 here.
  # it doesn't properly support multi-line situations

  hadoop_debug "Final CLASSPATH: ${CLASSPATH}"
  hadoop_debug "Final HADOOP_OPTS: ${HADOOP_OPTS}"
  hadoop_debug "Final JSVC_HOME: ${JSVC_HOME}"
  hadoop_debug "jsvc: ${jsvc}"
  hadoop_debug "Class name: ${class}"
  hadoop_debug "Command line options: $*"

  #shellcheck disable=SC2086
  echo $$ > "${privpidfile}" 2>/dev/null
  if [[ $? -gt 0 ]]; then
    hadoop_error "ERROR: Cannot write ${daemonname} pid ${privpidfile}."
  fi

  # shellcheck disable=SC2086
  exec "${jsvc}" \
    "-Dproc_${daemonname}" \
    -outfile "${daemonoutfile}" \
    -errfile "${daemonerrfile}" \
    -pidfile "${daemonpidfile}" \
    -Xss1280k \
    -nodetach \
    -user "${HADOOP_SECURE_USER}" \
    -cp "${CLASSPATH}" \
    ${HADOOP_OPTS} \
    "${class}" "$@"
}
This code is the same as the version that appears in ${HADOOP_HOME}/libexec/hadoop-functions.sh in 3.0.0-alpha3, with one small addition: a -Xss1280k flag that grows the thread stack at launch, so jsvc no longer trips over the larger stack guard gap introduced by the kernel fix.
Step 3: Save the file and start your secure DataNode (or NFS Manager) daemon.
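After starting the daemon, it is worth confirming that the privileged pid file was written and that the recorded process is actually alive. The helper below is a hypothetical convenience (check_secure_daemon is not part of Hadoop, and the example pid-file path is an assumption; adjust it to your HADOOP_PID_DIR layout):

```shell
#!/usr/bin/env bash
# Hypothetical helper (not part of Hadoop): verify that a daemon's
# pid file exists and that the recorded process is still running.
check_secure_daemon() {
  local pidfile=$1
  local pid

  if [[ ! -f "${pidfile}" ]]; then
    echo "ERROR: pid file ${pidfile} not found" >&2
    return 1
  fi

  pid=$(<"${pidfile}")

  # kill -0 sends no signal; it only tests that the pid exists
  # and that we are permitted to signal it.
  if kill -0 "${pid}" 2>/dev/null; then
    echo "daemon (pid ${pid}) is running"
    return 0
  fi

  echo "ERROR: pid ${pid} from ${pidfile} is not running" >&2
  return 1
}

# Example (path is an assumption; adjust to your settings):
# check_secure_daemon "${HADOOP_PID_DIR}/hadoop-secure-dn.pid"
```

If the check fails right after startup, look at the daemonerrfile passed to jsvc; with the bad redirects of older builds, jsvc errors are easy to lose.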