
oozie shell workflow failing classpath MR


New Contributor

We have three Docker containers: hadoop-master (HDFS + YARN + MR), hadoop-worker (the same), and oozie (with the Hadoop clients), all running HDP 2.6.1.0-129. An MR job runs perfectly in the hadoop-worker container, but when we try to run an Oozie shell job that just echoes a line to the output, it always fails with the same answer, even though we have checked yarn-site.xml, mapred-site.xml, and the Oozie configuration:

Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster

We have the Hadoop environment variables set in all the Docker containers:

ENV HADOOP_CONF_DIR /etc/hadoop/conf

ENV HADOOP_COMMON_HOME /usr/hdp/2.6.1.0-129/hadoop

ENV HADOOP_HDFS_HOME /usr/hdp/2.6.1.0-129/hadoop-hdfs

ENV HADOOP_YARN_HOME /usr/hdp/2.6.1.0-129/hadoop-yarn

ENV HADOOP_MAPRED_HOME /usr/hdp/2.6.1.0-129/hadoop-mapreduce
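A quick sanity check (a sketch, assuming the HDP layout implied by the ENV lines above) is to confirm inside each container that these homes exist on disk and that the jar which normally carries MRAppMaster, hadoop-mapreduce-client-app-*.jar, is actually there:

```shell
# Assumed path, copied from the ENV lines above; verify on a real container.
HADOOP_MAPRED_HOME=/usr/hdp/2.6.1.0-129/hadoop-mapreduce

# On a live container you would run:
#   ls "$HADOOP_MAPRED_HOME"/hadoop-mapreduce-client-app-*.jar
#   hadoop classpath | tr ':' '\n' | grep hadoop-mapreduce
# Here we only print the glob the check uses:
echo "$HADOOP_MAPRED_HOME/hadoop-mapreduce-client-app-*.jar"
```

If the `ls` finds nothing, no classpath setting can fix the error, because the class is simply not installed where the variables point.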

We also have the following classpath configuration in the Hadoop settings files:

mapred-site.xml:

<property>
  <name>mapreduce.application.classpath</name>
  <value>/usr/hdp/2.6.1.0-129/hadoop-mapreduce/*:/usr/hdp/2.6.1.0-129/hadoop-mapreduce/lib/*:/usr/hdp/2.6.1.0-129/hadoop/*:/usr/hdp/2.6.1.0-129/hadoop/lib/*:/usr/hdp/2.6.1.0-129/hadoop-yarn/*:/usr/hdp/2.6.1.0-129/hadoop-yarn/lib/*:/usr/hdp/2.6.1.0-129/hadoop-hdfs/*:/usr/hdp/2.6.1.0-129/hadoop-hdfs/lib/*:/usr/hdp/2.6.1.0-129/hadoop/lib/hadoop-lzo-0.6.0.2.6.1.0-129.jar:/etc/hadoop/conf/secure</value>
</property>

<property>
  <name>mapreduce.application.framework.path</name>
  <value>/hdp/apps/2.6.1.0-129/mapreduce/mapreduce.tar.gz#yarn</value>
</property>
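One thing worth checking (an assumption based on stock HDP defaults, not something visible in this thread): mapreduce.application.framework.path and mapreduce.application.classpath are meant to be a matched pair. When the framework tarball is mapped to an alias (here #yarn), the classpath normally has to reference paths under that alias ($PWD/&lt;alias&gt;/...) so the AM resolves its classes from the localized tarball; the stock HDP 2.6 pairing uses the #mr-framework alias:

```xml
<!-- Hedged example: the stock HDP 2.6 pairing; ${hdp.version} would resolve
     to 2.6.1.0-129 on this cluster. Adjust to your actual tarball location. -->
<property>
  <name>mapreduce.application.framework.path</name>
  <value>/hdp/apps/${hdp.version}/mapreduce/mapreduce.tar.gz#mr-framework</value>
</property>
<property>
  <name>mapreduce.application.classpath</name>
  <value>$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure</value>
</property>
```

With the #yarn alias and purely local /usr/hdp/... paths in the classpath, the localized tarball is never referenced, which can leave the AM without MRAppMaster if the local paths are not visible in the container that launches it.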

yarn-site.xml:

<property>
  <name>yarn.application.classpath</name>
  <value>/etc/hadoop/conf,/usr/hdp/2.6.1.0-129/hadoop/*,/usr/hdp/2.6.1.0-129/hadoop/lib/*,/usr/hdp/2.6.1.0-129/hadoop-hdfs/*,/usr/hdp/2.6.1.0-129/hadoop-hdfs/lib/*,/usr/hdp/2.6.1.0-129/hadoop-yarn/*,/usr/hdp/2.6.1.0-129/hadoop-yarn/lib/*,/usr/hdp/2.6.1.0-129/hadoop-mapreduce/*,/usr/hdp/2.6.1.0-129/hadoop-mapreduce/lib/*,/usr/hdp/current/hadoop-client/*,/usr/hdp/current/hadoop-client/lib/*,/usr/hdp/current/hadoop-hdfs-client/*,/usr/hdp/current/hadoop-hdfs-client/lib/*,/usr/hdp/current/hadoop-yarn-client/*,/usr/hdp/current/hadoop-yarn-client/lib/*,/usr/hdp/current/hadoop-mapreduce-client/*,/usr/hdp/current/hadoop-mapreduce-client/lib/*,/usr/hdp/current/hadoop-mapreduce-historyserver/*,/usr/hdp/current/hadoop-mapreduce-historyserver/lib/*</value>
</property>
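Each comma-separated entry in yarn.application.classpath can be checked on a NodeManager host. The loop below is a sketch: the classpath string is truncated to three sample entries from the value above; run it with the full value inside a real container.

```shell
# Sketch: verify that yarn.application.classpath entries exist on this host.
# The string is a truncated sample of the yarn-site.xml value quoted above.
classpath="/etc/hadoop/conf,/usr/hdp/2.6.1.0-129/hadoop/*,/usr/hdp/2.6.1.0-129/hadoop-mapreduce/*"
IFS=','
for entry in $classpath; do
  dir="${entry%/\*}"            # strip a trailing /* glob to get the directory
  if [ -d "$dir" ]; then
    echo "ok      $entry"
  else
    echo "missing $entry"
  fi
done
unset IFS
```

Every "missing" line is a directory YARN will silently skip when building container classpaths.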

And the YARN logs of the Oozie job show:

....

....

export PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/java/default/bin:/usr/java/default:/etc/hadoop/conf"
export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/etc/hadoop/conf"}
export MAX_APP_ATTEMPTS="2"
export JAVA_HOME=${JAVA_HOME:-"/usr/java/default"}
export LANG="en_US.UTF-8"
export APP_SUBMIT_TIME_ENV="1504266868287"
export NM_HOST="hdp-hadoop-worker01"
export HADOOP_CLASSPATH="$PWD:job.jar/job.jar:job.jar/classes/:job.jar/lib/*:$PWD/*:null"
export LD_LIBRARY_PATH="$PWD:$HADOOP_COMMON_HOME/lib/native"
export LOGNAME="oozie"
export JVM_PID="$"
export PWD="/grid/hadoop/yarn/local/usercache/oozie/appcache/application_1504265510557_0001/container_1504265510557_0001_01_000001"
export LOCAL_DIRS="/grid/hadoop/yarn/local/usercache/oozie/appcache/application_1504265510557_0001"
export APPLICATION_WEB_PROXY_BASE="/proxy/application_1504265510557_0001"
export SHELL="/bin/bash"
export NM_HTTP_PORT="8042"
export LOG_DIRS="/var/log/hadoop/yarn/application_1504265510557_0001/container_1504265510557_0001_01_000001"
export NM_AUX_SERVICE_mapreduce_shuffle="AAA0+gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA= "
export NM_PORT="45454"
export USER="oozie"
export HADOOP_YARN_HOME=${HADOOP_YARN_HOME:-"/usr/hdp/current/hadoop-yarn-nodemanager"}
export CLASSPATH="$PWD:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/share/hadoop/common/*:$HADOOP_COMMON_HOME/share/hadoop/common/lib/*:$HADOOP_HDFS_HOME/share/hadoop/hdfs/*:$HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*:$HADOOP_YARN_HOME/share/hadoop/yarn/*:$HADOOP_YARN_HOME/share/hadoop/yarn/lib/*:$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*:$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*:job.jar/job.jar:job.jar/classes/:job.jar/lib/*:$PWD/*"
export HADOOP_TOKEN_FILE_LOCATION="/grid/hadoop/yarn/local/usercache/oozie/appcache/application_1504265510557_0001/container_1504265510557_0001_01_000001/container_tokens"
export NM_AUX_SERVICE_spark_shuffle=""
export LOCAL_USER_DIRS="/grid/hadoop/yarn/local/usercache/oozie/"
export HADOOP_HOME="/usr/hdp/2.6.1.0-129/hadoop"
export HOME="/home/"
export NM_AUX_SERVICE_spark2_shuffle=""
export CONTAINER_ID="container_1504265510557_0001_01_000001"
export MALLOC_ARENA_MAX="4"
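The launch context above may be the clue: CLASSPATH is built from $HADOOP_*_HOME/share/hadoop/... entries (the Apache tarball layout), while the ENV lines at the top of the post point those homes at HDP package directories, which keep their jars at the top level rather than under share/. Expanding one entry with the quoted values makes the mismatch visible; this is a sketch using only values taken from this thread, not verified on a live cluster:

```shell
# Value taken from the Dockerfile ENV lines quoted earlier in the post.
HADOOP_MAPRED_HOME=/usr/hdp/2.6.1.0-129/hadoop-mapreduce
# One CLASSPATH entry as it appears in the launch context above.
entry='$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*'
eval "expanded=\"$entry\""       # substitute the variable, keep the glob literal
echo "$expanded"
# Under the HDP layout the MR jars live directly in
# /usr/hdp/2.6.1.0-129/hadoop-mapreduce, so the share/hadoop/mapreduce subtree
# this entry points at does not exist, and the AM never sees MRAppMaster.
```

That the AM fell back to these share/-style defaults instead of the mapreduce.application.classpath value from mapred-site.xml suggests the client's mapred-site.xml was not picked up when the Oozie launcher submitted the job.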

...

...

Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster

What is the problem?

Thanks a lot,

Lara.

1 REPLY

Re: oozie shell workflow failing classpath MR

New Contributor

I hit this too: mapreduce.application.classpath does not work. How can I fix it? My environment is Hadoop 2.6.0.

