
Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster

New Contributor

I have installed HDP 2.4 and am running a simple Sqoop query. It fails with the error below:

16/05/13 22:00:22 INFO mapreduce.Job: Job job_1463191168836_0001 failed with state FAILED due to: Application application_1463191168836_0001 failed 2 times due to AM Container for appattempt_1463191168836_0001_000002 exited with exitCode: 1
For more detailed output, check application tracking page: http://jcia9335:8088/cluster/app/application_1463191168836_0001 Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1463191168836_0001_02_000001
Exit code: 1
Stack trace: ExitCodeException exitCode=1:
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:576)
    at org.apache.hadoop.util.Shell.run(Shell.java:487)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:753)
    at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:212)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:303)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.

After checking the YARN logs, I see the error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster

Below is the complete YARN log:

export HADOOP_CONF_DIR="/etc/hadoop/conf" export MAX_APP_ATTEMPTS="2" export JAVA_HOME="/usr/java/default" export APP_SUBMIT_TIME_ENV="1463191215817" export NM_HOST="jcia9335" export HADOOP_CLASSPATH="$PWD:job.jar/job.jar:job.jar/classes/:job.jar/lib/*:$PWD/*:::/usr/hdp/2.4.2.0-258//sqoop/conf:/etc/zookeeper/conf::/usr/hdp/2.4.2.0-258//sqoop/lib/ant-contrib-1.0b3.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/avro-1.7.5.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/avro-mapred-1.7.5-hadoop2.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/commons-codec-1.4.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/commons-io-1.4.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/commons-jexl-2.1.1.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/commons-logging-1.1.1.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/hsqldb-1.8.0.10.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/jackson-annotations-2.3.0.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/jackson-core-2.3.1.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/jackson-databind-2.3.1.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/kite-data-core-1.0.0.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/kite-data-hive-1.0.0.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/kite-data-mapreduce-1.0.0.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/kite-hadoop-compatibility-1.0.0.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/ojdbc6.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/opencsv-2.3.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/paranamer-2.3.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/parquet-avro-1.4.1.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/parquet-column-1.4.1.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/parquet-common-1.4.1.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/parquet-encoding-1.4.1.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/parquet-format-2.0.0.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/parquet-generator-1.4.1.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/parquet-hadoop-1.4.1.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/parquet-jackson-
1.4.1.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/slf4j-api-1.6.1.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/snappy-java-1.0.5.jar:/usr/hdp/2.4.2.0-258//sqoop/lib/xz-1.0.jar:/usr/hdp/2.4.2.0-258/hbase/conf:/usr/java/default/lib/tools.jar:/usr/hdp/2.4.2.0-258/hbase:/usr/hdp/2.4.2.0-258/hbase/lib/activation-1.1.jar:/usr/hdp/2.4.2.0-258/hbase/lib/aopalliance-1.0.jar:/usr/hdp/2.4.2.0-258/hbase/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hbase/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hbase/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hbase/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hbase/lib/asm-3.1.jar:/usr/hdp/2.4.2.0-258/hbase/lib/avro-1.7.4.jar:/usr/hdp/2.4.2.0-258/hbase/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.4.2.0-258/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.4.2.0-258/hbase/lib/commons-cli-1.2.jar:/usr/hdp/2.4.2.0-258/hbase/lib/commons-codec-1.9.jar:/usr/hdp/2.4.2.0-258/hbase/lib/commons-collections-3.2.2.jar:/usr/hdp/2.4.2.0-258/hbase/lib/commons-compress-1.4.1.jar:/usr/hdp/2.4.2.0-258/hbase/lib/commons-configuration-1.6.jar:/usr/hdp/2.4.2.0-258/hbase/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.4.2.0-258/hbase/lib/commons-digester-1.8.jar:/usr/hdp/2.4.2.0-258/hbase/lib/commons-el-1.0.jar:/usr/hdp/2.4.2.0-258/hbase/lib/commons-httpclient-3.1.jar:/usr/hdp/2.4.2.0-258/hbase/lib/commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/hbase/lib/commons-lang-2.6.jar:/usr/hdp/2.4.2.0-258/hbase/lib/commons-logging-1.2.jar:/usr/hdp/2.4.2.0-258/hbase/lib/commons-math-2.2.jar:/usr/hdp/2.4.2.0-258/hbase/lib/commons-math3-3.1.1.jar:/usr/hdp/2.4.2.0-258/hbase/lib/commons-net-3.1.jar:/usr/hdp/2.4.2.0-258/hbase/lib/curator-client-2.7.1.jar:/usr/hdp/2.4.2.0-258/hbase/lib/curator-framework-2.7.1.jar:/usr/hdp/2.4.2.0-258/hbase/lib/curator-recipes-2.7.1.jar:/usr/hdp/2.4.2.0-258/hbase/lib/disruptor-3.3.0.jar:/usr/hdp/2.4.2.0-258/hbase/lib/findbugs-annotations-1.3.9-1.jar:/usr/hdp/2.4.2.0-258/hbase/lib/gson-2.2.4.jar:/usr/hdp/2.4.2.0-258/hbase/lib/guava-12.0.1.
jar:/usr/hdp/2.4.2.0-258/hbase/lib/guice-3.0.jar:/usr/hdp/2.4.2.0-258/hbase/lib/guice-servlet-3.0.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-annotations-1.1.2.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-annotations-1.1.2.2.4.2.0-258-tests.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-annotations.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-client-1.1.2.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-client.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-common-1.1.2.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-common-1.1.2.2.4.2.0-258-tests.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-common.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-examples-1.1.2.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-examples.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-hadoop2-compat-1.1.2.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-hadoop2-compat.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-hadoop-compat-1.1.2.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-hadoop-compat.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-it-1.1.2.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-it-1.1.2.2.4.2.0-258-tests.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-it.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-prefix-tree-1.1.2.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-prefix-tree.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-procedure-1.1.2.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-procedure.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-protocol-1.1.2.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-protocol.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-resource-bundle-1.1.2.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-resource-bundle.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-rest-1.1.2.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-rest.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-server-1.1.2.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-server-1.1.2.2.4.2.0-258-tests.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-server.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-shell-1.1.2.2.4.2.0-
258.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-shell.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-thrift-1.1.2.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hbase/lib/hbase-thrift.jar:/usr/hdp/2.4.2.0-258/hbase/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.4.2.0-258/hbase/lib/httpclient-4.2.5.jar:/usr/hdp/2.4.2.0-258/hbase/lib/httpcore-4.2.5.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jackson-core-2.2.3.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jamon-runtime-2.3.1.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jasper-compiler-5.5.23.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jasper-runtime-5.5.23.jar:/usr/hdp/2.4.2.0-258/hbase/lib/javax.inject-1.jar:/usr/hdp/2.4.2.0-258/hbase/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jcodings-1.0.8.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jersey-client-1.9.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jersey-core-1.9.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jersey-guice-1.9.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jersey-json-1.9.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jersey-server-1.9.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jets3t-0.9.0.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jettison-1.3.3.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jetty-sslengine-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hbase/lib/joni-2.1.2.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jruby-complete-1.6.8.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jsch-0.1.42.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jsp-2.1-6.1.14.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/hdp/2.4.2.0-258/hbase/lib/jsr305-1.3.9.jar:/usr/hdp/2.4.2.0-258/hbase/lib/junit-4.11.jar:/usr/hdp/2.4.2.0-258/hbase/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.4.2.0-258/hba
se/lib/libthrift-0.9.0.jar:/usr/hdp/2.4.2.0-258/hbase/lib/log4j-1.2.17.jar:/usr/hdp/2.4.2.0-258/hbase/lib/metrics-core-2.2.0.jar:/usr/hdp/2.4.2.0-258/hbase/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.4.2.0-258/hbase/lib/netty-3.2.4.Final.jar:/usr/hdp/2.4.2.0-258/hbase/lib/netty-all-4.0.23.Final.jar:/usr/hdp/2.4.2.0-258/hbase/lib/ojdbc6.jar:/usr/hdp/2.4.2.0-258/hbase/lib/okhttp-2.4.0.jar:/usr/hdp/2.4.2.0-258/hbase/lib/okio-1.4.0.jar:/usr/hdp/2.4.2.0-258/hbase/lib/paranamer-2.3.jar:/usr/hdp/2.4.2.0-258/hbase/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/hbase/lib/ranger-hbase-plugin-shim-0.5.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hbase/lib/ranger-plugin-classloader-0.5.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/hdp/2.4.2.0-258/hbase/lib/servlet-api-2.5.jar:/usr/hdp/2.4.2.0-258/hbase/lib/slf4j-api-1.7.7.jar:/usr/hdp/2.4.2.0-258/hbase/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.4.2.0-258/hbase/lib/spymemcached-2.11.6.jar:/usr/hdp/2.4.2.0-258/hbase/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.4.2.0-258/hbase/lib/xml-apis-1.3.04.jar:/usr/hdp/2.4.2.0-258/hbase/lib/xmlenc-0.52.jar:/usr/hdp/2.4.2.0-258/hbase/lib/xz-1.0.jar:/usr/hdp/2.4.2.0-258/hbase/lib/zookeeper.jar:/usr/hdp/2.4.2.0-258/hadoop/conf:/usr/hdp/2.4.2.0-258/hadoop/lib/*:/usr/hdp/2.4.2.0-258/hadoop/.//*:/usr/hdp/2.4.2.0-258/hadoop-hdfs/./:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/*:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//*:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/*:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//*:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/*:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//*:/usr/hdp/2.4.2.0-258/hadoop/conf:/usr/hdp/2.4.2.0-258/hadoop/*:/usr/hdp/2.4.2.0-258/hadoop/lib/*:/usr/hdp/2.4.2.0-258/zookeeper/*:/usr/hdp/2.4.2.0-258/zookeeper/lib/*::/usr/hdp/2.4.2.0-258//sqoop/sqoop-1.4.6.2.4.2.0-258.jar:" export LD_LIBRARY_PATH="$PWD:$HADOOP_COMMON_HOME/lib/native" export HADOOP_HDFS_HOME="/usr/hdp/2.4.2.0-258/hadoop-hdfs" export LOGNAME="root" export JVM_PID="$" export 
PWD="/grid1/hadoop/yarn/local/usercache/root/appcache/application_1463191168836_0001/container_1463191168836_0001_01_000001" export LOCAL_DIRS="/grid/hadoop/yarn/local/usercache/root/appcache/application_1463191168836_0001,/grid1/hadoop/yarn/local/usercache/root/appcache/application_1463191168836_0001" export APPLICATION_WEB_PROXY_BASE="/proxy/application_1463191168836_0001" export SHELL="/bin/bash" export NM_HTTP_PORT="8042" export LOG_DIRS="/grid/hadoop/yarn/log/application_1463191168836_0001/container_1463191168836_0001_01_000001" export NM_AUX_SERVICE_mapreduce_shuffle="AAAfkQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA= " export NM_PORT="45454" export USER="root" export HADOOP_YARN_HOME="/usr/hdp/2.4.2.0-258/hadoop-yarn" export CLASSPATH="$PWD:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*: $PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*: $PWD/mr-framework/hadoop/share/hadoop/common/*: $PWD/mr-framework/hadoop/share/hadoop/common/lib/*: $PWD/mr-framework/hadoop/share/hadoop/yarn/*: $PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*: $PWD/mr-framework/hadoop/share/hadoop/hdfs/*: $PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:/usr/hdp/2.4.2.0-258/hadoop/lib/hadoop-lzo-0.6.0.2.4.2.0-258.jar:/etc/hadoop/conf/secure:job.jar/job.jar:job.jar/classes/:job.jar/lib/*:$PWD/*" export HADOOP_TOKEN_FILE_LOCATION="/grid1/hadoop/yarn/local/usercache/root/appcache/application_1463191168836_0001/container_1463191168836_0001_01_000001/container_tokens" export HOME="/home/" export CONTAINER_ID="container_1463191168836_0001_01_000001" export MALLOC_ARENA_MAX="4" ln -sf "/grid/hadoop/yarn/local/usercache/root/filecache/34/parquet-column-1.4.1.jar" "parquet-column-1.4.1.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid1/hadoop/yarn/local/usercache/root/filecache/41/ojdbc6.jar" "ojdbc6.jar" hadoop_shell_errorcode=$? 
if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid/hadoop/yarn/local/filecache/10/mapreduce.tar.gz" "mr-framework" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid/hadoop/yarn/local/usercache/root/filecache/16/parquet-jackson-1.4.1.jar" "parquet-jackson-1.4.1.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid/hadoop/yarn/local/usercache/root/filecache/24/commons-codec-1.4.jar" "commons-codec-1.4.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid/hadoop/yarn/local/usercache/root/filecache/20/parquet-avro-1.4.1.jar" "parquet-avro-1.4.1.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid/hadoop/yarn/local/usercache/root/filecache/30/jackson-core-2.3.1.jar" "jackson-core-2.3.1.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid1/hadoop/yarn/local/usercache/root/filecache/37/kite-data-core-1.0.0.jar" "kite-data-core-1.0.0.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid1/hadoop/yarn/local/usercache/root/filecache/13/paranamer-2.3.jar" "paranamer-2.3.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid/hadoop/yarn/local/usercache/root/filecache/36/avro-mapred-1.7.5-hadoop2.jar" "avro-mapred-1.7.5-hadoop2.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid/hadoop/yarn/local/usercache/root/filecache/40/slf4j-api-1.6.1.jar" "slf4j-api-1.6.1.jar" hadoop_shell_errorcode=$? 
if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid1/hadoop/yarn/local/usercache/root/filecache/19/sqoop-1.4.6.2.4.2.0-258.jar" "sqoop-1.4.6.2.4.2.0-258.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid/hadoop/yarn/local/usercache/root/filecache/38/commons-io-1.4.jar" "commons-io-1.4.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid1/hadoop/yarn/local/usercache/root/filecache/17/kite-data-hive-1.0.0.jar" "kite-data-hive-1.0.0.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid1/hadoop/yarn/local/usercache/root/filecache/23/parquet-hadoop-1.4.1.jar" "parquet-hadoop-1.4.1.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid1/hadoop/yarn/local/usercache/root/filecache/21/jackson-core-asl-1.9.13.jar" "jackson-core-asl-1.9.13.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid1/hadoop/yarn/local/usercache/root/filecache/25/hsqldb-1.8.0.10.jar" "hsqldb-1.8.0.10.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid1/hadoop/yarn/local/usercache/root/filecache/11/parquet-generator-1.4.1.jar" "parquet-generator-1.4.1.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid1/hadoop/yarn/local/usercache/root/filecache/29/kite-hadoop-compatibility-1.0.0.jar" "kite-hadoop-compatibility-1.0.0.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid1/hadoop/yarn/local/usercache/root/filecache/39/parquet-common-1.4.1.jar" "parquet-common-1.4.1.jar" hadoop_shell_errorcode=$? 
if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid1/hadoop/yarn/local/usercache/root/filecache/15/opencsv-2.3.jar" "opencsv-2.3.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid1/hadoop/yarn/local/usercache/root/appcache/application_1463191168836_0001/filecache/11/job.xml" "job.xml" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid/hadoop/yarn/local/usercache/root/filecache/32/jackson-databind-2.3.1.jar" "jackson-databind-2.3.1.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid1/hadoop/yarn/local/usercache/root/filecache/33/ant-eclipse-1.0-jvm1.2.jar" "ant-eclipse-1.0-jvm1.2.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid/hadoop/yarn/local/usercache/root/appcache/application_1463191168836_0001/filecache/10/job.jar" "job.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid1/hadoop/yarn/local/usercache/root/filecache/27/parquet-encoding-1.4.1.jar" "parquet-encoding-1.4.1.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi mkdir -p jobSubmitDir hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid1/hadoop/yarn/local/usercache/root/appcache/application_1463191168836_0001/filecache/13/job.splitmetainfo" "jobSubmitDir/job.splitmetainfo" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid1/hadoop/yarn/local/usercache/root/filecache/43/commons-jexl-2.1.1.jar" "commons-jexl-2.1.1.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi mkdir -p jobSubmitDir hadoop_shell_errorcode=$? 
if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid/hadoop/yarn/local/usercache/root/appcache/application_1463191168836_0001/filecache/12/job.split" "jobSubmitDir/job.split" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid/hadoop/yarn/local/usercache/root/filecache/28/jackson-mapper-asl-1.9.13.jar" "jackson-mapper-asl-1.9.13.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid/hadoop/yarn/local/usercache/root/filecache/42/xz-1.0.jar" "xz-1.0.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid1/hadoop/yarn/local/usercache/root/filecache/35/jackson-annotations-2.3.0.jar" "jackson-annotations-2.3.0.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid/hadoop/yarn/local/usercache/root/filecache/12/kite-data-mapreduce-1.0.0.jar" "kite-data-mapreduce-1.0.0.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid1/hadoop/yarn/local/usercache/root/filecache/31/avro-1.7.5.jar" "avro-1.7.5.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid/hadoop/yarn/local/usercache/root/filecache/26/snappy-java-1.0.5.jar" "snappy-java-1.0.5.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid/hadoop/yarn/local/usercache/root/filecache/10/ant-contrib-1.0b3.jar" "ant-contrib-1.0b3.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid/hadoop/yarn/local/usercache/root/filecache/22/commons-logging-1.1.1.jar" "commons-logging-1.1.1.jar" hadoop_shell_errorcode=$? 
if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid/hadoop/yarn/local/usercache/root/filecache/18/parquet-format-2.0.0.jar" "parquet-format-2.0.0.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi ln -sf "/grid/hadoop/yarn/local/usercache/root/filecache/14/commons-compress-1.4.1.jar" "commons-compress-1.4.1.jar" hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi # Creating copy of launch script cp "launch_container.sh" "/grid/hadoop/yarn/log/application_1463191168836_0001/container_1463191168836_0001_01_000001/launch_container.sh" chmod 640 "/grid/hadoop/yarn/log/application_1463191168836_0001/container_1463191168836_0001_01_000001/launch_container.sh" # Determining directory contents echo "ls -l:" 1>"/grid/hadoop/yarn/log/application_1463191168836_0001/container_1463191168836_0001_01_000001/directory.info" ls -l 1>>"/grid/hadoop/yarn/log/application_1463191168836_0001/container_1463191168836_0001_01_000001/directory.info" echo "find -L . -maxdepth 5 -ls:" 1>>"/grid/hadoop/yarn/log/application_1463191168836_0001/container_1463191168836_0001_01_000001/directory.info" find -L . -maxdepth 5 -ls 1>>"/grid/hadoop/yarn/log/application_1463191168836_0001/container_1463191168836_0001_01_000001/directory.info" echo "broken symlinks(find -L . -maxdepth 5 -type l -ls):" 1>>"/grid/hadoop/yarn/log/application_1463191168836_0001/container_1463191168836_0001_01_000001/directory.info" find -L . 
-maxdepth 5 -type l -ls 1>>"/grid/hadoop/yarn/log/application_1463191168836_0001/container_1463191168836_0001_01_000001/directory.info" exec /bin/bash -c "$JAVA_HOME/bin/java -Djava.io.tmpdir=$PWD/tmp -Dlog4j.configuration=container-log4j.properties -Dyarn.app.container.log.dir=/grid/hadoop/yarn/log/application_1463191168836_0001/container_1463191168836_0001_01_000001 -Dyarn.app.container.log.filesize=0 -Dhadoop.root.logger=INFO,CLA -Dhadoop.root.logfile=syslog -Xmx14745m org.apache.hadoop.mapreduce.v2.app.MRAppMaster 1>/grid/hadoop/yarn/log/application_1463191168836_0001/container_1463191168836_0001_01_000001/stdout 2>/grid/hadoop/yarn/log/application_1463191168836_0001/container_1463191168836_0001_01_000001/stderr " hadoop_shell_errorcode=$? if [ $hadoop_shell_errorcode -ne 0 ] then exit $hadoop_shell_errorcode fi End of LogType:launch_container.sh LogType:stderr Log Upload Time:Fri May 13 22:00:23 -0400 2016 LogLength:88 Log Contents: Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster End of LogType:stderr LogType:stdout Log Upload Time:Fri May 13 22:00:23 -0400 2016 LogLength:0 Log Contents: End of LogType:stdout [root@jcia9335 hadoop-hdfs]# 

I see hadoop-mapreduce-client-app.jar included in HADOOP_CLASSPATH.
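Being on HADOOP_CLASSPATH is not enough on its own; the AM process uses the CLASSPATH exported in launch_container.sh, so it is worth confirming the class file is actually inside a jar on that path. A hedged sketch of the check (jars are zip archives; the throwaway jar below is built only so the sketch runs outside the cluster — on a real node you would point `jar_path` at the hadoop-mapreduce-client-app jar instead):

```python
# Sketch: verify a jar really contains MRAppMaster. The demo jar created here
# is a stand-in; on the cluster, set jar_path to hadoop-mapreduce-client-app*.jar.
import os
import tempfile
import zipfile

CLASS_ENTRY = "org/apache/hadoop/mapreduce/v2/app/MRAppMaster.class"

def jar_contains(jar_path, entry):
    """Return True if the jar (a zip archive) contains the given entry."""
    with zipfile.ZipFile(jar_path) as jar:
        return entry in jar.namelist()

# Build a stand-in jar so the sketch is runnable anywhere.
tmpdir = tempfile.mkdtemp()
jar_path = os.path.join(tmpdir, "demo.jar")
with zipfile.ZipFile(jar_path, "w") as jar:
    jar.writestr(CLASS_ENTRY, b"")

print(jar_contains(jar_path, CLASS_ENTRY))  # True
```

If the check fails against the real jar, the classpath entry is pointing at the wrong file, which would explain the "Could not find or load main class" error.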

yarn-site.xml config:

  <property>
    <name>yarn.application.classpath</name>
    <value>/etc/hadoop/conf,/usr/hdp/2.4.2.0-258/hadoop/*,/usr/hdp/2.4.2.0-258/hadoop/lib/*,/usr/hdp/2.4.2.0-258/hadoop-hdfs/*,/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/*,/usr/hdp/2.4.2.0-258/hadoop-yarn/*,/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/*,/usr/hdp/2.4.2.0-258/hadoop-mapreduce/*,/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/*,/usr/hdp/current/hadoop-client/*,/usr/hdp/current/hadoop-client/lib/*,/usr/hdp/current/hadoop-hdfs-client/*,/usr/hdp/current/hadoop-hdfs-client/lib/*,/usr/hdp/current/hadoop-yarn-client/*,/usr/hdp/current/hadoop-yarn-client/lib/*,/usr/hdp/current/hadoop-mapreduce-client/*,/usr/hdp/current/hadoop-mapreduce-client/lib/*,/usr/hdp/current/hadoop-mapreduce-historyserver/*,/usr/hdp/current/hadoop-mapreduce-historyserver/lib/*</value>
  </property>

Can you help me resolve this error?


Master Mentor

@pedababu turlapati You have to look into the following settings. This happened in my cluster when I did a manual (i.e., non-Ambari) install.

http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.2/bk_installing_manually_book/content/configur...

Modify mapreduce.application.classpath: there should not be any line breaks or gaps (stray whitespace) between the entries.

<property>
  <name>mapreduce.application.classpath</name>
  <value>$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure</value>
</property>
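The "no gaps" rule above can be sanity-checked with a short script. This is just a sketch: it scans a colon-separated classpath for entries carrying leading or trailing whitespace, and the sample string mimics the stray spaces visible after some colons in the CLASSPATH exported by launch_container.sh in the question's log.

```python
# Sketch: flag classpath entries with leading/trailing whitespace, which the
# JVM would treat as part of the path (so the jars are never found).
def gapped_entries(classpath):
    """Return entries that carry leading or trailing whitespace."""
    return [e for e in classpath.split(":") if e != e.strip()]

# Sample mimicking the gapped CLASSPATH from the launch script above.
cp = ("$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*: "
      "$PWD/mr-framework/hadoop/share/hadoop/common/*")
print(gapped_entries(cp))  # the entry with a leading space is flagged
```

Running this against the value of mapreduce.application.classpath (or the exported CLASSPATH from a failed container's launch script) should return an empty list; any flagged entry is a gap to remove.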