Support Questions

yarn.application.classpath in yarn-site.xml

New Contributor

The Hortonworks Data Platform doc suggests:

Copy yarn-site.xml from the companion files and modify:

<property>
  <name>yarn.application.classpath</name>
  <value>$HADOOP_CONF_DIR,/usr/hdp/${hdp.version}/hadoop-client/*,
    /usr/hdp/${hdp.version}/hadoop-client/lib/*,
    /usr/hdp/${hdp.version}/hadoop-hdfs-client/*,
    /usr/hdp/${hdp.version}/hadoop-hdfs-client/lib/*,
    /usr/hdp/${hdp.version}/hadoop-yarn-client/*,
    /usr/hdp/${hdp.version}/hadoop-yarn-client/lib/*</value>
</property>

but there is no such path as /usr/hdp/${hdp.version}/hadoop-client on the server. The installed version is 2.3.2.0-2950.
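
For reference, this is roughly how I looked for it on the server (a quick sketch; the versioned path is just what I expected from the doc):

# List the versioned HDP directories that actually exist on this node
ls -d /usr/hdp/*/
# Look for a hadoop-client directory under the installed version
ls -d /usr/hdp/2.3.2.0-2950/hadoop-client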

Submitted jobs keep failing with this error:

Diagnostics: Exception from container-launch.
Container id: container_1449690108073_0001_02_000001
Exit code: 1
Stack trace: ExitCodeException exitCode=1:
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:576)
    at org.apache.hadoop.util.Shell.run(Shell.java:487)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:753)
    at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.

Is this failure related to the classpath?

Please help. Thank you!

4 REPLIES


John, is this on a HWX Sandbox?

How did you install Hadoop?

"${hdp.version}" is populated at runtime with the actual version installed.

You should already have the YARN directories created:

ls -al /usr/hdp/current/hadoop-*

lrwxrwxrwx 1 root root 28 2015-10-27 12:30 /usr/hdp/current/hadoop-client -> /usr/hdp/2.3.2.0-2950/hadoop
lrwxrwxrwx 1 root root 33 2015-10-27 12:30 /usr/hdp/current/hadoop-hdfs-client -> /usr/hdp/2.3.2.0-2950/hadoop-hdfs
lrwxrwxrwx 1 root root 33 2015-10-27 12:30 /usr/hdp/current/hadoop-hdfs-datanode -> /usr/hdp/2.3.2.0-2950/hadoop-hdfs
lrwxrwxrwx 1 root root 33 2015-10-27 12:30 /usr/hdp/current/hadoop-hdfs-journalnode -> /usr/hdp/2.3.2.0-2950/hadoop-hdfs
lrwxrwxrwx 1 root root 33 2015-10-27 12:30 /usr/hdp/current/hadoop-hdfs-namenode -> /usr/hdp/2.3.2.0-2950/hadoop-hdfs
lrwxrwxrwx 1 root root 33 2015-10-27 12:30 /usr/hdp/current/hadoop-hdfs-nfs3 -> /usr/hdp/2.3.2.0-2950/hadoop-hdfs
lrwxrwxrwx 1 root root 33 2015-10-27 12:30 /usr/hdp/current/hadoop-hdfs-portmap -> /usr/hdp/2.3.2.0-2950/hadoop-hdfs
lrwxrwxrwx 1 root root 33 2015-10-27 12:30 /usr/hdp/current/hadoop-hdfs-secondarynamenode -> /usr/hdp/2.3.2.0-2950/hadoop-hdfs
lrwxrwxrwx 1 root root 35 2015-10-27 12:30 /usr/hdp/current/hadoop-httpfs -> /usr/hdp/2.3.2.0-2950/hadoop-httpfs
lrwxrwxrwx 1 root root 38 2015-10-27 12:30 /usr/hdp/current/hadoop-mapreduce-client -> /usr/hdp/2.3.2.0-2950/hadoop-mapreduce
lrwxrwxrwx 1 root root 38 2015-10-27 12:30 /usr/hdp/current/hadoop-mapreduce-historyserver -> /usr/hdp/2.3.2.0-2950/hadoop-mapreduce
lrwxrwxrwx 1 root root 33 2015-10-27 12:30 /usr/hdp/current/hadoop-yarn-client -> /usr/hdp/2.3.2.0-2950/hadoop-yarn
lrwxrwxrwx 1 root root 33 2015-10-27 12:30 /usr/hdp/current/hadoop-yarn-nodemanager -> /usr/hdp/2.3.2.0-2950/hadoop-yarn
lrwxrwxrwx 1 root root 33 2015-10-27 12:30 /usr/hdp/current/hadoop-yarn-resourcemanager -> /usr/hdp/2.3.2.0-2950/hadoop-yarn
lrwxrwxrwx 1 root root 33 2015-10-27 12:30 /usr/hdp/current/hadoop-yarn-timelineserver -> /usr/hdp/2.3.2.0-2950/hadoop-yarn

I think that the directories are there, and the error is caused by something else.

Can you get the YARN logs for application 1449690108073_0001:

yarn logs -applicationId application_1449690108073_0001 > application.txt

and see if there is an error in the logs?
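
If the symlinks are in place, a quick sanity check along these lines should show the jars resolving (just a sketch; the exact version string is whatever your install reports):

# Show the installed HDP version(s)
hdp-select versions
# Print the classpath the Hadoop scripts resolve at runtime
hadoop classpath
# Confirm the client jars are reachable through the "current" symlinks
ls /usr/hdp/current/hadoop-client/*.jar | head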

New Contributor

William,

Thanks for taking the time to answer me.

It was installed through the hdp.repo downloaded from HWX. It is not a Sandbox; it is a straight install per the HWX doc.

If ${hdp.version} is populated with the real version, then the path in the doc must be wrong. If it refers to "current", then it is correct. I don't see anywhere how ${hdp.version} is defined. On pg 47 it says you do not need to modify ${hdp.version}.
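
Following that logic, I also checked where the "current" symlink actually points (a quick check; the target shown is based on the listing you posted):

# Resolve the symlink that the doc's classpath entry would need
readlink -f /usr/hdp/current/hadoop-client
# on this node it resolves to /usr/hdp/2.3.2.0-2950/hadoop, not .../hadoop-client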

The same job was resubmitted; unfortunately, yarn logs -applicationId xxx returns:

/app-logs/root/logs/application_1449759159329_0001 does not exist.

Log aggregation has not completed or is not enabled.
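
In case it helps, I also tried to find the NodeManager's local container logs directly (a rough sketch; the log directory below is only my guess at the default yarn.nodemanager.log-dirs value, so it may differ on this cluster):

# See where local container logs go when aggregation is off
grep -A1 'yarn.nodemanager.log-dirs' /etc/hadoop/conf/yarn-site.xml
# Then look for the failed container's stderr under that directory, e.g.:
ls /hadoop/yarn/log/application_1449759159329_0001/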

Can the output below tell us anything?

Diagnostics: Exception from container-launch.
Container id: container_1449690108073_0001_02_000001
Exit code: 1
Stack trace: ExitCodeException exitCode=1:
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:576)
    at org.apache.hadoop.util.Shell.run(Shell.java:487)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:753)
    at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.

Thanks.


That error only says that the container crashed, but does not say why.

You may be right about the yarn classpath. What is the output of:

ls -al /usr/hdp/current/hadoop-*

??

Master Mentor

@John Smith, are you still having issues with this? Can you accept the best answer or provide your own solution?