MapReduce job: java.io.FileNotFoundException: /hadoop/yarn/log/application_1499748738335_0010 ..

Contributor

Sandbox 2.6: trying to execute an MR job as the yarn user, the job failed:

[yarn@sandbox work]$ yarn jar testmr-1.0-SNAPSHOT.jar ParquetJob /dir1  outdir
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.0.3-8/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.0.3-8/hadoop/lib/avro-tools-1.8.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
17/07/11 19:57:20 INFO impl.TimelineClientImpl: Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
17/07/11 19:57:20 INFO client.RMProxy: Connecting to ResourceManager at sandbox.hortonworks.com/172.17.0.2:8032
17/07/11 19:57:20 INFO input.FileInputFormat: Total input paths to process : 52
17/07/11 19:57:21 INFO mapreduce.JobSubmitter: number of splits:52
17/07/11 19:57:21 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1499748738335_0012
17/07/11 19:57:21 INFO impl.YarnClientImpl: Submitted application application_1499748738335_0012
17/07/11 19:57:21 INFO mapreduce.Job: The url to track the job: http://sandbox.hortonworks.com:8088/proxy/application_1499748738335_0012/
17/07/11 19:57:21 INFO mapreduce.Job: Running job: job_1499748738335_0012
17/07/11 19:57:26 INFO mapreduce.Job: Job job_1499748738335_0012 running in uber mode : false
17/07/11 19:57:26 INFO mapreduce.Job:  map 0% reduce 0%
17/07/11 19:57:26 INFO mapreduce.Job: Job job_1499748738335_0012 failed with state FAILED due to: Application application_1499748738335_0012 failed 2 times due to AM Container for appattempt_1499748738335_0012_000002 exited with  exitCode: 1
For more detailed output, check the application tracking page: http://sandbox.hortonworks.com:8088/cluster/app/application_1499748738335_0012 Then click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1499748738335_0012_02_000001
Exit code: 1
Stack trace: ExitCodeException exitCode=1:
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:933)
        at org.apache.hadoop.util.Shell.run(Shell.java:844)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:1123)
        at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:237)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:317)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:83)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:748)


On the tracking page:

b/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/2.6.0.3-8/hadoop/lib/hadoop-lzo-0.6.0.2.6.0.3-8.jar:/etc/hadoop/conf/secure:job.jar/job.jar:job.jar/classes/:job.jar/lib/*:$PWD/*"
export APP_SUBMIT_TIME_ENV="1499803041457"
export NM_HOST="sandbox.hortonworks.com"
export HADOOP_TOKEN_FILE_LOCATION="/hadoop/yarn/local/usercache/yarn/appcache/application_1499748738335_0012/container_1499748738335_0012_02_000001/container_tokens"
export NM_AUX_SERVICE_spark_shuffle=""
export LOCAL_USER_DIRS="/hadoop/yarn/local/usercache/yarn/"
export LOGNAME="yarn"
export JVM_PID="$$"
export PWD="/hadoop/yarn/local/usercache/yarn/appcache/application_1499748738335_0012/container_1499748738335_0012_02_000001"
export HOME="/home/"
export NM_AUX_SERVICE_spark2_shuffle=""
export CONTAINER_ID="container_1499748738335_0012_02_000001"
export MALLOC_ARENA_MAX="4"
mkdir -p jobSubmitDir
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/usercache/yarn/appcache/application_1499748738335_0012/filecache/10/job.splitmetainfo" "jobSubmitDir/job.splitmetainfo"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
mkdir -p jobSubmitDir
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/usercache/yarn/appcache/application_1499748738335_0012/filecache/12/job.split" "jobSubmitDir/job.split"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/usercache/yarn/appcache/application_1499748738335_0012/filecache/13/job.xml" "job.xml"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/filecache/13/mapreduce.tar.gz" "mr-framework"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/usercache/yarn/appcache/application_1499748738335_0012/filecache/11/job.jar" "job.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
# Creating copy of launch script
cp "launch_container.sh" "/hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001/launch_container.sh"
chmod 640 "/hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001/launch_container.sh"
# Determining directory contents
echo "ls -l:" 1>"/hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001/directory.info"
ls -l 1>>"/hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001/directory.info"
echo "find -L . -maxdepth 5 -ls:" 1>>"/hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001/directory.info"
find -L . -maxdepth 5 -ls 1>>"/hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001/directory.info"
echo "broken symlinks(find -L . -maxdepth 5 -type l -ls):" 1>>"/hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001/directory.info"
find -L . -maxdepth 5 -type l -ls 1>>"/hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001/directory.info"
exec /bin/bash -c "$JAVA_HOME/bin/java -Dlog4j.configuration=container-log4j.properties -Dyarn.app.container.log.dir=/hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001 -Dyarn.app.container.log.filesize=0 -Dhadoop.root.logger=INFO,CLA -Dhdp.version=2.6.0.3-8 -Xmx1400m -Dhdp.version=2.6.0.3-8 org.apache.hadoop.mapreduce.v2.app.MRAppMaster 1>/hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001/stdout 2>/hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001/stderr "
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi

          
          
Log Type: stderr
Log Upload Time: Tue Jul 11 19:57:27 +0000 2017
Log Length: 2888
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001 (Is a directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.hadoop.yarn.ContainerLogAppender.activateOptions(ContainerLogAppender.java:55)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.apache.log4j.Logger.getLogger(Logger.java:104)
	at org.apache.commons.logging.impl.Log4JLogger.getLogger(Log4JLogger.java:262)
	at org.apache.commons.logging.impl.Log4JLogger.<init>(Log4JLogger.java:108)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.commons.logging.impl.LogFactoryImpl.createLogFromClass(LogFactoryImpl.java:1025)
	at org.apache.commons.logging.impl.LogFactoryImpl.discoverLogImplementation(LogFactoryImpl.java:844)
	at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:541)
	at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:292)
	at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:269)
	at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:657)
	at org.apache.hadoop.service.AbstractService.<clinit>(AbstractService.java:43)
log4j:WARN No appenders could be found for logger (org.apache.hadoop.mapreduce.v2.app.MRAppMaster).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

          

I've checked /hadoop/yarn/local/usercache/yarn/appcache and /hadoop/yarn/log, and they are empty.
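
For what it's worth, if YARN log aggregation is enabled, the local container logs are removed once the application finishes, which would explain the empty directories. In that case the stderr shown above can usually still be retrieved with something like the commands below (the application id is the one from the job output; treat the exact local path as an assumption for this sandbox):

# Pull the aggregated container logs for the failed application
yarn logs -applicationId application_1499748738335_0012

# If aggregation is disabled, the logs should still be under the NodeManager log dir
ls -lR /hadoop/yarn/log/application_1499748738335_0012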

3 Replies

Contributor

Any hints?

Expert Contributor

Your log4j settings are incorrect, and that's what's throwing the error in your log.

Check this property in your log4j settings, specifically the one that looks like the line below. It sounds like the appender is trying to write to the container folder itself, and that is causing an issue (or at least it's causing the error in your log).

log4j.appender.[category].File
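
For reference, the stack trace above goes through ContainerLogAppender, which builds its target file from a container log directory plus a log file name; if the file name part resolves to nothing, the appender ends up pointing at the directory itself, which matches the "(Is a directory)" error. On a stock Hadoop 2.7-era install, the relevant section of container-log4j.properties looks roughly like the sketch below (property names follow the upstream defaults and are an assumption for your exact HDP build):

# Sketch of a typical container-log4j.properties appender section (Hadoop 2.7-era defaults,
# not verified against HDP 2.6.0.3-8).
# hadoop.root.logfile must resolve to a file name such as syslog; if it comes out empty,
# the appender's file becomes the container log directory itself and setFile() fails.
hadoop.root.logger=INFO,CLA
hadoop.root.logfile=syslog

log4j.appender.CLA=org.apache.hadoop.yarn.ContainerLogAppender
log4j.appender.CLA.containerLogDir=${yarn.app.container.log.dir}
log4j.appender.CLA.containerLogFile=${hadoop.root.logfile}
log4j.appender.CLA.totalLogFileSize=${yarn.app.container.log.filesize}
log4j.appender.CLA.layout=org.apache.log4j.PatternLayout
log4j.appender.CLA.layout.ConversionPattern=%d{ISO8601} %p [%t] %c: %m%n

Comparing your settings against something like this should show whether the File/containerLogFile value is missing or resolving to an empty string.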

Hope this helps!

Expert Contributor

Did you end up resolving this?