Member since: 06-30-2017 · 10 Posts · 2 Kudos Received · 0 Solutions
07-14-2017
04:35 AM
Any hints?
07-11-2017
08:16 PM
Sandbox 2.6, trying to execute an MR job as the yarn user; the job failed:
[yarn@sandbox work]$ yarn jar testmr-1.0-SNAPSHOT.jar ParquetJob /dir1 outdir
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.0.3-8/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.0.3-8/hadoop/lib/avro-tools-1.8.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
17/07/11 19:57:20 INFO impl.TimelineClientImpl: Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
17/07/11 19:57:20 INFO client.RMProxy: Connecting to ResourceManager at sandbox.hortonworks.com/172.17.0.2:8032
17/07/11 19:57:20 INFO input.FileInputFormat: Total input paths to process : 52
17/07/11 19:57:21 INFO mapreduce.JobSubmitter: number of splits:52
17/07/11 19:57:21 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1499748738335_0012
17/07/11 19:57:21 INFO impl.YarnClientImpl: Submitted application application_1499748738335_0012
17/07/11 19:57:21 INFO mapreduce.Job: The url to track the job: http://sandbox.hortonworks.com:8088/proxy/application_1499748738335_0012/
17/07/11 19:57:21 INFO mapreduce.Job: Running job: job_1499748738335_0012
17/07/11 19:57:26 INFO mapreduce.Job: Job job_1499748738335_0012 running in uber mode : false
17/07/11 19:57:26 INFO mapreduce.Job: map 0% reduce 0%
17/07/11 19:57:26 INFO mapreduce.Job: Job job_1499748738335_0012 failed with state FAILED due to: Application application_1499748738335_0012 failed 2 times due to AM Container for appattempt_1499748738335_0012_000002 exited with exitCode: 1
For more detailed output, check the application tracking page: http://sandbox.hortonworks.com:8088/cluster/app/application_1499748738335_0012 Then click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1499748738335_0012_02_000001
Exit code: 1
Stack trace: ExitCodeException exitCode=1:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:933)
at org.apache.hadoop.util.Shell.run(Shell.java:844)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:1123)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:237)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:317)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:83)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
In the tracking page:
b/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/2.6.0.3-8/hadoop/lib/hadoop-lzo-0.6.0.2.6.0.3-8.jar:/etc/hadoop/conf/secure:job.jar/job.jar:job.jar/classes/:job.jar/lib/*:$PWD/*"
export APP_SUBMIT_TIME_ENV="1499803041457"
export NM_HOST="sandbox.hortonworks.com"
export HADOOP_TOKEN_FILE_LOCATION="/hadoop/yarn/local/usercache/yarn/appcache/application_1499748738335_0012/container_1499748738335_0012_02_000001/container_tokens"
export NM_AUX_SERVICE_spark_shuffle=""
export LOCAL_USER_DIRS="/hadoop/yarn/local/usercache/yarn/"
export LOGNAME="yarn"
export JVM_PID="$$"
export PWD="/hadoop/yarn/local/usercache/yarn/appcache/application_1499748738335_0012/container_1499748738335_0012_02_000001"
export HOME="/home/"
export NM_AUX_SERVICE_spark2_shuffle=""
export CONTAINER_ID="container_1499748738335_0012_02_000001"
export MALLOC_ARENA_MAX="4"
mkdir -p jobSubmitDir
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/usercache/yarn/appcache/application_1499748738335_0012/filecache/10/job.splitmetainfo" "jobSubmitDir/job.splitmetainfo"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
exit $hadoop_shell_errorcode
fi
mkdir -p jobSubmitDir
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/usercache/yarn/appcache/application_1499748738335_0012/filecache/12/job.split" "jobSubmitDir/job.split"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/usercache/yarn/appcache/application_1499748738335_0012/filecache/13/job.xml" "job.xml"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/filecache/13/mapreduce.tar.gz" "mr-framework"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/usercache/yarn/appcache/application_1499748738335_0012/filecache/11/job.jar" "job.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
exit $hadoop_shell_errorcode
fi
# Creating copy of launch script
cp "launch_container.sh" "/hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001/launch_container.sh"
chmod 640 "/hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001/launch_container.sh"
# Determining directory contents
echo "ls -l:" 1>"/hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001/directory.info"
ls -l 1>>"/hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001/directory.info"
echo "find -L . -maxdepth 5 -ls:" 1>>"/hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001/directory.info"
find -L . -maxdepth 5 -ls 1>>"/hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001/directory.info"
echo "broken symlinks(find -L . -maxdepth 5 -type l -ls):" 1>>"/hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001/directory.info"
find -L . -maxdepth 5 -type l -ls 1>>"/hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001/directory.info"
exec /bin/bash -c "$JAVA_HOME/bin/java -Dlog4j.configuration=container-log4j.properties -Dyarn.app.container.log.dir=/hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001 -Dyarn.app.container.log.filesize=0 -Dhadoop.root.logger=INFO,CLA -Dhdp.version=2.6.0.3-8 -Xmx1400m -Dhdp.version=2.6.0.3-8 org.apache.hadoop.mapreduce.v2.app.MRAppMaster 1>/hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001/stdout 2>/hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001/stderr "
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
exit $hadoop_shell_errorcode
fi
Log Type: stderr
Log Upload Time: Tue Jul 11 19:57:27 +0000 2017
Log Length: 2888
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /hadoop/yarn/log/application_1499748738335_0012/container_1499748738335_0012_02_000001 (Is a directory)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
at org.apache.hadoop.yarn.ContainerLogAppender.activateOptions(ContainerLogAppender.java:55)
at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
at org.apache.log4j.Logger.getLogger(Logger.java:104)
at org.apache.commons.logging.impl.Log4JLogger.getLogger(Log4JLogger.java:262)
at org.apache.commons.logging.impl.Log4JLogger.<init>(Log4JLogger.java:108)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.commons.logging.impl.LogFactoryImpl.createLogFromClass(LogFactoryImpl.java:1025)
at org.apache.commons.logging.impl.LogFactoryImpl.discoverLogImplementation(LogFactoryImpl.java:844)
at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:541)
at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:292)
at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:269)
at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:657)
at org.apache.hadoop.service.AbstractService.<clinit>(AbstractService.java:43)
log4j:WARN No appenders could be found for logger (org.apache.hadoop.mapreduce.v2.app.MRAppMaster).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
I've checked /hadoop/yarn/local/usercache/yarn/appcache and /hadoop/yarn/log; both are empty.
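Since the local appcache and log directories are cleaned up once the application fails, the container output can usually still be fetched through the YARN CLI, provided log aggregation is enabled (an assumption here; it is on by default on the Sandbox). A minimal sketch using the application ID from the output above:

```shell
# Pull the aggregated logs for the failed application; run as the submitting
# user (yarn in this thread). The ID comes from the job output above.
yarn logs -applicationId application_1499748738335_0012 > app_logs.txt

# The AM container's stderr section is inside the aggregated dump.
grep -n "stderr" app_logs.txt
```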
Labels: Apache Hadoop
07-10-2017
03:52 PM
I've recompiled the sources with parquet.version 1.6 and copied parquet-hadoop-bundle-1.6.0.jar. The job is working now, thanks!
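Assuming parquet.version is a Maven property in the project's pom.xml (the post implies it, but the property name is an assumption), the rebuild-and-deploy step above can be sketched as:

```shell
# Rebuild against the Parquet version matching the data, then drop the bundle
# next to the Hadoop libs (paths and jar version taken from this thread).
mvn clean package -Dparquet.version=1.6.0
cp parquet-hadoop-bundle-1.6.0.jar /usr/hdp/2.6.0.3-8/hadoop/lib/
```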
07-10-2017
03:26 PM
The jar is copied, but I get the same exception:
[root@sandbox target]# export HADOOP_CLASSPATH=/usr/hdp/2.6.0.3-8/hadoop/lib
[root@sandbox target]# find /usr/hdp/2.6.0.3-8/hadoop/lib/ -name "parquet*.jar"
/usr/hdp/2.6.0.3-8/hadoop/lib/parquet-hadoop-bundle-1.8.1.jar
[root@sandbox target]# yarn jar testmr-1.0-SNAPSHOT.jar TestReadParquet /testdir/dir1 /testdir/out_file
Exception in thread "main" java.lang.NoClassDefFoundError: parquet/Log
at TestReadParquet.<clinit>(TestReadParquet.java:24)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.hadoop.util.RunJar.run(RunJar.java:226)
at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
Caused by: java.lang.ClassNotFoundException: parquet.Log
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 5 more
07-09-2017
12:11 PM
I'm trying to execute a test MR job on Sandbox 2.6:
# export HADOOP_CLASSPATH=/usr/hdp/2.6.0.3-8/hadoop/lib
# yarn jar testmr.jar TestReadParquet /testdir/dir1 out_file
Exception in thread "main" java.lang.NoClassDefFoundError: parquet/Log
at TestReadParquet.<clinit>(TestReadParquet.java:24)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.hadoop.util.RunJar.run(RunJar.java:226)
at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
Caused by: java.lang.ClassNotFoundException: parquet.Log
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 5 more
Should I put the parquet-common jars in HADOOP_CLASSPATH?
find /usr/hdp/2.6.0.3-8/hadoop/lib/ -name "parquet*.jar"
returns no files at the moment.
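One detail worth noting about the export above: putting a bare directory on HADOOP_CLASSPATH adds only that directory (for resource files), not the jars inside it; jars have to be listed individually or via a `*` wildcard. A hedged sketch (the bundle jar name and version are assumptions):

```shell
# Pick up every jar inside the lib directory, not just the directory itself.
export HADOOP_CLASSPATH="/usr/hdp/2.6.0.3-8/hadoop/lib/*"

# Or name the Parquet bundle jar explicitly (exact version is an assumption).
export HADOOP_CLASSPATH="/usr/hdp/2.6.0.3-8/hadoop/lib/parquet-hadoop-bundle-1.6.0.jar:$HADOOP_CLASSPATH"

yarn jar testmr.jar TestReadParquet /testdir/dir1 out_file
```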
Labels: Apache Hadoop
07-03-2017
06:25 PM
Thanks for the answer, I found the reason. I used the web interface to upload the file, and it looks like the interface tries to save to /tmp before moving the file to HDFS. That is why I got the error: no space left in the Linux filesystem. The second problem was that I forgot to execute hdfs dfsadmin -safemode leave.
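The two fixes described above can be sketched as shell steps (a sketch under common defaults; /tmp as the staging area is taken from this post):

```shell
# 1. Make sure the local filesystem used for upload staging has room
#    (df -P gives a stable one-line-per-filesystem format for scripting).
tmp_free_kb=$(df -P /tmp | awk 'NR==2 {print $4}')
echo "free space under /tmp: ${tmp_free_kb} KB"

# 2. Check and then leave safe mode manually (run as the hdfs superuser;
#    guarded so the sketch is a no-op on machines without the hdfs CLI).
if command -v hdfs >/dev/null 2>&1; then
    hdfs dfsadmin -safemode get
    hdfs dfsadmin -safemode leave
fi
```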
06-30-2017
08:50 PM
Hortonworks Sandbox 2.6 on VMware. I've added a /dev/sda4 partition with 61G (ext3) and, in Ambari, added the /usr01 folder to the DataNode directories; HDFS is now reporting 87.7G total:
[root@sandbox hdfs]# hdfs dfsadmin -report
Safe mode is ON
Configured Capacity: 94168273920 (87.70 GB)
Present Capacity: 79834880512 (74.35 GB)
DFS Remaining: 59022965248 (54.97 GB)
DFS Used: 20811915264 (19.38 GB)
DFS Used%: 26.07%
Under replicated blocks: 12
Blocks with corrupt replicas: 0
Missing blocks: 0
Missing blocks (with replication factor 1): 0
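As an aside, the figures in the report above can be scraped for scripting; a small sketch fed with one of the lines shown:

```shell
# Extract the human-readable "DFS Remaining" value from a captured report line
# (fed here from a variable holding the line pasted above).
line='DFS Remaining: 59022965248 (54.97 GB)'
remaining=$(printf '%s\n' "$line" | awk -F'[()]' '/DFS Remaining/ {print $2}')
echo "$remaining"   # prints: 54.97 GB
```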
But when I try to upload a 2G file, an error is raised: Cannot create file ... Name node is in safe mode.
Resources are low on NN. Please add or free up more resources then turn off safe mode manually.
And in the NameNode logs:
[root@sandbox hdfs]# tail -n 20 hadoop-hdfs-namenode-sandbox.hortonworks.com.out
java.io.IOException: No space left on device
at java.io.FileOutputStream.writeBytes(Native Method)
at java.io.FileOutputStream.write(FileOutputStream.java:326)
at sun.nio.cs.StreamEncoder.writeBytes(StreamEncoder.java:221)
at sun.nio.cs.StreamEncoder.implFlushBuffer(StreamEncoder.java:291)
at sun.nio.cs.StreamEncoder.implFlush(StreamEncoder.java:295)
at sun.nio.cs.StreamEncoder.flush(StreamEncoder.java:141)
at java.io.OutputStreamWriter.flush(OutputStreamWriter.java:229)
at org.apache.log4j.helpers.QuietWriter.flush(QuietWriter.java:59)
at org.apache.log4j.WriterAppender.subAppend(WriterAppender.java:324)
at org.apache.log4j.RollingFileAppender.subAppend(RollingFileAppender.java:276)
at org.apache.log4j.WriterAppender.append(WriterAppender.java:162)
at org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:251)
at org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:66)
at org.apache.log4j.Category.callAppenders(Category.java:206)
at org.apache.log4j.Category.forcedLog(Category.java:391)
at org.apache.log4j.Category.log(Category.java:856)
at org.apache.commons.logging.impl.Log4JLogger.info(Log4JLogger.java:176)
at org.apache.hadoop.ipc.Server.logException(Server.java:2428)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2362)
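The trace above is log4j failing to flush the NameNode log to local disk, so the shortfall is on the Linux filesystem holding the NameNode's logs and metadata, not in HDFS capacity. A quick check (the paths are common HDP defaults and an assumption here; adjust to dfs.namenode.name.dir on your system):

```shell
# Free space on the local filesystems used by the NameNode for its logs
# and for fsimage/edits (hypothetical default HDP locations).
df -h /var/log/hadoop /hadoop/hdfs/namenode
```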
Labels: Apache Hadoop