
Error running MapReduce example

Super Collaborator

Hi,

I can't run the MapReduce example on my cluster. Here is the error:

sudo -u hdfs hadoop jar hadoop-mapreduce-examples.jar pi 10 10
16/07/15 10:08:36 INFO mapreduce.Job: Task Id : attempt_1468563041806_0005_m_000004_1, Status : FAILED
Application application_1468563041806_0005 initialization failed (exitCode=255) with output: main : command provided 0
main : run as user is nobody
main : requested yarn user is hdfs
Can't create directory /hadoop/yarn/local/usercache/hdfs/appcache/application_1468563041806_0005 - Permission denied
Did not create any app directories




16/07/15 10:08:36 INFO mapreduce.Job: Task Id : attempt_1468563041806_0005_m_000005_1, Status : FAILED
Application application_1468563041806_0005 initialization failed (exitCode=255) with output: main : command provided 0
main : run as user is nobody
main : requested yarn user is hdfs
Can't create directory /hadoop/yarn/local/usercache/hdfs/appcache/application_1468563041806_0005 - Permission denied
Did not create any app directories




16/07/15 10:08:38 INFO mapreduce.Job:  map 20% reduce 0%
16/07/15 10:08:39 INFO mapreduce.Job: Task Id : attempt_1468563041806_0005_m_000000_2, Status : FAILED
Application application_1468563041806_0005 initialization failed (exitCode=255) with output: main : command provided 0
main : run as user is nobody
main : requested yarn user is hdfs
Can't create directory /hadoop/yarn/local/usercache/hdfs/appcache/application_1468563041806_0005 - Permission denied
Did not create any app directories




16/07/15 10:08:39 INFO mapreduce.Job: Task Id : attempt_1468563041806_0005_m_000001_2, Status : FAILED
Application application_1468563041806_0005 initialization failed (exitCode=255) with output: main : command provided 0
main : run as user is nobody
main : requested yarn user is hdfs
Can't create directory /hadoop/yarn/local/usercache/hdfs/appcache/application_1468563041806_0005 - Permission denied
Did not create any app directories




16/07/15 10:08:41 INFO mapreduce.Job: Task Id : attempt_1468563041806_0005_m_000004_2, Status : FAILED
Application application_1468563041806_0005 initialization failed (exitCode=255) with output: main : command provided 0
main : run as user is nobody
main : requested yarn user is hdfs
Can't create directory /hadoop/yarn/local/usercache/hdfs/appcache/application_1468563041806_0005 - Permission denied
Did not create any app directories




16/07/15 10:08:41 INFO mapreduce.Job: Task Id : attempt_1468563041806_0005_m_000005_2, Status : FAILED
Application application_1468563041806_0005 initialization failed (exitCode=255) with output: main : command provided 0
main : run as user is nobody
main : requested yarn user is hdfs
Can't create directory /hadoop/yarn/local/usercache/hdfs/appcache/application_1468563041806_0005 - Permission denied
Did not create any app directories




16/07/15 10:08:42 INFO mapreduce.Job:  map 30% reduce 0%
16/07/15 10:08:44 INFO mapreduce.Job:  map 100% reduce 100%
16/07/15 10:08:44 INFO mapreduce.Job: Job job_1468563041806_0005 failed with state FAILED due to: Task failed task_1468563041806_0005_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0


16/07/15 10:08:45 INFO mapreduce.Job: Counters: 39
        File System Counters
                FILE: Number of bytes read=0
                FILE: Number of bytes written=404010
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=780
                HDFS: Number of bytes written=0
                HDFS: Number of read operations=12
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=0
        Job Counters
                Failed map tasks=14
                Killed map tasks=5
                Killed reduce tasks=1
                Launched map tasks=21
                Other local map tasks=12
                Data-local map tasks=9
                Total time spent by all maps in occupied slots (ms)=147342
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=49114
                Total time spent by all reduce tasks (ms)=0
                Total vcore-seconds taken by all map tasks=49114
                Total vcore-seconds taken by all reduce tasks=0
                Total megabyte-seconds taken by all map tasks=21119020
                Total megabyte-seconds taken by all reduce tasks=0
        Map-Reduce Framework
                Map input records=3
                Map output records=6
                Map output bytes=54
                Map output materialized bytes=84
                Input split bytes=426
                Combine input records=0
                Spilled Records=6
                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=179
                CPU time spent (ms)=1350
                Physical memory (bytes) snapshot=1175269376
                Virtual memory (bytes) snapshot=6783180800
                Total committed heap usage (bytes)=903348224
        File Input Format Counters
                Bytes Read=354
Job Finished in 35.569 seconds
java.io.FileNotFoundException: File does not exist: hdfs://NameNodeHA/user/hdfs/QuasiMonteCarlo_1468570081032_1937629430/out/reduce-out
        at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1319)
        at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1311)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1311)
        at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1752)
        at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1776)
        at org.apache.hadoop.examples.QuasiMonteCarlo.estimatePi(QuasiMonteCarlo.java:314)
        at org.apache.hadoop.examples.QuasiMonteCarlo.run(QuasiMonteCarlo.java:354)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
        at org.apache.hadoop.examples.QuasiMonteCarlo.main(QuasiMonteCarlo.java:363)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:71)
        at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
        at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:74)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)

Re: Error running MapReduce example

Super Collaborator

Could you please post the value of yarn.nodemanager.local-dirs from yarn-site.xml? Also, try giving write permission on the /hadoop/yarn directory so the NodeManager can create the per-user appcache directories under it.
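
As a rough sketch of how to check this on each NodeManager host (the /etc/hadoop/conf path and the yarn:hadoop owner are common defaults and only assumptions here; adjust to your cluster layout):

# show the configured NodeManager local directories
grep -A 1 yarn.nodemanager.local-dirs /etc/hadoop/conf/yarn-site.xml

# inspect ownership and permissions on the local dir tree from the failing log
ls -ld /hadoop/yarn /hadoop/yarn/local /hadoop/yarn/local/usercache

# make the local dir writable by the YARN service account
# (assumed owner yarn:hadoop; the exact owner depends on your setup)
sudo chown -R yarn:hadoop /hadoop/yarn/local
sudo chmod 755 /hadoop/yarn /hadoop/yarn/local

The "run as user is nobody ... Permission denied" lines in your log show the container launch cannot create the per-application directory under yarn.nodemanager.local-dirs, which points to a permissions problem on /hadoop/yarn.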
