MapReduce job getting stuck

Hi,

Today I configured a fresh 4-node cluster with CM & CDH 5.15.1, leaving all settings at their defaults. I am just trying to run a simple wordcount job and a Pi job, but both get stuck. I have never faced this issue before. Can anyone please tell me what's wrong here?

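For reference, the wordcount attempt is just the stock example jar, along these lines (the input and output paths below are placeholders, not my exact paths):

[hdfs@cmhost root]$ hadoop jar /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar wordcount <input dir> <output dir>
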
After being stuck for a while, each job fails with "Job failed with state KILLED due to: Application killed by user", even though I am not killing any job manually.

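If it helps, I can also pull the YARN application report and aggregated container logs for one of the killed runs, e.g. (substituting the real application ID from the output below):

[hdfs@cmhost root]$ yarn application -status <application ID>
[hdfs@cmhost root]$ yarn logs -applicationId <application ID>
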
[hdfs@cmhost root]$ hadoop jar /opt/cloudera/parcels/CDH/jars/hadoop-examples.jar teragen 1000 /user/pankaj_test
18/10/18 11:33:39 INFO terasort.TeraGen: Generating 1000 using 2
18/10/18 11:33:41 INFO mapreduce.JobSubmitter: number of splits:2
18/10/18 11:33:42 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1539861671552_0103
18/10/18 11:33:43 INFO impl.YarnClientImpl: Submitted application application_1539861671552_0103
18/10/18 11:33:43 INFO mapreduce.Job: The url to track the job: http://cmhost.c.plenary-ability-195307.internal:8088/proxy/application_1539861671552_0103/
18/10/18 11:33:43 INFO mapreduce.Job: Running job: job_1539861671552_0103
18/10/18 11:38:39 INFO mapreduce.Job: Job job_1539861671552_0103 running in uber mode : false
18/10/18 11:38:39 INFO mapreduce.Job: map 0% reduce 0%
18/10/18 11:38:39 INFO mapreduce.Job: Job job_1539861671552_0103 failed with state KILLED due to: Application killed by user.
18/10/18 11:38:39 INFO mapreduce.Job: Counters: 0
You have new mail in /var/spool/mail/root

[hdfs@cmhost root]$ hadoop jar /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar pi 10 100
Number of Maps = 10
Samples per Map = 100
Wrote input for Map #0
Wrote input for Map #1
Wrote input for Map #2
Wrote input for Map #3
Wrote input for Map #4
Wrote input for Map #5
Wrote input for Map #6
Wrote input for Map #7
Wrote input for Map #8
Wrote input for Map #9
Starting Job
18/10/18 11:50:04 INFO input.FileInputFormat: Total input paths to process : 10
18/10/18 11:50:06 INFO mapreduce.JobSubmitter: number of splits:10
18/10/18 11:50:07 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1539861671552_0234
18/10/18 11:50:08 INFO impl.YarnClientImpl: Submitted application application_1539861671552_0234
18/10/18 11:50:08 INFO mapreduce.Job: The url to track the job: http://cmhost.c.plenary-ability-195307.internal:8088/proxy/application_1539861671552_0234/
18/10/18 11:50:08 INFO mapreduce.Job: Running job: job_1539861671552_0234
18/10/18 11:54:26 INFO mapreduce.Job: Job job_1539861671552_0234 running in uber mode : false
18/10/18 11:54:26 INFO mapreduce.Job: map 0% reduce 0%
18/10/18 11:54:26 INFO mapreduce.Job: Job job_1539861671552_0234 failed with state KILLED due to: Application killed by user.
18/10/18 11:54:26 INFO mapreduce.Job: Counters: 0
Job Finished in 264.638 seconds
java.io.FileNotFoundException: File does not exist: hdfs://nameservice1/user/hdfs/QuasiMonteCarlo_1539863391850_1199537865/out/reduce-out
at org.apache.hadoop.hdfs.DistributedFileSystem$20.doCall(DistributedFileSystem.java:1270)
at org.apache.hadoop.hdfs.DistributedFileSystem$20.doCall(DistributedFileSystem.java:1262)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1262)
at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1820)
at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1844)
at org.apache.hadoop.examples.QuasiMonteCarlo.estimatePi(QuasiMonteCarlo.java:314)
at org.apache.hadoop.examples.QuasiMonteCarlo.run(QuasiMonteCarlo.java:354)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.examples.QuasiMonteCarlo.main(QuasiMonteCarlo.java:363)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:71)
at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:74)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:226)
at org.apache.hadoop.util.RunJar.main(RunJar.java:141)
You have new mail in /var/spool/mail/root
