ERROR util.ProcfsBasedProcessTree: java.io.IOException: in MapReduce program

Explorer

I am trying to run a MapReduce program and am getting the following errors:

ERROR util.ProcfsBasedProcessTree: java.io.IOException: Cannot run program "getconf": error=2, No such file or directory
   at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
   at org.apache.hadoop.util.Shell.runCommand(Shell.java:200)
   at org.apache.hadoop.util.Shell.run(Shell.java:182)
   at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
   at org.apache.hadoop.util.ProcfsBasedProcessTree.<clinit>(ProcfsBasedProcessTree.java:61)
   at org.apache.hadoop.util.LinuxResourceCalculatorPlugin.<init>(LinuxResourceCalculatorPlugin.java:106)
   at org.apache.hadoop.util.ResourceCalculatorPlugin.getResourceCalculatorPlugin(ResourceCalculatorPlugin.java:149)
   at org.apache.hadoop.mapred.Task.initialize(Task.java:532)
   at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
   at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:223)
   at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
   at java.util.concurrent.FutureTask.run(FutureTask.java:262)
   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
   at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: error=2, No such file or directory
   at java.lang.UNIXProcess.forkAndExec(Native Method)
   at java.lang.UNIXProcess.<init>(UNIXProcess.java:186)
   at java.lang.ProcessImpl.start(ProcessImpl.java:130)
   at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
   ... 14 more
16/05/28 19:21:50 ERROR util.ProcfsBasedProcessTree: java.io.IOException: Cannot run program "getconf": error=2, No such file or directory
   at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
   at org.apache.hadoop.util.Shell.runCommand(Shell.java:200)
   at org.apache.hadoop.util.Shell.run(Shell.java:182)
   at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
   at org.apache.hadoop.util.ProcfsBasedProcessTree.<clinit>(ProcfsBasedProcessTree.java:75)
   at org.apache.hadoop.util.LinuxResourceCalculatorPlugin.<init>(LinuxResourceCalculatorPlugin.java:106)
   at org.apache.hadoop.util.ResourceCalculatorPlugin.getResourceCalculatorPlugin(ResourceCalculatorPlugin.java:149)
   at org.apache.hadoop.mapred.Task.initialize(Task.java:532)
   at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
   at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:223)
   at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
   at java.util.concurrent.FutureTask.run(FutureTask.java:262)
   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
   at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: error=2, No such file or directory
   at java.lang.UNIXProcess.forkAndExec(Native Method)
   at java.lang.UNIXProcess.<init>(UNIXProcess.java:186)
   at java.lang.ProcessImpl.start(ProcessImpl.java:130)
   at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
   ... 14 more
16/05/28 19:21:50 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@4036afc5
16/05/28 19:21:50 INFO mapred.MapTask: Processing split: file:/home/hadoop/Documents/sampledata:0+344
16/05/28 19:21:50 INFO mapred.MapTask: numReduceTasks: 1
16/05/28 19:21:50 INFO mapred.MapTask: io.sort.mb = 100
16/05/28 19:21:50 INFO mapred.MapTask: data buffer = 79691776/99614720
16/05/28 19:21:50 INFO mapred.MapTask: record buffer = 262144/327680
16/05/28 19:21:50 INFO mapred.LocalJobRunner: Map task executor complete.
16/05/28 19:21:50 WARN mapred.LocalJobRunner: job_local884096935_0001
java.lang.Exception: java.lang.NumberFormatException: null
   at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:354)
Caused by: java.lang.NumberFormatException: null
   at java.lang.Integer.parseInt(Integer.java:454)
   at java.lang.Integer.parseInt(Integer.java:527)
   at hadoop.ProcessUnits$E_EMapper.map(ProcessUnits.java:46)
   at hadoop.ProcessUnits$E_EMapper.map(ProcessUnits.java:1)
   at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
   at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
   at org.apache.hadoop.mapred.MapTask.run(MapTask.java:366)
   at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:223)
   at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
   at java.util.concurrent.FutureTask.run(FutureTask.java:262)
   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
   at java.lang.Thread.run(Thread.java:745)
Exception in thread "Thread-1" java.lang.NoClassDefFoundError: org/apache/commons/httpclient/HttpMethod
   at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:437)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.httpclient.HttpMethod
   at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
   at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
   at java.security.AccessController.doPrivileged(Native Method)
   at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
   at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
   at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
   ... 1 more
16/05/28 19:21:51 INFO mapred.JobClient:  map 0% reduce 0%
16/05/28 19:21:51 INFO mapred.JobClient: Job complete: job_local884096935_0001
16/05/28 19:21:51 INFO mapred.JobClient: Counters: 0
16/05/28 19:21:51 INFO mapred.JobClient: Job Failed: NA
Exception in thread "main" java.io.IOException: Job failed!
   at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1357)
   at hadoop.ProcessUnits.main(ProcessUnits.java:95)

Any idea? Thanks!!

1 ACCEPTED SOLUTION

Super Guru
@atul kumar

Are you running MapReduce in local mode?

It seems that either the commons-httpclient-<version>.jar file is missing from your classpath, or multiple versions of the same jar are conflicting. Please cross-check.


8 REPLIES

Super Guru
@atul kumar

Are you running MapReduce in local mode?

It seems that either the commons-httpclient-<version>.jar file is missing from your classpath, or multiple versions of the same jar are conflicting. Please cross-check.
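One quick way to cross-check is to try loading the class named in the NoClassDefFoundError using the same classpath the job uses. Below is a minimal sketch; ClasspathCheck is a hypothetical helper, and only the class name org.apache.commons.httpclient.HttpMethod comes from the stack trace above.

import java.security.CodeSource;

// Hypothetical helper, not part of the Hadoop job: tries to load the class the
// LocalJobRunner failed to find and prints which jar it came from, which also
// helps spot which copy of commons-httpclient is actually being picked up.
public class ClasspathCheck {
    public static void main(String[] args) {
        try {
            Class<?> c = Class.forName("org.apache.commons.httpclient.HttpMethod");
            CodeSource src = c.getProtectionDomain().getCodeSource();
            System.out.println("HttpMethod loaded from: "
                    + (src != null ? src.getLocation() : "<unknown location>"));
        } catch (ClassNotFoundException e) {
            System.out.println("commons-httpclient is NOT on the classpath: " + e);
        }
    }
}

Run it with the same run configuration you use for the job in Eclipse: if it throws ClassNotFoundException the jar is missing, otherwise the printed location tells you which copy of the jar is being used.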

Explorer

commons-httpclient is already added, and there is only one version of it. Can you guide me on how to run MapReduce in HDFS mode? I am using Java in Eclipse.

Super Guru

How are you running this MapReduce job? Can you share the command or the scenario?

Super Guru

Hi @atul kumar,

It looks like the jar wasn't imported into the project when it was created. Can you please follow the doc below and see if that resolves your issue?

https://acadgild.com/blog/running-mapreduce-in-local-mode-2/

Explorer

Thanks!

How can I tell whether it is running in local mode or HDFS mode?

Super Guru
@atul kumar

When you run the job, it gives you a job ID. If the job ID is something like job_<12312324233242>, then your job is running in HDFS mode. If the job ID is job_local**, then it is running in local mode.
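If it is local mode and you actually want to submit to the cluster, the deciding setting in Hadoop 1.x is mapred.job.tracker, which defaults to "local" in a bare Eclipse project. Below is a minimal sketch of a driver that prints the current mode and points the client at a cluster; ModeCheck and the host/port values are placeholders I'm assuming, not taken from this thread.

import org.apache.hadoop.mapred.JobConf;

// Minimal sketch for Hadoop 1.x (old mapred API, as in the stack trace above).
public class ModeCheck {
    public static void main(String[] args) {
        JobConf conf = new JobConf(ModeCheck.class);

        // "local" means the LocalJobRunner is used and job IDs look like job_local...
        System.out.println("mapred.job.tracker = " + conf.get("mapred.job.tracker", "local"));

        // To run against the cluster, either put the cluster's core-site.xml and
        // mapred-site.xml on the project classpath, or set the addresses explicitly.
        // The host/port values below are placeholders, not taken from this thread.
        conf.set("fs.default.name", "hdfs://namenode-host:9000");
        conf.set("mapred.job.tracker", "jobtracker-host:9001");
    }
}

With the cluster configuration in place, the job ID reported at submission should switch from the job_local** form to the job_<...> form described above.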

Explorer

Yes, it is in local mode. My job ID is job_local295133331_0001. What should I do to resolve this issue?

Explorer

Thanks @Jitendra Yadav for addressing this issue. I created a new project and imported all the jars from the Hadoop common/lib directory. Earlier I had just set the path to the Hadoop directory jars, which was not working. The issue seems resolved for now.