
Submitting a MapReduce job from Win7 fails with ExitCodeException exitCode=1: /bin/bash: line 0: fg: no job control

Explorer

I am submitting a MapReduce job from Windows 7 to an HDP cluster (HDP 2.3.4.7-4, already installed). I use Eclipse with all the necessary jars imported, and the job fails with the following output:

2016-11-14 16:47:56,047 WARN [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-11-14 16:47:57,389 WARN [main] shortcircuit.DomainSocketFactory (DomainSocketFactory.java:<init>(117)) - The short-circuit local reads feature cannot be used because UNIX Domain sockets are not available on Windows.
2016-11-14 16:47:58,763 INFO [main] impl.TimelineClientImpl (TimelineClientImpl.java:serviceInit(352)) - Timeline service address: http://master.bmsoft.com:8188/ws/v1/timeline/
2016-11-14 16:47:59,029 INFO [main] client.RMProxy (RMProxy.java:createRMProxy(98)) - Connecting to ResourceManager at master.bmsoft.com/10.10.10.36:8050
2016-11-14 16:47:59,996 WARN [main] mapreduce.JobResourceUploader (JobResourceUploader.java:uploadFiles(64)) - Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
2016-11-14 16:48:00,043 WARN [main] mapreduce.JobResourceUploader (JobResourceUploader.java:uploadFiles(171)) - No job jar file set. User classes may not be found. See Job or Job#setJar(String).
2016-11-14 16:48:00,091 INFO [main] input.FileInputFormat (FileInputFormat.java:listStatus(283)) - Total input paths to process : 1
2016-11-14 16:48:00,512 INFO [main] mapreduce.JobSubmitter (JobSubmitter.java:submitJobInternal(198)) - number of splits:1
2016-11-14 16:48:00,871 INFO [main] mapreduce.JobSubmitter (JobSubmitter.java:printTokens(287)) - Submitting tokens for job: job_1479003632635_0030
2016-11-14 16:48:01,074 INFO [main] mapred.YARNRunner (YARNRunner.java:createApplicationSubmissionContext(371)) - Job jar is not present. Not adding any jar to the list of resources.
2016-11-14 16:48:01,402 INFO [main] impl.YarnClientImpl (YarnClientImpl.java:submitApplication(274)) - Submitted application application_1479003632635_0030
2016-11-14 16:48:01,449 INFO [main] mapreduce.Job (Job.java:submit(1294)) - The url to track the job: http://master.bmsoft.com:8088/proxy/application_1479003632635_0030/
2016-11-14 16:48:01,449 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1339)) - Running job: job_1479003632635_0030
2016-11-14 16:48:04,507 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1360)) - Job job_1479003632635_0030 running in uber mode : false
2016-11-14 16:48:04,507 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1367)) - map 0% reduce 0%
2016-11-14 16:48:04,539 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1380)) - Job job_1479003632635_0030 failed with state FAILED due to: Application application_1479003632635_0030 failed 2 times due to AM Container for appattempt_1479003632635_0030_000002 exited with exitCode: 1
For more detailed output, check application tracking page: http://master.bmsoft.com:8088/cluster/app/application_1479003632635_0030 Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_e28_1479003632635_0030_02_000001
Exit code: 1
Exception message: /bin/bash: line 0: fg: no job control
Stack trace: ExitCodeException exitCode=1: /bin/bash: line 0: fg: no job control
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:576)
    at org.apache.hadoop.util.Shell.run(Shell.java:487)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:753)
    at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:212)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.
2016-11-14 16:48:04,570 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1385)) - Counters: 0

1 ACCEPTED SOLUTION

Super Guru

@Mon key

In HDFS, reads normally go through the DataNode. Thus, when the client asks the DataNode to read a file, the DataNode reads that file off of the disk and sends the data to the client over a TCP socket. So-called “short-circuit” reads bypass the DataNode, allowing the client to read the file directly. Obviously, this is only possible in cases where the client is co-located with the data. Short-circuit reads provide a substantial performance boost to many applications.
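To make the mechanics concrete, here is a minimal read sketch (the class name and the path /tmp/example.txt are made up for illustration). Note that the client code is identical whether short-circuit reads are on or off; the feature is driven entirely by configuration:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsReadSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        try (FileSystem fs = FileSystem.get(conf);
             FSDataInputStream in = fs.open(new Path("/tmp/example.txt"));
             BufferedReader reader = new BufferedReader(new InputStreamReader(in))) {
            // Whether these bytes arrive over a TCP socket from the DataNode
            // or straight off the local disk (short-circuit) is decided by
            // client/DataNode configuration, not by this code.
            System.out.println(reader.readLine());
        }
    }
}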

To configure short-circuit local reads, you must enable libhadoop.so; see the Native Libraries documentation for details on enabling this library. Windows is not a supported OS for this feature, so you need to turn it off on your client and re-execute your job.
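If you want to turn the feature off explicitly on your Windows client rather than rely on cluster defaults, one way is a one-liner in the same style as the property suggested in the other reply (a sketch, assuming conf is the Configuration you build your job with):

// Illustrative: explicitly disable short-circuit local reads for this client.
// On Windows the feature cannot work anyway (no UNIX domain sockets).
conf.setBoolean("dfs.client.read.shortcircuit", false);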


2 REPLIES

Super Guru

@Mon key, could you please try to submit the job again after setting the following property in your configuration?

conf.set("mapreduce.app-submission.cross-platform", "true");
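As far as I know, this property tells the client to generate platform-neutral container launch commands, so a Windows client no longer writes Windows-style classpath variables into the bash script that launches the ApplicationMaster, which is what produces the "fg: no job control" error. For context, here is a sketch of how that property fits into a complete driver; the class name and jar path are assumptions, not your actual job, and the sketch also addresses the two warnings in your log (implement the Tool interface; no job jar set):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class MyJobDriver extends Configured implements Tool {
    @Override
    public int run(String[] args) throws Exception {
        Configuration conf = getConf();
        // Generate platform-neutral launch commands so a job submitted
        // from Windows works on Linux NodeManagers.
        conf.set("mapreduce.app-submission.cross-platform", "true");

        Job job = Job.getInstance(conf, "my-job");
        // Ship the job jar explicitly -- this fixes the "No job jar file set"
        // warning you get when submitting from an IDE like Eclipse.
        job.setJar("C:/path/to/my-job.jar"); // hypothetical local path
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        // Mapper/reducer setters omitted; Hadoop's identity defaults apply.
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner parses generic Hadoop options, fixing the
        // "Hadoop command-line option parsing not performed" warning.
        System.exit(ToolRunner.run(new MyJobDriver(), args));
    }
}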
