New Contributor
Posts: 2
Registered: 02-20-2018

Newer Quickstart VMs throw an error when running Hadoop jobs


I'm having a problem running Hadoop jobs with newer versions of the Quickstart VM. It used to be that I could download a new version of the VM, start it up, then fire up Eclipse and run my job. That no longer works: when I attempt to run any job, I get the following error:

java.io.IOException: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:120)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:82)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:75)
at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1277)
at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1273)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
at org.apache.hadoop.mapreduce.Job.connect(Job.java:1272)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1301)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1325)
at com.mmm.arc.machinelearning.common.job.TestClass.run(TestClass.java:47)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at com.mmm.arc.machinelearning.common.job.TestClass.main(TestClass.java:70)

I'm seeing this in versions 5.8 and 5.12, but not in 5.5. I was able to create a simple do-nothing MapReduce job that reproduces the issue (a sketch of it is below) and can provide the full project if needed.
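For reference, the job driver looks roughly like the sketch below. This is not my exact TestClass from the stack trace, just a minimal stand-in; the class name DoNothingJob, the quickstart.cloudera hostname, and the 8032/8020 ports are what I believe the Quickstart VM defaults to be, so adjust as needed.

// Minimal map-only job used to exercise job submission against the VM.
// Illustrative sketch only, not the original TestClass; hostname and ports
// are assumed Quickstart VM defaults.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class DoNothingJob extends Configured implements Tool {

    // Identity mapper: passes each input record straight through.
    public static class PassThroughMapper
            extends Mapper<LongWritable, Text, LongWritable, Text> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws java.io.IOException, InterruptedException {
            context.write(key, value);
        }
    }

    @Override
    public int run(String[] args) throws Exception {
        Configuration conf = getConf();

        // Without these properties (or equivalent *-site.xml files on the
        // classpath), Cluster.initialize() cannot resolve a framework and
        // throws the "Cannot initialize Cluster" IOException shown above.
        conf.set("mapreduce.framework.name", "yarn");
        conf.set("yarn.resourcemanager.address", "quickstart.cloudera:8032");
        conf.set("fs.defaultFS", "hdfs://quickstart.cloudera:8020");

        Job job = Job.getInstance(conf, "do-nothing-test");
        job.setJarByClass(DoNothingJob.class);
        job.setMapperClass(PassThroughMapper.class);
        job.setNumReduceTasks(0);  // map-only; just exercises job submission
        job.setOutputKeyClass(LongWritable.class);
        job.setOutputValueClass(Text.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new Configuration(), new DoNothingJob(), args));
    }
}

(The alternative I've tried is leaving the properties out of the code and putting the VM's /etc/hadoop/conf directory on the Eclipse project classpath instead; the result is the same error on 5.8 and 5.12.)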

Please let me know if anyone has an idea!


Re: Newer Quickstart VMs throw an error when running Hadoop jobs

Anyone?
