
Ok, well after I restarted HDFS via:

for x in `cd /etc/init.d ; ls hadoop-hdfs-*` ; do sudo service $x restart ; done

The proxy settings that I added to core-site.xml appeared to have kicked in and I could then run the example.
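
For anyone hitting the same issue: the proxy settings in question are the Hadoop proxyuser properties for the Oozie user. A quick way to confirm they actually made it into the config (the path below assumes a default CDH layout, and that oozie is the proxy user) is something like:

# Confirm the Oozie proxyuser properties landed in core-site.xml
# (path and user name are assumptions for a default CDH install)
grep -A 1 "hadoop.proxyuser.oozie" /etc/hadoop/conf/core-site.xml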

However, when I go to http://MY-VM-ALIAS:11000/oozie, the job's status immediately shows as KILLED.
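
As a side note, you don't have to use the web console for this; the Oozie command-line client can list recent jobs and their statuses as well (the URL is just my server alias from above):

# List the most recent workflow jobs and their statuses from the CLI
oozie jobs -oozie http://MY-VM-ALIAS:11000/oozie -len 5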

 

If I double-click on the job and then double-click on the action item named "fail", I can see that the error message is:

Map/Reduce failed, error message[RuntimeException: Error in configuring object]

Clicking on the Job Log tab, I saw this:

 

Caused by: java.lang.IllegalArgumentException: Compression codec com.hadoop.compression.lzo.LzoCodec not found.
	at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:134)
	at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:174)
	at org.apache.hadoop.mapred.TextInputFormat.configure(TextInputFormat.java:38)
	... 29 more
Caused by: java.lang.ClassNotFoundException: Class com.hadoop.compression.lzo.LzoCodec not found
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1680)
	at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:127)
	... 31 more

A Google search suggested that I "put the hadoop-lzo.jar in /var/lib/oozie/ and [restart] Oozie."
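
Before copying anything around, it's worth confirming that LzoCodec really is listed in the cluster's codec configuration (the config path below is an assumption based on a default CDH layout):

# Check whether LzoCodec is listed under io.compression.codecs
# (adjust the path if your config lives elsewhere)
grep -A 2 "io.compression.codecs" /etc/hadoop/conf/core-site.xml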

 

So I issued the following (on my master node, which runs the Oozie server):

 

# Locate the LZO jar that ships with the Hadoop packages
find / -name hadoop-lzo.jar 2>/dev/null
# Copy it into Oozie's lib directory (sudo, since /var/lib/oozie is typically not world-writable)
sudo cp /usr/lib/hadoop/lib/hadoop-lzo.jar /var/lib/oozie/
# Restart the Oozie server so it picks up the new jar
sudo service oozie restart

After that, my job ran and succeeded!
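
For what it's worth, you can confirm the final status from the command line as well; <job-id> below is a placeholder for the workflow ID shown in the Oozie console:

# Show the status of a specific workflow job
# (<job-id> is a placeholder; substitute the real workflow ID)
oozie job -oozie http://MY-VM-ALIAS:11000/oozie -info <job-id>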
