Oozie Hive workflow instances (MapReduce tasks) failing with class com.hadoop.compression.lzo.LzoCodec not found

I have a Hive workflow that inserts into table2 by selecting from table1; table2 is in ORC format. Every instance of the Hive workflow fails with a "Class com.hadoop.compression.lzo.LzoCodec not found" error. I am able to run the same insert statement directly from the Hive command prompt and it works fine (with both tez and mr as execution engines), so I can't figure out why it fails when run through Oozie. I have already tried several things: using ZLIB and SNAPPY as the compression format, copying the LZO jar files to /var/lib/oozie and restarting Oozie, and including the paths to these jars in the classpath and library path. I still get the same error. Any help is greatly appreciated.
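
For reference, here is a simplified sketch of the kind of statements involved. The column names are placeholders, not my real schema, and setting "orc.compress" at the table level is just one way of switching between ZLIB and SNAPPY:

-- Simplified sketch of the failing workflow's statements (real column list omitted).
-- table2 is an ORC table; "orc.compress" is one way to pick the ORC compression codec.
CREATE TABLE IF NOT EXISTS table2 (id INT, val STRING)
STORED AS ORC
TBLPROPERTIES ("orc.compress"="ZLIB");   -- also tried "SNAPPY"

INSERT INTO TABLE table2
SELECT id, val FROM table1;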

1 ACCEPTED SOLUTION

It worked for me after I modified the HDFS core-site.xml and removed com.hadoop.compression.lzo.LzoCodec and com.hadoop.compression.lzo.LzopCodec from the io.compression.codecs parameter. I also removed io.compression.codec.lzo.class (which was set to com.hadoop.compression.lzo.LzoCodec). Then I restarted MapReduce, YARN, Hive and Oozie.
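
For anyone hitting the same thing, the change was in the properties below. The non-LZO codec class names shown are just typical defaults and your io.compression.codecs value may list a different set; what mattered was dropping the two LZO entries and the io.compression.codec.lzo.class property:

<!-- Before: io.compression.codecs listed the LZO codecs (the other codecs shown are typical defaults). -->
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.SnappyCodec,com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec</value>
</property>
<property>
  <name>io.compression.codec.lzo.class</name>
  <value>com.hadoop.compression.lzo.LzoCodec</value>
</property>

<!-- After: the LZO entries are gone and io.compression.codec.lzo.class is removed entirely. -->
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
</property>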

5 REPLIES

Super Collaborator
@Aditya Telidevara

Are the LZO libraries installed on all nodes?

Execute the following command on all the nodes in your cluster (RHEL/CentOS/Oracle Linux):

yum install lzo lzo-devel hadooplzo hadooplzo-native
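
To double-check that the packages actually landed on every node, something like this works (the host names are placeholders and the exact package names can vary between HDP builds):

# Query the LZO packages on each node from one box; replace the host list with your own.
for h in node1 node2 node3; do
  ssh "$h" 'rpm -qa | grep -i hadooplzo'
done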

Yes, they are installed on all the nodes. rpm -ql hadooplzo_2_5_0_0_1245 and rpm -ql hadooplzo_2_5_0_0_1245-native give the expected output on all the nodes.

@Kuldeep Kulkarni

Can you help?

I have also copied hadoop-lzo-0.6.0.2.5.0.0-1245.jar to /usr/hdp/2.5.0.0-1245/oozie/share/lib/hive on the Oozie master, and the jar is also present in /user/oozie/share/lib/hive in HDFS. I can see in the MapReduce task logs that this jar gets copied, but I am still getting the LzoCodec not found error.
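
For reference, getting a jar picked up from the Oozie sharelib in HDFS generally looks like the steps below; the Oozie URL and the local jar path are placeholders rather than my exact values:

# Put the jar into the hive sharelib directory in HDFS (local path is a placeholder).
hdfs dfs -put -f /usr/hdp/2.5.0.0-1245/hadoop/lib/hadoop-lzo-0.6.0.2.5.0.0-1245.jar /user/oozie/share/lib/hive/

# Ask Oozie to re-scan the sharelib so new jobs see the new jar (URL is a placeholder).
oozie admin -oozie http://oozie-host:11000/oozie -sharelibupdate

# And make sure the workflow's job.properties opts into the system libpath:
# oozie.use.system.libpath=true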

Do any of the experts here know the root cause of this issue? What I did was a workaround to get things rolling, but it doesn't look like a meaningful solution.