
After upgrading to CDH 5.7.1 spark throws ClassNotFoundException



I ran a Spark ETL job via an Oozie action on CDH 5.6.0 and it worked well.

After upgrading to CDH 5.7.1, the same jar run through the same workflow fails with the following error:

“User class threw exception: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.hbase.mapreduce.TableOutputFormat not found”


I have also upgraded the dependency versions in pom.xml to the versions listed on the page "Using the CDH 5 Maven Repository", but it didn't help; I still get the same error.

In the pom.xml used to build the ETL jar we use the maven-shade-plugin, and the hbase-server dependency is not marked as provided, so the shaded jar contains the class TableOutputFormat (I'm aware this is not the recommended way).
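For reference, the relevant part of the pom.xml looks roughly like this (the plugin and HBase versions shown are illustrative, not necessarily my exact values):

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.4.3</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals><goal>shade</goal></goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>

<dependencies>
  <!-- No <scope>provided</scope> here, so hbase-server (and with it
       org.apache.hadoop.hbase.mapreduce.TableOutputFormat) is bundled
       into the shaded jar. -->
  <dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-server</artifactId>
    <version>1.2.0-cdh5.7.1</version>
  </dependency>
</dependencies>
```

Since the class is inside the shaded jar, I would expect it to be found regardless of what is on the cluster classpath.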

In addition, I copied the spark-submit parameters from the Oozie stdout log and created a .sh file that calls spark-submit with the same parameters, and it works fine.
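For illustration, the working script looks something like this (the jar path, main class, and resource settings are placeholders, not the exact values copied from the Oozie log):

```shell
#!/bin/sh
# Placeholder values; the real parameters come from the Oozie stdout log.
spark-submit \
  --master yarn-cluster \
  --class com.example.MyEtlJob \
  --num-executors 4 \
  --executor-memory 2g \
  /path/to/my-etl-shaded.jar
```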

Weird, since both are ultimately using spark-submit!


Any idea why running Spark through Oozie throws the ClassNotFoundException?



Warm Regards,

Meny Kobel