Support Questions


Getting a Spark class-not-found error when trying to run Cobertura

Explorer

Running the following Phoenix Spark commands in spark-shell (captured from test-harness logs on 2017-03-24) fails with a NoClassDefFoundError for Cobertura's TouchCollector class:

import org.apache.spark.SparkContext
import org.apache.spark.sql._
import org.apache.phoenix.spark._
val df = sqlContext.load("org.apache.phoenix.spark", Map("table" -> "INPUT_TABLE", "zkUrl" -> "MYHOSTNAME:2181:/hbase-unsecure"))
df.save("org.apache.phoenix.spark", SaveMode.Overwrite, Map("table" -> "OUTPUT_TABLE", "zkUrl" -> "MYHOSTNAME:2181:/hbase-unsecure"))

warning: there were 2 deprecation warning(s); re-run with -deprecation for details

java.lang.NoClassDefFoundError: net/sourceforge/cobertura/coveragedata/TouchCollector
    at org.apache.phoenix.spark.DefaultSource.__cobertura_init(DefaultSource.scala)
    at org.apache.phoenix.spark.DefaultSource.<clinit>(DefaultSource.scala)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at java.lang.Class.newInstance(Class.java:442)
    at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:152)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119)
    at org.apache.spark.sql.SQLContext.load(SQLContext.scala:1153)
    at $iwC$iwC$iwC$iwC$iwC$iwC$iwC$iwC.<init>(<console>:25)
    at $iwC$iwC$iwC$iwC$iwC$iwC$iwC.<init>(<console>:36)
    at $iwC$iwC$iwC$iwC$iwC$iwC.<init>(<console>:38)
    at $iwC$iwC$iwC$iwC$iwC.<init>(<console>:40)
    at $iwC$iwC$iwC$iwC.<init>(<console>:42)
    at $iwC$iwC$iwC.<init>(<console>:44)
    at $iwC$iwC.<init>(<console>:46)
    at $iwC.<init>(<console>:48)
    at <init>(<console>:50)
    at .<init>(<console>:54)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)

1 ACCEPTED SOLUTION

Guru

@Cheng Xu, can you please try adding the Cobertura jar to spark.driver.extraClassPath?
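For reference, a minimal launch sketch of that suggestion (the jar path and version below are placeholders; use the Cobertura runtime jar that matches the one used to instrument phoenix-spark):

# hypothetical path/version - point at the actual Cobertura runtime jar on the driver host
spark-shell \
  --conf spark.driver.extraClassPath=/path/to/cobertura-2.1.1.jar \
  --jars /path/to/cobertura-2.1.1.jar
# equivalently, --driver-class-path /path/to/cobertura-2.1.1.jar;
# if the instrumented classes also run on executors, spark.executor.extraClassPath may be needed as well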


2 REPLIES

Guru

@Cheng Xu, can you please try adding the Cobertura jar to spark.driver.extraClassPath?

Contributor
@Cheng Xu

Are you trying to instrument the Spark jars? In that case, note that Spark is written primarily in Scala, which is not supported by Cobertura. @yvora, please confirm.