Caused by: java.lang.NoClassDefFoundError: org/apache/htrace/Trace

Explorer

Despite adding the following to my spark-submit command,

--conf spark.driver.extraClassPath=/opt/cloudera/parcels/CDH/jars/htrace-core-3.2.0-incubating.jar:/opt/cloudera/parcels/CDH/lib/hive/conf:/opt/cloudera/parcels/CDH/lib/hive/lib/*.jar \
--conf spark.executor.extraClassPath=/opt/cloudera/parcels/CDH/jars/htrace-core-3.2.0-incubating.jar:/opt/cloudera/parcels/CDH/lib/hive/conf:/opt/cloudera/parcels/CDH/lib/hive/lib/*.jar \
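For reference, this is roughly what the full spark-submit invocation looks like with those settings; the application jar and main class below (myapp.jar, com.example.MyHBaseJob) are placeholders rather than the real job:

spark-submit \
  --master yarn \
  --class com.example.MyHBaseJob \
  --conf spark.driver.extraClassPath=/opt/cloudera/parcels/CDH/jars/htrace-core-3.2.0-incubating.jar:/opt/cloudera/parcels/CDH/lib/hive/conf:/opt/cloudera/parcels/CDH/lib/hive/lib/*.jar \
  --conf spark.executor.extraClassPath=/opt/cloudera/parcels/CDH/jars/htrace-core-3.2.0-incubating.jar:/opt/cloudera/parcels/CDH/lib/hive/conf:/opt/cloudera/parcels/CDH/lib/hive/lib/*.jar \
  myapp.jar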


I get the following exception:

Caused by: java.lang.NoClassDefFoundError: org/apache/htrace/Trace


I have checked the file and directory permissions and they look correct (the directories have read and execute permissions, and the files have 755 permissions).
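As a sanity check on one of the nodes, I also confirmed the class is actually packaged in that jar and that the file is readable at the path used in extraClassPath:

# confirm the jar is readable at the path used in extraClassPath
ls -l /opt/cloudera/parcels/CDH/jars/htrace-core-3.2.0-incubating.jar
# confirm the missing class is packaged inside it
unzip -l /opt/cloudera/parcels/CDH/jars/htrace-core-3.2.0-incubating.jar | grep org/apache/htrace/Trace.class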


Please let me know what needs to be changed.


1 ACCEPTED SOLUTION

Explorer

After making a small change to the location of the jar, we got it working. We added the HBase htrace jar to the executor classpath via the following setting:

1. Signed in to Cloudera Manager.
2. Went to the Spark on YARN service.
3. Went to the Configuration tab.
4. Typed "defaults" in the search box.
5. Selected Gateway in the Scope filter.
6. Added the entry:

spark.executor.extraClassPath=/hdfs03/parcels/CDH/lib/hbase/lib/htrace-core-3.2.0-incubating.jar


Note: we had to use the hdfs directory path.


5 REPLIES

Cloudera Employee

Hi Nandyal,


From the error it looks like a jar file is missing. Could you please share the entire stack trace for further analysis of this issue?


Thanks

AK

Explorer

Hey AK,

   Following is the stack trace:

10:13:28,194 WARN [TaskSetManager] Lost task 8.0 in stage 1.0 (TID 4, hostname.domain.com, executor 12): java.io.IOException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240)
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:218)
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119)
at org.apache.hadoop.hbase.spark.HBaseConnectionCache$$anonfun$getConnection$1.apply(HBaseConnectionCache.scala:121)
at org.apache.hadoop.hbase.spark.HBaseConnectionCache$$anonfun$getConnection$1.apply(HBaseConnectionCache.scala:121)
at org.apache.hadoop.hbase.spark.HBaseConnectionCache$$anonfun$1.apply(HBaseConnectionCache.scala:114)
at org.apache.hadoop.hbase.spark.HBaseConnectionCache$$anonfun$1.apply(HBaseConnectionCache.scala:113)
at scala.collection.mutable.MapLike$class.getOrElseUpdate(MapLike.scala:189)
at scala.collection.mutable.AbstractMap.getOrElseUpdate(Map.scala:91)
at org.apache.hadoop.hbase.spark.HBaseConnectionCache$.getConnection(HBaseConnectionCache.scala:113)
at org.apache.hadoop.hbase.spark.HBaseConnectionCache$.getConnection(HBaseConnectionCache.scala:121)
at org.apache.hadoop.hbase.spark.HBaseContext.org$apache$hadoop$hbase$spark$HBaseContext$$hbaseForeachPartition(HBaseContext.scala:459)
at org.apache.hadoop.hbase.spark.HBaseContext$$anonfun$bulkPut$1.apply(HBaseContext.scala:217)
at org.apache.hadoop.hbase.spark.HBaseContext$$anonfun$bulkPut$1.apply(HBaseContext.scala:217)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$33.apply(RDD.scala:920)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$33.apply(RDD.scala:920)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1888)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1888)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:89)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:242)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
... 23 more
Caused by: java.lang.NoClassDefFoundError: org/apache/htrace/Trace
at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:216)
at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:419)
at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:105)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.retrieveClusterId(ConnectionManager.java:935)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:659)
... 28 more
Caused by: java.lang.ClassNotFoundException: org.apache.htrace.Trace
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 34 more


Explorer

And the Trace class is present in htrace-core-3.2.0-incubating.jar, which is set in the spark.driver.extraClassPath and spark.executor.extraClassPath config variables at the time of spark-submit.
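One way to confirm what the driver and executors actually received is the Environment tab of the running application's Spark UI, which lists the effective spark.driver.extraClassPath and spark.executor.extraClassPath values. It can also be worth checking whether the client configuration deployed by Cloudera Manager sets these properties as well; the path below is the usual default on a CDH gateway host and may differ on a given install:

# check the deployed client configuration on the gateway host
grep extraClassPath /etc/spark/conf/spark-defaults.conf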


Thank you!

Nandyal

Explorer
I also tried adding the HBase jar to the executor classpath via Cloudera Manager, with no success:

1. Signed in to Cloudera Manager.
2. Went to the Spark on YARN service.
3. Went to the Configuration tab.
4. Typed "defaults" in the search box.
5. Selected Gateway in the Scope filter.
6. Added the entry:

spark.executor.extraClassPath=/opt/cloudera/parcels/CDH/lib/hbase/lib/htrace-core-3.2.0-incubating.jar
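For what it's worth, since spark.executor.extraClassPath is resolved locally on each executor host, one thing worth ruling out is whether that exact path exists and is readable on every NodeManager node, not just the edge node. A quick check (the hostnames are placeholders):

for host in worker01 worker02 worker03; do
  ssh "$host" ls -l /opt/cloudera/parcels/CDH/lib/hbase/lib/htrace-core-3.2.0-incubating.jar
done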

Explorer

After making a small change to the location of the jar, we got it working. We added the HBase htrace jar to the executor classpath via the following setting:

1. Signed in to Cloudera Manager.
2. Went to the Spark on YARN service.
3. Went to the Configuration tab.
4. Typed "defaults" in the search box.
5. Selected Gateway in the Scope filter.
6. Added the entry:

spark.executor.extraClassPath=/hdfs03/parcels/CDH/lib/hbase/lib/htrace-core-3.2.0-incubating.jar


Note: we had to use the hdfs directory path.
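One closing note for anyone following the same steps: after changing a gateway setting like this in Cloudera Manager, the client configuration generally has to be redeployed before spark-submit picks it up, and it is worth confirming the jar is visible at the new path on the executor hosts before re-running (the hostnames below are placeholders):

for host in worker01 worker02 worker03; do
  ssh "$host" ls -l /hdfs03/parcels/CDH/lib/hbase/lib/htrace-core-3.2.0-incubating.jar
done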