Created 01-11-2016 08:38 PM
I am running the following:
$ run-example HBaseTest <tablename>
But I keep getting the following error message:
16/01/11 20:36:00 INFO BlockManagerMaster: Registered BlockManager
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
    at org.apache.spark.examples.HBaseTest$.main(HBaseTest.scala:31)
    at org.apache.spark.examples.HBaseTest.main(HBaseTest.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 11 more
I have tried adding hbase-<version>-client.jar to both the Hadoop and Spark classpaths, but to no avail.
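Roughly what that looked like (a sketch; /usr/lib/hbase/lib is an assumption for where the jar lives on this install, and <version> stands in for the actual HBase version):

# add the HBase client jar to both classpaths before launching the example
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/lib/hbase/lib/hbase-<version>-client.jar
export SPARK_CLASSPATH=$SPARK_CLASSPATH:/usr/lib/hbase/lib/hbase-<version>-client.jar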
Created 01-12-2016 05:52 PM
Have you tried on HDP 2.3.4? Not sure if it's related, but there was an issue with the phoenix-spark connector in earlier 2.3 minor versions due to PHOENIX-2040 being missed (here is the internal jira).
Created 01-12-2016 03:21 AM
You have to add the HBase jars to the Hadoop classpath.
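For example, one way to do that (a sketch; assumes the hbase command is on the PATH so it can print its own classpath):

# pull the full HBase classpath into Hadoop's classpath
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$(hbase classpath)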
Created 01-12-2016 03:39 AM
Hi Neeraj, I did not mention it, but I tried $HADOOP_CLASSPATH as well as $SPARK_CLASSPATH. It did not work.
Created 01-18-2016 01:14 PM
I think this should make it work:
export SPARK_CLASSPATH=/usr/lib/hbase/lib/hbase-protocol.jar:/etc/hbase/conf:$(hbase classpath)

Or add the HBase dependencies at spark-submit time with --driver-class-path <hbase-dependency-jars>.
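A rough sketch of the second approach (the spark-examples jar location is an assumption for a typical HDP layout; adjust paths to your install):

# run the HBaseTest example with the HBase conf dir and jars on the driver classpath
spark-submit \
  --class org.apache.spark.examples.HBaseTest \
  --driver-class-path "/etc/hbase/conf:$(hbase classpath)" \
  /usr/hdp/current/spark-client/lib/spark-examples-*.jar <tablename>

Note that --driver-class-path only affects the driver; if the executors also need the HBase classes, the same entries would have to go into spark.executor.extraClassPath as well.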
Created 02-02-2016 02:13 AM
@Vedant Jain can you accept the best answer to close this thread?