
Spark 1.4.1 and HBase 1.1.2 Integration

Guru

I am running the following:

$ run-example HBaseTest <tablename>

But I keep getting the following error message:

16/01/11 20:36:00 INFO BlockManagerMaster: Registered BlockManager
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
	at org.apache.spark.examples.HBaseTest$.main(HBaseTest.scala:31)
	at org.apache.spark.examples.HBaseTest.main(HBaseTest.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	... 11 more

I have tried adding the hbase-<version>-client.jar to both the Hadoop and Spark classpaths, but to no avail.

1 ACCEPTED SOLUTION


Have you tried HDP 2.3.4? Not sure if it's related, but there was an issue with the phoenix-spark connector in previous 2.3 minor versions due to PHOENIX-2040 being missed (here is the internal jira).

View solution in original post

6 REPLIES

Master Mentor
@Vedant Jain

You have to add the HBase jars to the Hadoop classpath, e.g., something like the sketch below.
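For example, something along these lines (the /usr/hdp/current paths are typical HDP locations and only a sketch; adjust for your HBase version and install layout):

export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/hdp/current/hbase-client/lib/*:/etc/hbase/conf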

Guru

Hi Neeraj, I did not mention it, but I tried $HADOOP_CLASSPATH as well as $SPARK_CLASSPATH. It did not work.

Master Mentor
@Vedant Jain

Looking at this example... I am not sure whether you exported the hbase-site.xml file (see the sketch after the link below).

Link
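For instance, something along these lines (assuming the default HDP config directory /etc/hbase/conf; adjust if your hbase-site.xml lives elsewhere):

export SPARK_CLASSPATH=/etc/hbase/conf:$SPARK_CLASSPATH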


Have you tried HDP 2.3.4? Not sure if it's related, but there was an issue with the phoenix-spark connector in previous 2.3 minor versions due to PHOENIX-2040 being missed (here is the internal jira).


I think this should make it work:

export SPARK_CLASSPATH=/usr/lib/hbase/lib/hbase-protocol.jar:/etc/hbase/conf:$(hbase classpath)

or you can add the HBase dependency with --driver-class-path <hbase-dependency> when calling spark-submit.
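For instance, a spark-submit invocation along these lines (the jar names and /usr/hdp/current paths are illustrative and depend on your HDP, Spark, and HBase versions; additional HBase jars may be needed):

spark-submit --class org.apache.spark.examples.HBaseTest \
  --driver-class-path /usr/hdp/current/hbase-client/lib/hbase-client.jar:/usr/hdp/current/hbase-client/lib/hbase-common.jar:/usr/hdp/current/hbase-client/lib/hbase-protocol.jar:/etc/hbase/conf \
  /usr/hdp/current/spark-client/lib/spark-examples*.jar <tablename>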

Master Mentor

@Vedant Jain can you accept the best answer to close this thread?