Support Questions

Error: Unable to find org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory when running Spark job against HBase

I am running a Spark job on HDP 2.5 against an HBase table, but it fails with the error below. I've tried a few different ways to include the ServerRpcControllerFactory library in my pom, and I also tried moving jars around, but with no luck. Does anyone have suggestions for including ServerRpcControllerFactory in my project? Thanks!

Exception in thread "main" java.io.IOException: java.lang.reflect.InvocationTargetException
	at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240)
	at org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:420)
	at org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:413)
	at org.apache.hadoop.hbase.client.ConnectionManager.getConnectionInternal(ConnectionManager.java:291)
	at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:184)
	at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:150)
	at com.github.zaratsian.SparkHBase.SparkHBaseBulkLoad$.main(SparkHBaseBulkLoad.scala:80)
	at com.github.zaratsian.SparkHBase.SparkHBaseBulkLoad.main(SparkHBaseBulkLoad.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
	at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
	... 16 more
Caused by: java.lang.UnsupportedOperationException: Unable to find org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory
	at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:36)
	at org.apache.hadoop.hbase.ipc.RpcControllerFactory.instantiate(RpcControllerFactory.java:58)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.createAsyncProcess(ConnectionManager.java:2242)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:690)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:630)
	... 21 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:264)
	at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:32)
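The root cause at the bottom of the trace is a plain ClassNotFoundException: HBase's ReflectionUtils instantiates the configured RPC controller factory by its class name, so if no jar on the classpath provides that class, the lookup fails before the connection is even created. A minimal, self-contained Java sketch of that reflective lookup (the class name is taken from the trace; the demo class itself is hypothetical, not HBase code):

```java
public class ReflectiveLoadDemo {

    // Try to load a class by name, the way HBase's ReflectionUtils does
    // when instantiating the configured RpcControllerFactory.
    static String tryLoad(String className) {
        try {
            Class.forName(className);
            return "loaded " + className;
        } catch (ClassNotFoundException e) {
            // This is the failure mode behind the UnsupportedOperationException
            // in the stack trace: the class simply isn't on the classpath.
            return "Unable to find " + className;
        }
    }

    public static void main(String[] args) {
        // Without the Phoenix client jar on the classpath, this prints
        // the same "Unable to find ..." message seen in the trace.
        System.out.println(tryLoad(
            "org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory"));
    }
}
```

This is why the fix is purely a classpath change: no code in the Spark job refers to the class directly; it is resolved by name at runtime.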
1 ACCEPTED SOLUTION


Re: Error: Unable to find org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory when running Spark job against HBase

Despite the package name, ServerRpcControllerFactory is actually a class from Apache Phoenix, not HBase. Try adding /usr/hdp/current/phoenix-client/phoenix-client.jar to your classpath.
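If you want to confirm which jar actually ships the class, listing the jar's entries is a quick check (the path below assumes a stock HDP 2.5 layout):

```shell
# Look for the factory class inside the Phoenix client jar
# (the /usr/hdp/current path is an HDP-specific assumption)
unzip -l /usr/hdp/current/phoenix-client/phoenix-client.jar | grep ServerRpcControllerFactory
```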


2 REPLIES


Re: Error: Unable to find org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory when running Spark job against HBase

That worked great, thanks @Josh Elser! I was not looking at the correct jar. As a reference for others, here's the spark-submit command that worked for me (note: there are multiple ways to add the jar to the classpath):

spark-submit --class com.github.zaratsian.SparkHBase.SparkHBaseBulkLoad \
  --jars /tmp/SparkHBaseExample-0.0.1-SNAPSHOT.jar \
  /usr/hdp/current/phoenix-client/phoenix-client.jar \
  /tmp/props
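Since the original question mentioned trying to pull the class in through the pom: Phoenix can also be declared as a Maven dependency instead of pointing at the HDP jar at submit time. A hedged sketch (the coordinates and version are assumptions; pin them to the Phoenix build shipped with your HDP 2.5 stack):

```xml
<!-- Illustrative coordinates; match the version to your HDP Phoenix build.
     "provided" scope keeps the jar out of your fat jar if the cluster supplies it. -->
<dependency>
  <groupId>org.apache.phoenix</groupId>
  <artifactId>phoenix-core</artifactId>
  <version>4.7.0-HBase-1.1</version>
  <scope>provided</scope>
</dependency>
```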