
ERROR PHOENIX on SPARK

Master Collaborator

Hi:

I have HDP 2.4.0.0 and I am trying to use Phoenix 4.4.0 on Spark 1.6.0.

I am using the following classpath, but I am getting the error below. Do I need anything else?

Traceback (most recent call last):
  File "<stdin>", line 5, in <module>
  File "/usr/hdp/2.4.0.0-169/spark/python/pyspark/sql/readwriter.py", line 385, in save
    self._jwrite.save()
  File "/usr/hdp/2.4.0.0-169/spark/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 813, in __call__
  File "/usr/hdp/2.4.0.0-169/spark/python/pyspark/sql/utils.py", line 45, in deco
    return f(*a, **kw)
  File "/usr/hdp/2.4.0.0-169/spark/python/lib/py4j-0.9-src.zip/py4j/protocol.py", line 308, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o67.save.
: java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/regionserver/ConstantSizeRegionSplitPolicy
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.regionserver.ConstantSizeRegionSplitPolicy

Driver and executor classpath I am using:

/usr/hdp/2.4.0.0-169/phoenix/lib/phoenix-spark-4.4.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/phoenix/lib/hbase-client.jar:/usr/hdp/2.4.0.0-169/phoenix/lib/hbase-common.jar:/usr/hdp/2.4.0.0-169/phoenix/lib/phoenix-core-4.4.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/phoenix/lib/hbase-protocol.jar
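
For reference, the failing save() in the traceback corresponds to a Phoenix DataFrame write along these lines. This is only a sketch, not the original code: the app name, table name, columns, and zkUrl are placeholders.

# Minimal sketch of a Phoenix write via the phoenix-spark plugin.
# EXAMPLE_TABLE, the columns, and the zkUrl are hypothetical placeholders.
from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext(appName="phoenix-write-example")
sqlContext = SQLContext(sc)

df = sqlContext.createDataFrame([(1, "a"), (2, "b")], ["ID", "COL1"])

# The phoenix-spark plugin is selected via format("org.apache.phoenix.spark").
# This save() is the call that fails with NoClassDefFoundError when
# hbase-server.jar is missing from the driver/executor classpath.
df.write \
    .format("org.apache.phoenix.spark") \
    .mode("overwrite") \
    .option("table", "EXAMPLE_TABLE") \
    .option("zkUrl", "zookeeper-host:2181:/hbase-unsecure") \
    .save()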

Regards

1 ACCEPTED SOLUTION


The class comes from hbase-server.jar. You can add /usr/hdp/2.4.0.0-169/phoenix/lib/hbase-server.jar to the classpath and try again.
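
For example, if that classpath is supplied through Spark's spark.driver.extraClassPath and spark.executor.extraClassPath properties (a sketch; the post does not say exactly how the classpath is set), the jar would be appended like this:

spark.driver.extraClassPath   <existing phoenix/hbase jars>:/usr/hdp/2.4.0.0-169/phoenix/lib/hbase-server.jar
spark.executor.extraClassPath <existing phoenix/hbase jars>:/usr/hdp/2.4.0.0-169/phoenix/lib/hbase-server.jar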


Master Collaborator

Hi:

In the end, the jar I needed was /usr/hdp/current/hbase-client/lib/hbase-server.jar.
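
In other words, the classpath from the question works once that jar is appended, for example (assuming it is set via spark.driver.extraClassPath, as sketched above, with the same value on spark.executor.extraClassPath):

spark.driver.extraClassPath /usr/hdp/2.4.0.0-169/phoenix/lib/phoenix-spark-4.4.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/phoenix/lib/hbase-client.jar:/usr/hdp/2.4.0.0-169/phoenix/lib/hbase-common.jar:/usr/hdp/2.4.0.0-169/phoenix/lib/phoenix-core-4.4.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/phoenix/lib/hbase-protocol.jar:/usr/hdp/current/hbase-client/lib/hbase-server.jar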

Regards