Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.

When running Spark job getting Error: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Scan.setCacheBlocks(Z)V

New Member

I'm running a Spark job that does an HBase scan. However, it fails with the error: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Scan.setCacheBlocks(Z)V

From what I've read, this error is caused by a version mismatch between hbase-client.jar and the HBase version. However, I used only HDP-compiled jars.

My HDP version is 2.4.3.0.

I submit the job the following way:

export HADOOP_CONF_DIR=/etc/hadoop/conf/
export SPARK_CONF_DIR=/etc/spark/conf

/usr/hdp/current/spark-client/bin/spark-submit \
    --class MyClass \
    --master yarn-cluster \
    --num-executors 4 \
    --driver-memory 1g \
    --executor-memory 4g \
    --executor-cores 6 \
    --conf spark.driver.cores=6 \
    --conf spark.storage.memoryFraction=0.8 \
    --conf spark.shuffle.memoryFraction=0.1 \
    --conf spark.yarn.jar=/usr/hdp/current/spark-client/lib/spark-hdp-assembly.jar \
    --conf spark.yarn.executor.memoryOverhead=2048 \
    --conf spark.akka.frameSize=100 \
    --conf spark.driver.extraJavaOptions="-Xss10m -XX:MaxPermSize=512M -XX:+CMSPermGenSweepingEnabled -XX:+CMSClassUnloadingEnabled -XX:+UseConcMarkSweepGC" \
    --conf spark.executor.extraJavaOptions="-Xss10m -XX:MaxPermSize=512M -XX:+CMSPermGenSweepingEnabled -XX:+CMSClassUnloadingEnabled -XX:+UseConcMarkSweepGC" \
    --jars /usr/hdp/current/hive-client/lib/hive-common.jar,/usr/hdp/current/hive-client/lib/hive-hbase-handler.jar,/usr/hdp/current/hbase-client/lib/hbase-common.jar,/usr/hdp/current/hbase-client/lib/hbase-server.jar,/usr/hdp/current/hbase-client/lib/hbase-client.jar,/usr/hdp/current/hbase-client/lib/hbase-procedure.jar,/usr/hdp/current/hbase-client/lib/htrace-core-3.1.0-incubating.jar,/usr/hdp/current/spark-client/lib/datanucleus-api-jdo-3.2.6.jar,/usr/hdp/current/spark-client/lib/datanucleus-core-3.2.10.jar,/usr/hdp/current/spark-client/lib/datanucleus-rdbms-3.2.9.jar,hdfs://mycluster:8020/lib/java/dependencies/mysql-connector-java-5.0.8-bin.jar \
    hdfs://mycluster:8020/lib/scala/myjar.jar

(Note: the `--jars` list must be comma-separated with no spaces, or spark-submit will treat each subsequent jar as the application jar.)
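A common way to debug a NoSuchMethodError like this is to find every jar on the classpath that bundles the missing class, so you can spot an older duplicate copy. A minimal sketch, assuming `unzip` is available; the helper name is made up and the example directory is the HDP path from the command above (adjust it to your cluster):

```shell
# Print every jar under a directory that contains a given class entry.
# Useful for spotting a second (older) copy of hbase-client on the classpath.
find_jars_with_class() {
  dir=$1    # directory containing .jar files
  entry=$2  # class entry path, e.g. org/apache/hadoop/hbase/client/Scan.class
  for j in "$dir"/*.jar; do
    [ -f "$j" ] || continue
    if unzip -l "$j" 2>/dev/null | grep -q "$entry"; then
      echo "$j"
    fi
  done
}

# Example: which jars under the HDP hbase-client lib dir provide Scan?
find_jars_with_class /usr/hdp/current/hbase-client/lib \
    'org/apache/hadoop/hbase/client/Scan.class'
```

Running the same check against the application jar and any fat jars it was built from narrows down where the conflicting Scan class is coming from.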

1 ACCEPTED SOLUTION

New Member

The compilation environment for myjar.jar included an old Phoenix jar that bundled hbase-client-2.6.jar.

After removing it and compiling a new jar, the error was fixed.
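To catch this kind of regression before submitting, you could check the rebuilt application jar for bundled HBase classes. A sketch, assuming `unzip` is available; the helper name is made up, and `myjar.jar` refers to the jar from this thread:

```shell
# Succeeds (exit 0) only if the jar contains no org/apache/hadoop/hbase
# entries, i.e. HBase client classes will come from the cluster's HDP jars.
check_no_bundled_hbase() {
  if unzip -l "$1" 2>/dev/null | grep -q 'org/apache/hadoop/hbase'; then
    echo "WARNING: $1 still bundles HBase classes"
    return 1
  fi
  echo "OK: $1 does not bundle HBase classes"
}

# Example usage:
# check_no_bundled_hbase myjar.jar
```

This makes the fix above verifiable: if a fat dependency (like the old Phoenix jar) sneaks HBase classes back into the build, the check fails.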

