<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question When running Spark job getting Error: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Scan.setCacheBlocks(Z)V in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/When-running-Spark-job-getting-Error-java-lang/m-p/143940#M52421</link>
    <description>&lt;P&gt;I'm running a Spark job that does an HBase scan. However, I get the error java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Scan.setCacheBlocks(Z)V&lt;/P&gt;&lt;P&gt;From what I looked up, this is caused by a version mismatch between hbase-client.jar and the HBase version. However, I used only HDP-compiled jars.&lt;/P&gt;&lt;P&gt;My HDP version is 2.4.3.0.&lt;/P&gt;&lt;P&gt;I submit the job the following way:&lt;/P&gt;&lt;P&gt;export HADOOP_CONF_DIR=/etc/hadoop/conf/&lt;/P&gt;&lt;P&gt;
export SPARK_CONF_DIR=/etc/spark/conf &lt;/P&gt;&lt;P&gt;/usr/hdp/current/spark-client/bin/spark-submit &lt;/P&gt;&lt;P&gt;
--class MyClass&lt;/P&gt;&lt;P&gt;
--master yarn-cluster &lt;/P&gt;&lt;P&gt;
--num-executors 4 &lt;/P&gt;&lt;P&gt;
--driver-memory 1g &lt;/P&gt;&lt;P&gt;
--executor-memory 4g &lt;/P&gt;&lt;P&gt;
--executor-cores 6 &lt;/P&gt;&lt;P&gt; 
--conf spark.driver.cores=6 &lt;/P&gt;&lt;P&gt;
--conf spark.storage.memoryFraction=0.8 &lt;/P&gt;&lt;P&gt;
--conf spark.shuffle.memoryFraction=0.1 &lt;/P&gt;&lt;P&gt;
--conf spark.yarn.jar=/usr/hdp/current/spark-client/lib/spark-hdp-assembly.jar &lt;/P&gt;&lt;P&gt;
--conf spark.yarn.executor.memoryOverhead=2048 &lt;/P&gt;&lt;P&gt;
--conf spark.akka.frameSize=100 &lt;/P&gt;&lt;P&gt;
--conf spark.driver.extraJavaOptions="-Xss10m -XX:MaxPermSize=512M -XX:+CMSPermGenSweepingEnabled -XX:+CMSClassUnloadingEnabled -XX:+UseConcMarkSweepGC " &lt;/P&gt;&lt;P&gt;--conf spark.executor.extraJavaOptions="-Xss10m -XX:MaxPermSize=512M -XX:+CMSPermGenSweepingEnabled -XX:+CMSClassUnloadingEnabled -XX:+UseConcMarkSweepGC " &lt;/P&gt;&lt;P&gt;
--jars /usr/hdp/current/hive-client/lib/hive-common.jar,
/usr/hdp/current/hive-client/lib/hive-hbase-handler.jar,
/usr/hdp/current/hbase-client/lib/hbase-common.jar,
/usr/hdp/current/hbase-client/lib/hbase-server.jar,
/usr/hdp/current/hbase-client/lib/hbase-client.jar,
/usr/hdp/current/hbase-client/lib/hbase-procedure.jar,
/usr/hdp/current/hbase-client/lib/htrace-core-3.1.0-incubating.jar,
/usr/hdp/current/spark-client/lib/datanucleus-api-jdo-3.2.6.jar,
/usr/hdp/current/spark-client/lib/datanucleus-core-3.2.10.jar,
/usr/hdp/current/spark-client/lib/datanucleus-rdbms-3.2.9.jar,
hdfs://mycluster:8020/lib/java/dependencies/mysql-connector-java-5.0.8-bin.jar &lt;/P&gt;&lt;P&gt;
hdfs://mycluster:8020/lib/scala/myjar.jar
&lt;/P&gt;</description>
    <pubDate>Tue, 24 Jan 2017 19:20:53 GMT</pubDate>
    <dc:creator>ran</dc:creator>
    <dc:date>2017-01-24T19:20:53Z</dc:date>
    <item>
      <title>When running Spark job getting Error: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Scan.setCacheBlocks(Z)V</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/When-running-Spark-job-getting-Error-java-lang/m-p/143940#M52421</link>
      <description>&lt;P&gt;I'm running a Spark job that does an HBase scan. However, I get the error java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Scan.setCacheBlocks(Z)V&lt;/P&gt;&lt;P&gt;From what I looked up, this is caused by a version mismatch between hbase-client.jar and the HBase version. However, I used only HDP-compiled jars.&lt;/P&gt;&lt;P&gt;My HDP version is 2.4.3.0.&lt;/P&gt;&lt;P&gt;I submit the job the following way:&lt;/P&gt;&lt;P&gt;export HADOOP_CONF_DIR=/etc/hadoop/conf/&lt;/P&gt;&lt;P&gt;
export SPARK_CONF_DIR=/etc/spark/conf &lt;/P&gt;&lt;P&gt;/usr/hdp/current/spark-client/bin/spark-submit &lt;/P&gt;&lt;P&gt;
--class MyClass&lt;/P&gt;&lt;P&gt;
--master yarn-cluster &lt;/P&gt;&lt;P&gt;
--num-executors 4 &lt;/P&gt;&lt;P&gt;
--driver-memory 1g &lt;/P&gt;&lt;P&gt;
--executor-memory 4g &lt;/P&gt;&lt;P&gt;
--executor-cores 6 &lt;/P&gt;&lt;P&gt; 
--conf spark.driver.cores=6 &lt;/P&gt;&lt;P&gt;
--conf spark.storage.memoryFraction=0.8 &lt;/P&gt;&lt;P&gt;
--conf spark.shuffle.memoryFraction=0.1 &lt;/P&gt;&lt;P&gt;
--conf spark.yarn.jar=/usr/hdp/current/spark-client/lib/spark-hdp-assembly.jar &lt;/P&gt;&lt;P&gt;
--conf spark.yarn.executor.memoryOverhead=2048 &lt;/P&gt;&lt;P&gt;
--conf spark.akka.frameSize=100 &lt;/P&gt;&lt;P&gt;
--conf spark.driver.extraJavaOptions="-Xss10m -XX:MaxPermSize=512M -XX:+CMSPermGenSweepingEnabled -XX:+CMSClassUnloadingEnabled -XX:+UseConcMarkSweepGC " &lt;/P&gt;&lt;P&gt;--conf spark.executor.extraJavaOptions="-Xss10m -XX:MaxPermSize=512M -XX:+CMSPermGenSweepingEnabled -XX:+CMSClassUnloadingEnabled -XX:+UseConcMarkSweepGC " &lt;/P&gt;&lt;P&gt;
--jars /usr/hdp/current/hive-client/lib/hive-common.jar,
/usr/hdp/current/hive-client/lib/hive-hbase-handler.jar,
/usr/hdp/current/hbase-client/lib/hbase-common.jar,
/usr/hdp/current/hbase-client/lib/hbase-server.jar,
/usr/hdp/current/hbase-client/lib/hbase-client.jar,
/usr/hdp/current/hbase-client/lib/hbase-procedure.jar,
/usr/hdp/current/hbase-client/lib/htrace-core-3.1.0-incubating.jar,
/usr/hdp/current/spark-client/lib/datanucleus-api-jdo-3.2.6.jar,
/usr/hdp/current/spark-client/lib/datanucleus-core-3.2.10.jar,
/usr/hdp/current/spark-client/lib/datanucleus-rdbms-3.2.9.jar,
hdfs://mycluster:8020/lib/java/dependencies/mysql-connector-java-5.0.8-bin.jar &lt;/P&gt;&lt;P&gt;
hdfs://mycluster:8020/lib/scala/myjar.jar
&lt;/P&gt;</description>
      <pubDate>Tue, 24 Jan 2017 19:20:53 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/When-running-Spark-job-getting-Error-java-lang/m-p/143940#M52421</guid>
      <dc:creator>ran</dc:creator>
      <dc:date>2017-01-24T19:20:53Z</dc:date>
    </item>
    <item>
      <title>Re: When running Spark job getting Error: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Scan.setCacheBlocks(Z)V</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/When-running-Spark-job-getting-Error-java-lang/m-p/143941#M52422</link>
      <description>&lt;P&gt;The compilation environment for myjar.jar contained an old Phoenix jar that bundled hbase-client-2.6.jar.&lt;/P&gt;&lt;P&gt;After removing it and compiling a new jar, the error was fixed.&lt;/P&gt;</description>
      <pubDate>Tue, 24 Jan 2017 23:04:17 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/When-running-Spark-job-getting-Error-java-lang/m-p/143941#M52422</guid>
      <dc:creator>ran</dc:creator>
      <dc:date>2017-01-24T23:04:17Z</dc:date>
    </item>
  </channel>
</rss>

