[Error]: Accessing HBase table with Spark's HiveContext



Hi,

I have registered a Hive external table on an HBase table.
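For context, such a Hive-over-HBase mapping is typically registered with a statement along these lines; the table name `test` matches the query further down, but the column names and HBase column family here are hypothetical:

```shell
# Sketch of registering a Hive external table backed by an HBase table.
# Column names and the "cf" column family are placeholders.
hive -e '
CREATE EXTERNAL TABLE test (key string, value string)
STORED BY "org.apache.hadoop.hive.hbase.HBaseStorageHandler"
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf:value")
TBLPROPERTIES ("hbase.table.name" = "test");'
```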

When I try to access it through HiveContext, I get the error below:

org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, ip-172-31-29-201.ap-southeast-1.compute.internal): java.lang.RuntimeException: hbase-default.xml file seems to be for an older version of HBase (null), this version is 1.1.2.2.3.4.0-3485
        at org.apache.hadoop.hbase.HBaseConfiguration.checkDefaultsVersion(HBaseConfiguration.java:71)
        at org.apache.hadoop.hbase.HBaseConfiguration.addHbaseResources(HBaseConfiguration.java:81)

I have already placed hbase-default.xml and hbase-site.xml in spark/conf, with the property below set to true:

 <property>
    <name>hbase.defaults.for.version.skip</name>
    <value>true</value>
    <description>Set to true to skip the 'hbase.defaults.for.version' check.
    Setting this to true can be useful in contexts other than
    the other side of a maven generation; i.e. running in an
    IDE.  You'll want to set this boolean to true to avoid
    seeing the RuntimeException complaint: "hbase-default.xml file
    seems to be for an old version of HBase (\${hbase.version}), this
    version is X.X.X-SNAPSHOT"</description>
  </property>

Spark code :

import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)
val df = hiveContext.sql("select * from test")
df.show()

I am adding these jars when starting the Spark shell:

/usr/hdp/2.3.4.0-3485/hive/lib/guava-14.0.1.jar
/usr/hdp/2.3.4.0-3485/hive/lib/hive-hbase-handler-1.2.1.2.3.4.0-3485.jar
/usr/hdp/2.3.4.0-3485/hive/lib/htrace-core-3.1.0-incubating.jar
/usr/hdp/2.3.4.0-3485/hive/lib/zookeeper-3.4.6.2.3.4.0-3485.jar
/usr/hdp/2.3.4.0-3485/hbase/lib/hbase-client-1.1.2.2.3.4.0-3485.jar
/usr/hdp/2.3.4.0-3485/hbase/lib/hbase-common-1.1.2.2.3.4.0-3485.jar
/usr/hdp/2.3.4.0-3485/hbase/lib/hbase-protocol-1.1.2.2.3.4.0-3485.jar
/usr/hdp/2.3.4.0-3485/hbase/lib/hbase-server-1.1.2.2.3.4.0-3485.jar
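For reference, a sketch of how this jar list is passed when launching the shell (paths taken from the HDP 2.3.4.0-3485 layout above; spark-shell's --jars argument takes a comma-separated list):

```shell
# Launch spark-shell with the Hive/HBase integration jars on the driver
# and executor classpaths. Paths follow the HDP 2.3.4.0-3485 layout above.
spark-shell --jars \
/usr/hdp/2.3.4.0-3485/hive/lib/guava-14.0.1.jar,\
/usr/hdp/2.3.4.0-3485/hive/lib/hive-hbase-handler-1.2.1.2.3.4.0-3485.jar,\
/usr/hdp/2.3.4.0-3485/hive/lib/htrace-core-3.1.0-incubating.jar,\
/usr/hdp/2.3.4.0-3485/hive/lib/zookeeper-3.4.6.2.3.4.0-3485.jar,\
/usr/hdp/2.3.4.0-3485/hbase/lib/hbase-client-1.1.2.2.3.4.0-3485.jar,\
/usr/hdp/2.3.4.0-3485/hbase/lib/hbase-common-1.1.2.2.3.4.0-3485.jar,\
/usr/hdp/2.3.4.0-3485/hbase/lib/hbase-protocol-1.1.2.2.3.4.0-3485.jar,\
/usr/hdp/2.3.4.0-3485/hbase/lib/hbase-server-1.1.2.2.3.4.0-3485.jar
```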
1 ACCEPTED SOLUTION


Re: [Error]: Accessing HBase table with Spark's HiveContext

I was able to resolve it; one of the required jar files was missing:

hbase-hadoop-compat.jar
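In other words, the missing jar just needs to be appended to the --jars list when starting the shell. A sketch, assuming the HDP 2.3.4.0-3485 layout used above; the exact versioned file name on a given cluster may differ, so locate it first:

```shell
# Find the actual compat jar name under hbase/lib (version suffix varies):
ls /usr/hdp/2.3.4.0-3485/hbase/lib/ | grep hbase-hadoop-compat
# Then append it to the existing --jars list, e.g. (hypothetical file name):
# spark-shell --jars <jars listed above>,/usr/hdp/2.3.4.0-3485/hbase/lib/hbase-hadoop-compat-1.1.2.2.3.4.0-3485.jar
```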
