
Getting exceptions while exporting HBase table data to CSV file

Super Collaborator

I am using the HDP 2.4 sandbox and the HBase table Export technique, which creates a SequenceFile and puts it in HDFS. After this, I created a Hive table stored as SEQUENCEFILE, but it gives me the exception below when I load the SequenceFile data into this Hive table:


WritableName can't load class:

The plan is to create one more Hive table with a comma delimiter specified; this table would select the data from the first Hive table. Please let me know if you have experienced this before.
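For reference, the two-table plan described above might look roughly like the following sketch. The table and column names are placeholders I made up, not from the original post, and this assumes the SequenceFile can actually be read by Hive:

```sql
-- Hypothetical sketch of the two-table plan; names are placeholders.
-- Staging table over the exported SequenceFile data.
CREATE TABLE staging_seq (rowkey STRING, val STRING)
STORED AS SEQUENCEFILE;

-- Second table with an explicit comma delimiter, stored as text (CSV).
CREATE TABLE export_csv (rowkey STRING, val STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE;

-- Copy the data across, letting Hive rewrite it as delimited text.
INSERT OVERWRITE TABLE export_csv
SELECT rowkey, val FROM staging_seq;
```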


Super Collaborator

ImmutableBytesWritable is in the hbase-common module.

Please make sure the hbase-common jar (and other required jars) is on the classpath.

Super Collaborator

$ jar tvf hbase-common- | grep ImmutableBytesWritable

5439 Wed Dec 16 03:41:54 PST 2015 org/apache/hadoop/hbase/io/ImmutableBytesWritable.class

1409 Wed Dec 16 03:41:54 PST 2015 org/apache/hadoop/hbase/io/ImmutableBytesWritable$Comparator.class

Super Collaborator

It's giving me the same output:

jar tvf hbase-common- | grep ImmutableBytesWritable

5439 Wed Feb 10 07:01:04 UTC 2016 org/apache/hadoop/hbase/io/ImmutableBytesWritable.class

1409 Wed Feb 10 07:01:04 UTC 2016 org/apache/hadoop/hbase/io/ImmutableBytesWritable$Comparator.class

Super Collaborator

Other jars that may be needed:




Super Collaborator

They are all present there.

You can try using hive.aux.jars.path to tell Hive to add the hbase-common jar to the classpath. In your Hive script, before the rest of your logic:

set hive.aux.jars.path=/usr/hdp/current/hbase-client/lib/hbase-common.jar;

You can also provide the --hiveconf argument on the command line:

hive --hiveconf hive.aux.jars.path=/usr/hdp/current/hbase-client/lib/hbase-common.jar ...

Super Collaborator

Hi @Josh Elser,

Tried that; now I am getting a different exception:

Failed with exception Wrong file format. Please check the file's format.

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask
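One way to sanity-check what Hive is complaining about here: every Hadoop SequenceFile begins with the 3-byte ASCII magic "SEQ", so peeking at the first bytes of the exported file tells you whether it really is one. The path below is an example from a typical Export run, not from the original post; adjust it to your own output directory.

```shell
# Peek at the first bytes of the exported file (path is an example).
# A Hadoop SequenceFile always starts with the 3-byte magic "SEQ".
hdfs dfs -cat /user/root/hbase-export/part-m-00000 | head -c 3
# "SEQ" means a genuine SequenceFile; anything else means the file is
# not one, which would explain Hive's "Wrong file format" check failing.
```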

Super Collaborator

Ran describe formatted on the table name in Hive and got the following output:

# Storage Information

SerDe Library: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

InputFormat: org.apache.hadoop.mapred.SequenceFileInputFormat


It sounds like you would need to tell Hive to use org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat instead of its own (maybe via hive.input.format, I'm not 100% sure), but even so, Hive cannot natively read HBase sequence files. You should probably try to export your HBase table as CSV instead of sequence files, since Hive can read CSV natively. Otherwise you'll have to convert HBase's SequenceFile (whose records are ImmutableBytesWritable keys mapped to Result values) into something Hive can consume.
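One commonly documented way to get CSV out of HBase via Hive, sidestepping the SequenceFile entirely, is to map the live HBase table into Hive with the HBaseStorageHandler and then write it out as delimited text. This is a sketch under assumptions: the table name 'mytable', column family 'cf', and qualifier 'col' are placeholders for your actual schema.

```sql
-- Hedged sketch: map the live HBase table into Hive, then export as CSV.
-- 'mytable', 'cf', and 'col' are placeholders, not from this thread.
CREATE EXTERNAL TABLE hbase_mytable (rowkey STRING, col STRING)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf:col')
TBLPROPERTIES ('hbase.table.name' = 'mytable');

-- Comma-delimited text table that Hive (and anything else) can read.
CREATE TABLE mytable_csv (rowkey STRING, col STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE;

INSERT OVERWRITE TABLE mytable_csv
SELECT rowkey, col FROM hbase_mytable;
```

This avoids dealing with the ImmutableBytesWritable/Result encoding altogether, at the cost of reading through HBase rather than the exported files.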