
My question is about sqoop export between hbase and mysql!


New Contributor

I want to export data from HBase into a MySQL database, so I am using Sqoop for this job.

First I export the HBase table into an HDFS folder. The command is "hbase org.apache.hadoop.hbase.mapreduce.Driver export h_small /tmp/dfsmall".

The next command is "sqoop export --connect jdbc:mysql://192.168.0.132:3306/test --username apps --password apps --table from_h_page_template --export-dir /tmp/dfsmall --input-fields-terminated-by '\001' --columns id,name,pid".
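To confirm the first step worked, the export directory can be listed before running Sqoop (a quick sketch; "hdfs dfs -ls" is the standard listing command, and /tmp/dfsmall is the path from the export above):

hdfs dfs -ls /tmp/dfsmall    # the SequenceFile part files written by the HBase export should show up here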

Now an exception is thrown: "java.lang.RuntimeException: java.io.IOException: WritableName can't load class: org.apache.hadoop.hbase.io.ImmutableBytesWritable".

What is wrong? My CDH version is 5.1.

4 REPLIES

Re: My question is about sqoop export between hbase and mysql!

Expert Contributor
This looks like the HBase jars are missing. Could you check /usr/lib/hbase or /opt/cloudera/parcels/<CDH parcel>/lib/hbase to make sure you have an hbase*.jar that contains 'ImmutableBytesWritable'? If you do, then make sure HBASE_HOME is properly set.
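For example, a quick check along these lines should confirm whether the class is present (just a sketch; adjust the jar paths to your install):

for jar in /usr/lib/hbase/hbase*.jar /usr/lib/hbase/lib/*.jar; do
  # list each jar's contents and look for the class file
  if [ -f "$jar" ] && unzip -l "$jar" 2>/dev/null | grep -q 'ImmutableBytesWritable.class'; then
    echo "found in $jar"
  fi
done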

Re: My question is about sqoop export between hbase and mysql!

New Contributor

Yes, I can find hbase-common-0.96.1.1-cdh5.0.3.jar in /usr/lib/hbase and /usr/lib/hbase/lib. I installed CDH5 with "cloudera-manager-installer.bin". I didn't set HBASE_HOME before, so in which file should I set it?


Re: My question is about sqoop export between hbase and mysql!

Expert Contributor
HBASE_HOME can be set in your .bashrc file.

If that doesn't help, could you provide a full stack trace? You can get one by adding --verbose at the beginning of the Sqoop command.
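For example (a sketch; /usr/lib/hbase assumes the package install path you found, and the remaining arguments are the ones from your original command):

# in ~/.bashrc
export HBASE_HOME=/usr/lib/hbase

# re-run with --verbose at the beginning of the command
sqoop export --verbose \
  --connect jdbc:mysql://192.168.0.132:3306/test \
  --username apps --password apps \
  --table from_h_page_template \
  --export-dir /tmp/dfsmall \
  --input-fields-terminated-by '\001' \
  --columns id,name,pid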

Re: My question is about sqoop export between hbase and mysql!

New Contributor

On my system there is only one bashrc file, under /etc, so I created a new .sh file in /etc/profile.d with "export HBASE_HOME=/usr/lib/hbase". I'm not sure this is the right thing to do, but /etc/hbase contains only some config files.
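The file I created looks like this (the file name is my own choice; the path is the one above):

# /etc/profile.d/hbase.sh
export HBASE_HOME=/usr/lib/hbase

source /etc/profile.d/hbase.sh    # or log out and back in
echo $HBASE_HOME                  # prints /usr/lib/hbase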

I added --verbose to the sqoop command, and it shows the following exception:

14/11/13 10:09:11 INFO mapreduce.Job: map 0% reduce 0%
14/11/13 10:09:15 INFO mapreduce.Job: Task Id : attempt_1415593832420_0037_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: java.io.IOException: WritableName can't load class: org.apache.hadoop.hbase.io.ImmutableBytesWritable
    at org.apache.hadoop.io.SequenceFile$Reader.getKeyClass(SequenceFile.java:2013)
    at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1944)
    at org.apache.hadoop.io.SequenceFile$Reader.initialize(SequenceFile.java:1810)
    at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1759)
    at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1773)
    at org.apache.hadoop.mapreduce.lib.input.SequenceFileRecordReader.initialize(SequenceFileRecordReader.java:54)
    at org.apache.sqoop.mapreduce.CombineShimRecordReader.initialize(CombineShimRecordReader.java:76)
    at org.apache.sqoop.mapreduce.CombineFileRecordReader.initialize(CombineFileRecordReader.java:64)
    at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:525)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.io.IOException: WritableName can't load class: org.apache.hadoop.hbase.io.ImmutableBytesWritable
    at org.apache.hadoop.io.WritableName.getClass(WritableName.java:77)
    at org.apache.hadoop.io.SequenceFile$Reader.getKeyClass(SequenceFile.java:2011)
    ... 15 more
Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.hbase.io.ImmutableBytesWritable not found
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1801)
    at org.apache.hadoop.io.WritableName.getClass(WritableName.java:75)
    ... 16 more
14/11/13 10:09:23 INFO mapreduce.Job: Task Id : attempt_1415593832420_0037_m_000002_0, Status : FAILED
Error: java.lang.RuntimeException: java.io.IOException: WritableName can't load class: org.apache.hadoop.hbase.io.ImmutableBytesWritable
    at org.apache.hadoop.io.SequenceFile$Reader.getKeyClass(SequenceFile.java:2013)
