
Not able to Import or CopyTable in HBase

Contributor

Hi All,

 

This is my first post here. I need help importing or copying a table in HBase.

 

We have a table called EMPLOYEE in the default namespace, and I want to copy or import its data into the PROD:TEST_EMPLOYEE table. I tried the commands below, but they failed with the errors shown.

 

We are using CDH 5.8.0, and the PROD:TEST_EMPLOYEE table was created using Apache Phoenix.

 

sudo -u hbase hbase -Dhbase.import.version=1.2 org.apache.hadoop.hbase.mapreduce.Import PROD:TEST_EMPLOYEE /user/hbase/EMPLOYEE
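
The dump under /user/hbase/EMPLOYEE was produced earlier with the standard Export utility, along these lines (exact options approximate):

sudo -u hbase hbase org.apache.hadoop.hbase.mapreduce.Export EMPLOYEE /user/hbase/EMPLOYEE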

 

Import Error:

18/01/22 19:43:47 INFO mapreduce.Job: Running job: job_1516623205824_0008

18/01/22 19:43:56 INFO mapreduce.Job: Job job_1516623205824_0008 running in uber mode : false

18/01/22 19:43:56 INFO mapreduce.Job:  map 0% reduce 0%

18/01/22 19:44:03 INFO mapreduce.Job: Task Id : attempt_1516623205824_0008_m_000000_0, Status : FAILED

Error: org.apache.hadoop.hbase.client.Put.setClusterIds(Ljava/util/List;)V

18/01/22 19:44:09 INFO mapreduce.Job: Task Id : attempt_1516623205824_0008_m_000000_1, Status : FAILED

Error: org.apache.hadoop.hbase.client.Put.setClusterIds(Ljava/util/List;)V

18/01/22 19:44:16 INFO mapreduce.Job: Task Id : attempt_1516623205824_0008_m_000000_2, Status : FAILED

Error: org.apache.hadoop.hbase.client.Put.setClusterIds(Ljava/util/List;)V

18/01/22 19:44:32 INFO mapreduce.Job:  map 100% reduce 0%

18/01/22 19:44:32 INFO mapreduce.Job: Job job_1516623205824_0008 failed with state FAILED due to: Task failed task_1516623205824_0008_m_000000

Job failed as tasks failed. failedMaps:1 failedReduces:0
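
(The exact CopyTable command was not captured here; a typical invocation for this scenario would look like the following:)

sudo -u hbase hbase org.apache.hadoop.hbase.mapreduce.CopyTable --new.name=PROD:TEST_EMPLOYEE EMPLOYEE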

 

CopyTable Error:

18/01/22 19:54:24 INFO mapreduce.Job: Job job_1516623205824_0009 running in uber mode : false

18/01/22 19:54:24 INFO mapreduce.Job:  map 0% reduce 0%

18/01/22 19:54:35 INFO mapreduce.Job: Task Id : attempt_1516623205824_0009_m_000000_0, Status : FAILED

Error: org.apache.hadoop.hbase.client.Put.setClusterIds(Ljava/util/List;)V

18/01/22 19:54:46 INFO mapreduce.Job: Task Id : attempt_1516623205824_0009_m_000000_1, Status : FAILED

Error: org.apache.hadoop.hbase.client.Put.setClusterIds(Ljava/util/List;)V

18/01/22 19:54:56 INFO mapreduce.Job: Task Id : attempt_1516623205824_0009_m_000000_2, Status : FAILED

Error: org.apache.hadoop.hbase.client.Put.setClusterIds(Ljava/util/List;)V

18/01/22 19:55:04 INFO mapreduce.Job:  map 100% reduce 0%

18/01/22 19:55:04 INFO mapreduce.Job: Job job_1516623205824_0009 failed with state FAILED due to: Task failed task_1516623205824_0009_m_000000

 

 

Thanks,

Bhavesh

 


3 REPLIES

Rising Star

Are there any coprocessors defined on the Phoenix-created table? If so, are those JARs available on your prod cluster?
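
You can check from the HBase shell; a Phoenix-created table will typically list coprocessor attributes (for example org.apache.phoenix.coprocessor.ScanRegionObserver) in the describe output:

hbase shell
describe 'PROD:TEST_EMPLOYEE'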

Contributor

Hi RobertM,

 

Thank you very much for looking into this issue.

 

Yes, all of those JARs are available in the PROD cluster. It's not a different cluster; I am trying to import/copy within the same cluster, just into a different table that has the same structure (the same column families, fields, etc.).

 

I have tried with both types of tables: 1) a table mapped with Phoenix and 2) a table not mapped with Phoenix. The result is the same either way: Export works fine, but Import/CopyTable throws the errors above.

 

Thanks,

Bhavesh Vadaliya

Contributor (Accepted Solution)

 

I resolved this issue a while ago.

 

I had to copy the hbase-client JARs into MapReduce's lib directory, which resolved the issue.
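
In case it helps others: the bare method descriptor in the task output is how a NoSuchMethodError surfaces in MapReduce task logs, which means the task JVMs were loading an older hbase-client JAR whose Put class lacks setClusterIds(List). On a CDH parcel install the fix amounts to something like the following (the paths are assumptions; verify them on your cluster):

sudo cp /opt/cloudera/parcels/CDH/lib/hbase/lib/hbase-client-*.jar /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/lib/

After that, re-run the Import/CopyTable job; no service restart should be needed, since each MapReduce task starts a fresh JVM.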

 

Thanks,

Bhavesh