Sqoop export a transactional Hive table with bucketing to SQL Server

New Contributor

I am not able to export a transactional Hive table to SQL Server.

Getting the error: Store into a partition with bucket definition from Pig/Mapreduce is not supported

Below is the structure of the Hive table:

create table if not exists hive_part1
(id int, name string, age int)
partitioned by (gender string)
CLUSTERED BY(id) INTO 3 BUCKETS
STORED AS ORC TBLPROPERTIES ('transactional'='true');

insert into table hive_part1 partition(gender) select * from temp_table;
-- temp_table has data
I am able to export an ORC table and a partitioned table (but I need to use a bucketed table, since I need to update the table).
This is what I am trying:
sqoop export --connect 'jdbc:sqlserver://XXXX;database=xxx' \
  --username xxxx --password xxx \
  --table hive_part \
  --hcatalog-database default --hcatalog-table hive_part \
  --hive-partition-key gender --hive-partition-value 'male' \
  -m 1
The above command works if the hive_part table is just ORC and partitioned.

1 REPLY

@avinash nishanth

Sqoop with HCatalog does not work with bucketed Hive tables, as this is not yet supported (for either import or export).

For the export, you might have to load the data from the bucketed table into a non-bucketed table and then do a Sqoop export of that non-bucketed table, as sketched below.
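
A minimal sketch of that workaround, assuming a hypothetical staging table named hive_part1_export with the same columns as hive_part1, stored as ORC but without bucketing or the transactional property (the connection details, target table, and partition flags are copied from the question):

-- Hive: copy the data into a non-bucketed, non-transactional staging table
-- (hive_part1_export is a hypothetical name for illustration)
set hive.exec.dynamic.partition=true;
set hive.exec.dynamic.partition.mode=nonstrict;

create table if not exists hive_part1_export
(id int, name string, age int)
partitioned by (gender string)
STORED AS ORC;

insert overwrite table hive_part1_export partition(gender)
select id, name, age, gender from hive_part1;

Then run the export against the staging table instead of the bucketed one:

sqoop export --connect 'jdbc:sqlserver://XXXX;database=xxx' \
  --username xxxx --password xxx \
  --table hive_part \
  --hcatalog-database default --hcatalog-table hive_part1_export \
  --hive-partition-key gender --hive-partition-value 'male' \
  -m 1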

Reference KB Link.