Sqoop export of a transactional Hive table with bucketing to SQL Server
Labels:
- Apache Hive
- Apache Sqoop
New Contributor
Created 04-02-2018 07:39 AM
I am not able to export a transactional Hive table to SQL Server.
I am getting the error: Store into a partition with bucket definition from Pig/Mapreduce is not supported
Below is the structure of the Hive table:
create table if not exists hive_part1
(id int, name string, age int)
partitioned by (gender string)
CLUSTERED BY(id) INTO 3 BUCKETS
STORED AS ORC TBLPROPERTIES ('transactional'='true');
insert into table hive_part1 partition(gender) select * from temp_table;
--temp_table has data
I am able to export an ORC table and a partitioned table (but I need to use a bucketed table since I need to update the table).
This is what I am trying:
sqoop export --connect 'jdbc:sqlserver://XXXX;database=xxx' --username xxxx --password xxx --table hive_part --hcatalog-database default --hcatalog-table hive_part --hive-partition-key gender --hive-partition-value 'male' -m 1
The above command works if the hive_part table is just ORC and partitioned.
1 REPLY
Guru
Created 04-02-2018 08:47 AM
@avinash nishanth
Sqoop with HCatalog does not work with bucketed Hive tables, as this is not supported yet (import or export).
For the export, you might have to load the data from the bucketed table into a non-bucketed table and then do a Sqoop export of the non-bucketed table, as sketched below.
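For example, a minimal sketch of that workaround, assuming the hive_part1 table from the question and a hypothetical non-bucketed staging table named hive_part1_export (the SQL Server target table and connection details are carried over from the command in the question):
-- hypothetical non-bucketed, non-transactional staging table with the same columns as hive_part1
create table if not exists hive_part1_export
(id int, name string, age int)
partitioned by (gender string)
STORED AS ORC;
-- dynamic partition insert; may require: set hive.exec.dynamic.partition.mode=nonstrict;
insert overwrite table hive_part1_export partition(gender)
select id, name, age, gender from hive_part1;
Then run the same export command as above, pointed at the staging table:
sqoop export --connect 'jdbc:sqlserver://XXXX;database=xxx' --username xxxx --password xxx --table hive_part --hcatalog-database default --hcatalog-table hive_part1_export --hive-partition-key gender --hive-partition-value 'male' -m 1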
Reference KB Link.
