I am not able to export a transactional Hive table to SQL Server.
I am getting this error: "Store into a partition with bucket definition from Pig/Mapreduce is not supported".
Below is the structure of the Hive table:
CREATE TABLE IF NOT EXISTS hive_part1
(id INT, name STRING, age INT)
PARTITIONED BY (gender STRING)
CLUSTERED BY (id) INTO 3 BUCKETS
STORED AS ORC TBLPROPERTIES ('transactional'='true');
INSERT INTO TABLE hive_part1 PARTITION(gender) SELECT * FROM temp_table;
-- temp_table has data
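For completeness: the insert above uses dynamic partitioning, which Hive only accepts when these session settings are in place (assuming they are not already set at the cluster level):

-- enable dynamic partitioning; nonstrict mode is needed because the
-- gender partition column is fully dynamic in the insert above
set hive.exec.dynamic.partition=true;
set hive.exec.dynamic.partition.mode=nonstrict;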
I am able to export a plain ORC table and a partitioned table (but I need a bucketed table, since I need to run updates against it).
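For context, this is the kind of update the transactional, bucketed table is meant to support (a minimal sketch; the actual predicate and values differ):

-- ACID updates need the transaction manager enabled on the client,
-- assuming it is not already configured server-side
set hive.support.concurrency=true;
set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
UPDATE hive_part1 SET age = 30 WHERE id = 1;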
This is what I am trying:
sqoop export --connect 'jdbc:sqlserver://XXXX;database=xxx' --username xxxx \
  --password xxx --table hive_part --hcatalog-database default --hcatalog-table hive_part \
  --hive-partition-key gender --hive-partition-value 'male' -m 1
The above command works if the hive_part table is just ORC and partitioned, without bucketing.
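For reference, a sketch of what the SQL Server target table would need to look like (the column types here are my assumption; sqoop export just writes into an existing table with a compatible schema):

CREATE TABLE hive_part (
    id INT,
    name VARCHAR(100),
    age INT,
    gender VARCHAR(10)
);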