
insert from pig into hive bucket

Master Collaborator

Hi: after trying to insert into a bucketed Hive table, I receive this error:

Caused by: org.apache.hive.hcatalog.common.HCatException : 2016 : Error operation not supported : Store into a partition with bucket definition from Pig/Mapreduce is not supported

I am using this:

STORE f INTO 'default.canal_v4' USING org.apache.hive.hcatalog.pig.HCatStorer();

From Pig, how can I insert into a bucketed Hive table? Is there any other option to accomplish this?

thanks

1 ACCEPTED SOLUTION

Super Collaborator

HCatalog does not support writing into a bucketed table. HCat explicitly checks whether a table is bucketed and, if so, disables storing to it, to avoid writing to the table in a destructive way.

From HCatOutputFormat:

if (sd.getBucketCols() != null && !sd.getBucketCols().isEmpty()) {
  throw new HCatException(ErrorType.ERROR_NOT_SUPPORTED,
      "Store into a partition with bucket definition from Pig/Mapreduce is not supported");
}


3 REPLIES

Contributor

@Roberto Sancho - Similar to partitioned tables, bucketed tables cannot be loaded directly; instead, you need to use an INSERT OVERWRITE TABLE ... SELECT ... FROM clause from another table to populate the bucketed table. Also, hive.enforce.bucketing should be set to true, so that the number of reducers does not need to be specified explicitly. Kindly let me know if you find this useful.
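
The approach above can be sketched as follows (the staging and bucketed table names here are hypothetical; the idea is to first STORE the Pig output into a plain, non-bucketed staging table with HCatStorer, then populate the bucketed table from it in Hive):

```sql
-- Enforce bucketing so Hive sets the reducer count to match the
-- table's bucket definition (needed on Hive < 2.0; Hive 2.x
-- always enforces bucketing).
SET hive.enforce.bucketing = true;

-- Populate the bucketed table from the staging table that Pig
-- wrote into via HCatStorer.
INSERT OVERWRITE TABLE canal_v4_bucketed
SELECT * FROM canal_v4_staging;
```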

New Contributor

@Mayank Shekhar

INSERT OVERWRITE TABLE my_bucketted_table SELECT * FROM my_non_bucketed_table; gives this error:

FAILED: SemanticException [Error 10295]: INSERT OVERWRITE not allowed on table with OutputFormat that implements AcidOutputFormat while transaction manager that supports ACID is in use

Any solution for this?
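
One commonly suggested workaround (an assumption on my part, not confirmed in this thread) is that when the ACID transaction manager is in use, INSERT OVERWRITE is rejected on transactional tables while INSERT INTO is still allowed, so a TRUNCATE followed by INSERT INTO can reach the same end state. Table names below are taken from the statement above:

```sql
-- INSERT OVERWRITE is not permitted on a transactional (ACID) table;
-- empty the table first, then append with INSERT INTO instead.
TRUNCATE TABLE my_bucketted_table;
INSERT INTO TABLE my_bucketted_table
SELECT * FROM my_non_bucketed_table;
```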
