
insert from pig into hive bucket

Solved


Super Collaborator

Hi: when I try to insert into a bucketed Hive table, I receive this error:

Caused by: org.apache.hive.hcatalog.common.HCatException : 2016 : Error operation not supported : Store into a partition with bucket definition from Pig/Mapreduce is not supported

I am using this :

STORE f INTO 'default.canal_v4' USING org.apache.hive.hcatalog.pig.HCatStorer();

From Pig, how can I insert into a bucketed Hive table? Is there any other option to accomplish this?

thanks

1 ACCEPTED SOLUTION

Accepted Solutions

Re: insert from pig into hive bucket

Expert Contributor

HCatalog does not support writing into a bucketed table. HCat explicitly checks whether a table is bucketed and, if so, disables storing to it, to avoid writing to the table in a destructive way.

From HCatOutputFormat:

if (sd.getBucketCols() != null && !sd.getBucketCols().isEmpty()) {
  throw new HCatException(ErrorType.ERROR_NOT_SUPPORTED,
      "Store into a partition with bucket definition from Pig/Mapreduce is not supported");
}
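One common workaround, sketched below: have Pig store into a non-bucketed staging table, then let Hive copy the rows into the bucketed table. The staging table name `default.canal_v4_staging` is hypothetical; it must have the same schema as the target table.

```sql
-- Pig: store into a non-bucketed staging table instead of the bucketed one
-- STORE f INTO 'default.canal_v4_staging' USING org.apache.hive.hcatalog.pig.HCatStorer();

-- Hive: move the data from staging into the bucketed table
SET hive.enforce.bucketing = true;
INSERT INTO TABLE default.canal_v4
SELECT * FROM default.canal_v4_staging;
```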

Re: insert from pig into hive bucket

New Contributor

@Roberto Sancho: similar to partitioned tables, bucketed tables cannot be loaded directly; instead, we need to use an INSERT OVERWRITE TABLE ... SELECT ... FROM clause to populate the bucketed table from another table. Also, hive.enforce.bucketing should be set to true, so that the number of reducers does not need to be specified explicitly. Kindly let me know if you find this useful.
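The steps described above can be sketched in HiveQL as follows (both table names are hypothetical placeholders):

```sql
-- Let Hive derive the reducer count from the table's bucket definition
SET hive.enforce.bucketing = true;

-- Populate the bucketed table from a non-bucketed source table
INSERT OVERWRITE TABLE my_bucketed_table
SELECT * FROM my_source_table;
```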

Re: insert from pig into hive bucket

New Contributor

@Mayank Shekhar

INSERT OVERWRITE TABLE my_bucketted_table SELECT * FROM my_non_bucketed_table; gives this error

FAILED: SemanticException [Error 10295]: INSERT OVERWRITE not allowed on table with OutputFormat that implements AcidOutputFormat while transaction manager that supports ACID is in use

any solution for this ?
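A minimal sketch of one commonly suggested workaround, assuming the target is a transactional (ACID) table: transactional tables reject INSERT OVERWRITE while an ACID transaction manager is in use, but they do accept INSERT INTO, which appends rows (table names hypothetical):

```sql
-- INSERT INTO is permitted on transactional tables; note it appends
-- rather than replacing the existing contents
INSERT INTO TABLE my_bucketed_table
SELECT * FROM my_non_bucketed_table;
```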
