Failure to insert overwrite to an AWS S3-based external ORC Hive table

I have an internal Hive table that gets backed up to an external table on S3 on a daily basis, using a daily date-based partition.
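For reference, the backup follows the standard pattern of an INSERT OVERWRITE from the internal table into an S3-backed external ORC table with dynamic partitioning; the sketch below uses hypothetical table, column, and bucket names rather than the actual ones from the job:

-- Hypothetical names, for illustration only
CREATE EXTERNAL TABLE IF NOT EXISTS backup_db.events_backup (
  event_id BIGINT,
  payload  STRING
)
PARTITIONED BY (load_date STRING)
STORED AS ORC
LOCATION 's3a://my-backup-bucket/warehouse/events_backup';

-- Daily job overwrites only that day's partition (dynamic partition mode)
INSERT OVERWRITE TABLE backup_db.events_backup PARTITION (load_date)
SELECT event_id, payload, load_date
FROM prod_db.events
WHERE load_date = '${hiveconf:run_date}';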

For one particular day it keeps failing with the S3 error shown in the attachment.

[Attachment: capture.png]

Below are the parameters set in the HQL file:

set hive.execution.engine=mr;
set hive.default.fileformat=Orc;
set hive.exec.orc.default.compress=SNAPPY;
set hive.exec.copyfile.maxsize=1099511627776;
set hive.warehouse.subdir.inherit.perms=false;
set hive.metastore.pre.event.listeners=;
set hive.stats.fetch.partition.stats=false;
set hive.exec.dynamic.partition.mode=nonstrict;
set hive.exec.dynamic.partition=true;
set fs.trash.interval=0;
set fs.s3.buffer.dir=/tmp/s3a;
set fs.s3a.attempts.maximum=50;
set fs.s3a.connection.establish.timeout=120000;
set fs.s3a.connection.timeout=120000;
set fs.s3a.fast.upload=true;
set fs.s3a.fast.upload.buffer=disk;
set fs.s3a.multiobjectdelete.enable=true;
set fs.s3a.max.total.tasks=2000;
set fs.s3a.threads.core=30;
set fs.s3a.threads.max=512;
set fs.s3a.connection.maximum=30;
set fs.s3a.fast.upload.active.blocks=12;
set fs.s3a.threads.keepalivetime=120;
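These settings sit at the top of the HQL file, ahead of the INSERT OVERWRITE statement, and the job is launched roughly as follows (the file name and the date variable are placeholders, not the actual job names):

hive --hiveconf run_date=<YYYY-MM-DD> -f daily_s3_backup.hql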
