
Hive: INSERT OVERWRITE does not work with PutHiveQL


Rising Star

I am not able to run the query below using NiFi.

insert overwrite local directory '/hadoop_shared/xxx/xxx/xxx/' row format delimited fields terminated by '|' null defined as ''
select
xxx;

Error:

Error message:PutHiveQL[id=8e81378c-822c-1a67-8f1d-461f14368aea] Failed to update Hive for StandardFlowFileRecord[uuid=6874a32d-5c58-4454-8963-eaa20e67094b,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1516598821063-18354, container=default, section=946], offset=119486, length=493],offset=0,name=23963691567831492.avro.avro.avro.avro.avro.avro,size=493] due to org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [bdcontour2] does not have [WRITE] privilege on [/hadoop_shared/*]; it is possible that retrying the operation will succeed, so routing to retry: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [bdcontour2] does not have [WRITE] privilege on [/hadoop_shared/*]

1 REPLY

Re: Hive: INSERT OVERWRITE does not work with PutHiveQL

It looks like you have security/access policies in place, most likely with Apache Ranger. Create or modify a policy that grants the user [bdcontour2] the WRITE privilege on the target path (/hadoop_shared/*).
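As an illustrative sketch only (not from this thread): in Ranger, writes from `INSERT OVERWRITE [LOCAL] DIRECTORY` are typically covered by a Hive policy on the `url` resource. If you manage policies through Ranger's public REST API (`POST /service/public/v2/api/policy`), the payload might look roughly like the following. The service name `cl1_hive`, the policy name, and the exact field set are assumptions; check the policy JSON against your Ranger version before using it.

```json
{
  "service": "cl1_hive",
  "name": "hadoop_shared_write",
  "isEnabled": true,
  "resources": {
    "url": { "values": ["/hadoop_shared/*"], "isRecursive": true }
  },
  "policyItems": [
    {
      "users": ["bdcontour2"],
      "accesses": [{ "type": "write", "isAllowed": true }]
    }
  ]
}
```

The same policy can be created in the Ranger Admin UI instead of the REST API; the key point is granting `write` on the URL resource to the user seen in the error message.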
