
Permission Denied for user while creating a hive table using sqoop in Oozie workflow

New Contributor

Hi,

Environment: HDP 2.3.4, 3 nodes, 32 GB RAM, Red Hat

I am running a Sqoop job via Hue/Oozie. The job imports data from MySQL into Hive.

As I understand it, Sqoop first imports the data from MySQL over JDBC and writes it into HDFS.

Then Sqoop calls Hive to create a table and moves that data into the Hive warehouse, creating a Hive internal (managed) table.

So far, the step where Sqoop imports data from MySQL completes successfully, BUT the task that creates the Hive table keeps failing.
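For context, a Sqoop Hive import of this shape is typically invoked as sketched below; the connection URL, database, table, and user names are placeholders, not values from the original post:

```shell
# Hypothetical two-step import: MySQL -> HDFS -> Hive managed table.
# dbhost, mydb, orders and dbuser are placeholders; adjust to your setup.
sqoop import \
  --connect jdbc:mysql://dbhost:3306/mydb \
  --username dbuser -P \
  --table orders \
  --hive-import \
  --create-hive-table \
  --hive-table default.orders
```

The --hive-import step is the second phase described above, and it is the phase that fails in this thread.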

Below is a snippet of the error:

26225 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  - org.apache.hadoop.security.AccessControlException: Permission denied: user=urep, access=EXECUTE, inode="/tmp/hive/yarn/_tez_session_dir/3007739a-c7fe-4730-a360-ba304646bc3b/hive-hcatalog-core.jar":yarn:hdfs:drwx------
2016-01-20 09:48:21,069 INFO  [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - org.apache.hadoop.security.AccessControlException: Permission denied: user=urep, access=EXECUTE, inode="/tmp/hive/yarn/_tez_session_dir/3007739a-c7fe-4730-a360-ba304646bc3b/hive-hcatalog-core.jar":yarn:hdfs:drwx------
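The permission string in the error explains the failure: the Tez session directory entry is owned by yarn:hdfs with mode drwx------, so only the owner has any access, and user urep (assumed here to be neither yarn nor a member of group hdfs) is evaluated against the "other" bits. A sketch of that check:

```shell
# The inode from the error: owner=yarn, group=hdfs, mode=drwx------
mode="drwx------"
owner_bits=${mode:1:3}   # "rwx": only the owner (yarn) has any access
group_bits=${mode:4:3}   # "---": members of group hdfs get nothing
other_bits=${mode:7:3}   # "---": everyone else, including urep, gets nothing
echo "urep falls under 'other': $other_bits => EXECUTE denied"
```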

What I HAVE TRIED so far:

1. https://community.hortonworks.com/questions/8158/permission-denied-for-userhive-on-load-data-inpath....

I have given 777 permissions on /tmp/hive/yarn/_tez_session_dir, but it did not help.

2. https://community.hortonworks.com/questions/7642/oozie-shell-action-permission-denied-userxyz-acces....

I also added the HADOOP_USER_NAME environment variable as suggested there, but that didn't work either.

Please help.

1 ACCEPTED SOLUTION

Contributor

I resolved this issue by copying the hive-hcatalog-core.jar file directly under the lib folder of the workflow.

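For anyone hitting the same error: Oozie automatically puts jars found under the workflow's lib/ directory on the action's classpath, which is why this works. A sketch of the copy, assuming the workflow lives at /user/urep/workflows/sqoop-import on HDFS and an HDP-style local path for the jar (both paths are assumptions; verify on your cluster):

```shell
# Hypothetical workflow location on HDFS; adjust to your deployment.
WF_DIR=/user/urep/workflows/sqoop-import
hdfs dfs -mkdir -p "$WF_DIR/lib"
# Copy the jar from the local HDP install into the workflow's lib folder.
hdfs dfs -put /usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core.jar "$WF_DIR/lib/"
```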

10 REPLIES

Master Mentor

@Vinayak Agrawal try sudo -u hdfs hdfs dfs -chmod -R 777 /tmp/hive/yarn

New Contributor

@Artem, thanks for your reply. I have applied 777 recursively to /tmp/hive/yarn, which did not help.

(attached screenshot: 1480-hivbx.png)

Master Mentor

It doesn't look like the write permission for "other" took effect, if you say 777 was applied. Is user urep part of any group, or does it fall under "other"?

Guru

@Vinayak: What are the permissions on your /tmp directory? If they are not 777, please set them to 777.

Master Mentor

@Vinayak Agrawal Are you still having problems with this? Can you provide your own solution or accept the best answer?

Master Mentor

@Vinayak Agrawal Are you running the Sqoop job as user yarn? If not, the user running the Sqoop job needs the appropriate permissions, e.g. 777 on /tmp/hive/yarn/_tez_session_dir/xxxxx.

Expert Contributor

@Vinayak Agrawal: Mind sharing your solution? I have exactly the same problem.

Master Mentor

@Vinayak Agrawal

Try modifying the workflow.xml and adding:

<exec>${execFile}</exec> 

<env-var>HADOOP_USER_NAME=${wf:user()}</env-var> 
<file>${execPath}#${execFile}</file> 
