Permission Denied for user while creating a hive table using sqoop in Oozie workflow
Labels:
- Apache Hadoop
- Apache Hive
- Apache YARN
Created ‎01-20-2016 08:08 PM
Hi,
Environment: HDP 2.3.4, 3 nodes, 32 GB RAM, Red Hat
I am running a Sqoop job via Hue/Oozie. The job imports data from MySQL into Hive.
As I understand it, Sqoop first imports the data from MySQL over JDBC and writes it to HDFS. Sqoop then calls Hive to create a table and moves that data into the Hive warehouse as an internal (managed) table.
So far, the MySQL import completes successfully, but the task that creates the Hive table keeps failing.
Below is a snippet of the error:
26225 [Thread-28] INFO org.apache.sqoop.hive.HiveImport - org.apache.hadoop.security.AccessControlException: Permission denied: user=urep, access=EXECUTE, inode="/tmp/hive/yarn/_tez_session_dir/3007739a-c7fe-4730-a360-ba304646bc3b/hive-hcatalog-core.jar":yarn:hdfs:drwx------
2016-01-20 09:48:21,069 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - org.apache.hadoop.security.AccessControlException: Permission denied: user=urep, access=EXECUTE, inode="/tmp/hive/yarn/_tez_session_dir/3007739a-c7fe-4730-a360-ba304646bc3b/hive-hcatalog-core.jar":yarn:hdfs:drwx------
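The key fields in that message can be picked apart mechanically; here is a quick sketch using sed on the error line copied verbatim from the log above:

```shell
# Pull the key fields out of the AccessControlException line above.
# The error string is copied from the log; the parsing is just a convenience sketch.
err='Permission denied: user=urep, access=EXECUTE, inode="/tmp/hive/yarn/_tez_session_dir/3007739a-c7fe-4730-a360-ba304646bc3b/hive-hcatalog-core.jar":yarn:hdfs:drwx------'

user=$(printf '%s\n' "$err"      | sed -n 's/.*user=\([^,]*\),.*/\1/p')
access=$(printf '%s\n' "$err"    | sed -n 's/.*access=\([^,]*\),.*/\1/p')
inode=$(printf '%s\n' "$err"     | sed -n 's/.*inode="\([^"]*\)".*/\1/p')
ownership=$(printf '%s\n' "$err" | sed -n 's/.*":\(.*\)$/\1/p')

echo "user=$user needs $access on $inode (owner:group:mode = $ownership)"
```

Reading it this way makes the failure concrete: the inode is owned by yarn:hdfs with mode rwx------, so only the yarn user can traverse it, and urep is denied EXECUTE.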
What I have tried so far:
- I gave 777 permissions to /tmp/hive/yarn/_tez_session_dir, but it did not help.
- I also tried setting the Hadoop user environment variable, but that did not work either.
Please help.
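For reference, a MySQL-to-Hive Sqoop import of the kind described above is typically invoked along these lines; the connection string, credentials, and table names here are illustrative assumptions, not taken from the job in question:

```shell
# Hypothetical Sqoop invocation matching the described flow: JDBC import
# from MySQL into HDFS, then creation of a Hive managed table.
sqoop import \
  --connect jdbc:mysql://mysql-host:3306/sourcedb \
  --username dbuser -P \
  --table orders \
  --hive-import \
  --create-hive-table \
  --hive-table default.orders
```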
Created ‎06-07-2016 09:44 PM
I resolved this issue by copying the hive-hcatalog-core.jar file directly into the lib folder of the workflow.
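In practice that means placing the jar next to the workflow definition in HDFS, where Oozie automatically adds everything under lib/ to the action classpath. A rough sketch, assuming the workflow app lives at /user/urep/workflows/sqoop-import and the jar sits in the usual HDP client location (both paths are assumptions for illustration):

```shell
# Copy hive-hcatalog-core.jar from the local HDP install into the
# workflow's lib/ directory on HDFS (paths are illustrative).
hdfs dfs -mkdir -p /user/urep/workflows/sqoop-import/lib
hdfs dfs -put /usr/hdp/current/hive-client/lib/hive-hcatalog-core.jar \
    /user/urep/workflows/sqoop-import/lib/
```

This sidesteps the shared /tmp/hive/yarn/_tez_session_dir copy of the jar, whose restrictive permissions caused the error in the first place.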
Created ‎01-20-2016 08:13 PM
@Vinayak Agrawal try:

```shell
sudo -u hdfs hdfs dfs -chmod -R 777 /tmp/hive/yarn
```
Created on ‎01-20-2016 08:22 PM - edited ‎08-19-2019 04:55 AM
Artem, thanks for your reply. I have applied 777 recursively to /tmp/hive/yarn, which did not help.
Created ‎01-20-2016 08:26 PM
It doesn't look like the write permission for "other" took effect if 777 was applied. Is user urep part of any group, or does it fall under "other"?
Created ‎01-21-2016 06:43 PM
@Vinayak: What are the permissions on your /tmp directory? If they are not 777, please change them to 777.
Created ‎02-02-2016 09:23 PM
@Vinayak Agrawal are you still having problems with this? Can you provide your own solution or accept the best answer?
Created ‎02-02-2016 09:38 PM
@Vinayak Agrawal Are you running the Sqoop job as user yarn? If not, the user running the Sqoop job needs the appropriate permissions, e.g. 777 on /tmp/hive/yarn/_tez_session_dir/xxxxx.
Created ‎05-14-2016 01:54 PM
@Vinayak Agrawal: Mind sharing your solution? I have exactly the same problem.
Created ‎05-14-2016 02:09 PM
Try modifying the workflow.xml to add:

```xml
<exec>${execFile}</exec>
<env-var>HADOOP_USER_NAME=${wf:user()}</env-var>
<file>${execPath}#${execFile}</file>
```
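For context, those elements belong inside an Oozie shell action; they would sit in the workflow roughly as below. The action name, transitions, and surrounding structure here are assumptions for illustration, not taken from the original workflow:

```xml
<action name="sqoop-import">
  <shell xmlns="uri:oozie:shell-action:0.2">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <exec>${execFile}</exec>
    <env-var>HADOOP_USER_NAME=${wf:user()}</env-var>
    <file>${execPath}#${execFile}</file>
  </shell>
  <ok to="end"/>
  <error to="fail"/>
</action>
```

Setting HADOOP_USER_NAME makes the launched script run HDFS operations as the workflow's submitting user rather than yarn, which is the mismatch behind the permission error.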
