Hive import in Sqoop job - error=13, Permission denied

Rising Star

I've got a problem running a Sqoop job in Oozie using Hue. I have a 4-node cluster based on Hortonworks HDP.

My Sqoop job looks like below:

import 
--options-file ./dss_conn_parms.txt 
--table BD.BD_TABLE
--target-dir /user/user_1/DMS
--m 1 
--hive-import 
--hive-table BD.BD_TABLE_HIVE
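
The connection parameters sit in dss_conn_parms.txt; its real contents are omitted here, but it is an ordinary Sqoop options file along these lines (connect string, username, and password-file path below are placeholders only):

--connect
jdbc:oracle:thin:@//db-host:1521/ORCLSERVICE
--username
BD_USER
--password-file
/user/user_1/.sqoop.password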

The data from the Oracle database was successfully downloaded and inserted into HDFS. Unfortunately, the Hive import doesn't work. The error is related to permissions:

73167 [main] INFO  org.apache.sqoop.hive.HiveImport  - Loading uploaded data into Hive
2016-10-17 09:42:55,203 INFO  [main] hive.HiveImport (HiveImport.java:importTable(195)) - Loading uploaded data into Hive
73180 [main] ERROR org.apache.sqoop.tool.ImportTool  - Encountered IOException running import job: java.io.IOException: Cannot run program "hive": error=13, Permission denied
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
at java.lang.Runtime.exec(Runtime.java:620)
at java.lang.Runtime.exec(Runtime.java:528)
at org.apache.sqoop.util.Executor.exec(Executor.java:76)
at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:391)
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:344)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:245)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:514)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:197)
at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:177)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:46)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:241)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.io.IOException: error=13, Permission denied
at java.lang.UNIXProcess.forkAndExec(Native Method)
at java.lang.UNIXProcess.<init>(UNIXProcess.java:248)
at java.lang.ProcessImpl.start(ProcessImpl.java:134)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
... 31 more

2016-10-17 09:42:55,216 ERROR [main] tool.ImportTool (ImportTool.java:run(613)) - Encountered IOException running import job: java.io.IOException: Cannot run program "hive": error=13, Permission denied

Intercepting System.exit(1)

<<< Invocation of Main class completed <<<

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]


I executed the Sqoop job with the Hive import options from the command line, and now I know what the problem is. On the command line I can see this:

Logging initialized using configuration in jar:file:/usr/hdp/2.4.2.0-258/hive/lib/hive-common-1.2.1000.2.4.2.0-jar!/hive-log4j.properties
OK

The problem is with access to hive-common-1.2.1000.2.4.2.0-jar, which is located on the local file system. Any idea what I should do?
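
For reference, since the message is Cannot run program "hive", these are the kinds of local-filesystem checks that show whether the launcher user can execute the Hive client and read that jar on a worker node (standard HDP 2.4 paths; adjust for your layout):

ls -l /usr/bin/hive /usr/hdp/current/hive-client/bin/hive
ls -ld /usr/hdp/2.4.2.0-258/hive /usr/hdp/2.4.2.0-258/hive/lib
ls -l /usr/hdp/2.4.2.0-258/hive/lib/hive-common-*.jar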

5 REPLIES

Re: Hive import in Sqoop job - error=13, Permission denied

Guru

It looks like the permissions on the directory holding the Hive table do not allow writes from Sqoop. Find the directory of the Hive table and either open write permissions to all (hdfs dfs -chmod -R ...) or change the owner to the user running Sqoop (hdfs dfs -chown -R ...).
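
A minimal sketch of both options, assuming the table sits under the default HDP warehouse path /apps/hive/warehouse/bd.db/bd_table_hive and the job is submitted as user_1 (both are assumptions, adjust to your cluster; use one command or the other):

hdfs dfs -chmod -R 777 /apps/hive/warehouse/bd.db/bd_table_hive
hdfs dfs -chown -R user_1:hdfs /apps/hive/warehouse/bd.db/bd_table_hive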

Re: Hive import in Sqoop job - error=13, Permission denied

Cloudera Employee

Is it failing all the time?

Have you tried adding the option --hive-home "<path to hive>"?
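
For example, on HDP the Hive client home is usually /usr/hdp/current/hive-client; this is only a sketch of the job from the question with that option added, so verify the path on your nodes:

import
--options-file ./dss_conn_parms.txt
--table BD.BD_TABLE
--target-dir /user/user_1/DMS
--m 1
--hive-import
--hive-home /usr/hdp/current/hive-client
--hive-table BD.BD_TABLE_HIVE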

Re: Hive import in Sqoop job - error=13, Permission denied

New Contributor

Try adding --hive-home and --create-hive-table to your parameters.
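
In the job from the question that means two extra lines, for example (the --hive-home path is an HDP assumption and should be verified):

--hive-home /usr/hdp/current/hive-client
--create-hive-table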

Re: Hive import in Sqoop job - error=13, Permission denied

Expert Contributor

Are you using Ranger, and have you set any policies? If so, you might have to check the policies to make sure they are not conflicting with the Oozie user. If you have not set up Ranger policies, then updating ACLs and file-level permissions could solve your problem.
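
If Ranger is not involved, a sketch of granting the submitting user access via HDFS ACLs rather than chmod/chown (the path and user name are assumptions, and dfs.namenode.acls.enabled must be true for ACLs to work):

hdfs dfs -setfacl -R -m user:user_1:rwx /apps/hive/warehouse/bd.db/bd_table_hive
hdfs dfs -getfacl /apps/hive/warehouse/bd.db/bd_table_hive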

Re: Hive import in Sqoop job - error=13, Permission denied

New Contributor

On my environment (HDP 2.6.2) the issue was that Sqoop and Hive ran as different users (the submitting user and "yarn", respectively).
As such, Hive didn't have write access to the sqooped files to delete them after the Hive import.

The only resolution I could find was giving yarn superuser rights on HDFS.

As hdfs admin:

hdfs dfs -mkdir /user/yarn
hdfs dfs -chown yarn /user/yarn

As a sudoer to root:

sudo useradd -G hdfs,hadoop yarn
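
To verify the group membership is picked up on the HDFS side (on HDP the superuser group is usually "hdfs", but it is whatever dfs.permissions.superusergroup is set to):

id yarn
hdfs groups yarn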
