
Permission denied when importing from Oracle to Hive using Sqoop

New Contributor

Hi all,

 

I am trying to import a table from Oracle to Hive using Sqoop. The following is my script:

sqoop import \
--connect jdbc:oracle:thin:@//myhost:1521/myservice \
--username myuser \
--password mypassword \
--table mytable \
-m 1 \
--hive-overwrite \
--hive-import \
--hive-drop-import-delims \
--hive-table mytable \
--hive-database mydb \
--target-dir /data/test/mytable

I am running it as the user 'test', and I got the following error:

 

18/09/10 10:11:12 WARN security.UserGroupInformation: PriviledgedActionException as:test@TEST.COM (auth:KERBEROS) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=test, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x

 

So it is saying that the user 'test' doesn't have write permission under /user, which is true: the owner of the /user directory is hdfs.

 

hadoop fs -ls /
drwxr-xr-x - hdfs supergroup 0 2018-09-10 11:09 /user

 

But I have created a directory /user/test for the user 'test'.

hadoop fs -ls /user
drwxr-xr-x - test test 0 2018-09-10 11:09 /user/test

And in my script, I set the target dir to

/data/test/mytable

My question is: why is Sqoop trying to write to inode="/user" rather than to my target dir or to the user's home directory /user/test? How can I change this?

 

Thank you.

2 REPLIES

Re: Permission denied when importing from Oracle to Hive using Sqoop

Master Guru
> My question is: why is Sqoop trying to write to inode="/user" rather than to my target dir or to the user's home directory /user/test? How can I change this?

The /user access comes from Sqoop wanting to run an MR2 job on YARN. To run a job, the job's program and associated metadata must first be written to HDFS and then triggered via the ResourceManager. So it's not Sqoop itself trying to use /user for its work; rather, Sqoop's use of MR2 is what's behind it.

The default location a cluster uses for writing these files is of the form /user/username/.staging/job_id. If the filesystem layer finds that /user/username does not exist, it attempts to create it, which usually fails because /user is write-protected for everyone except superusers. If the directory already exists, it skips this step.
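For reference, you can confirm where MR2 jobs stage their files with the standard Hadoop config tool (the property name is real; treating /user as its value is an assumption that matches typical CDH setups):

# Print the staging directory that MR2 jobs write their job files to
# before the ResourceManager launches them.
hdfs getconf -confKey yarn.app.mapreduce.am.staging-dir

# If this prints /user (common on CDH), a job submitted by 'test' stages
# under /user/test/.staging/<job_id> -- hence the WRITE check on /user.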

Note that it's a requirement that you create an HDFS home directory for every user who will run programs on the cluster:
https://www.cloudera.com/documentation/enterprise/latest/topics/cm_sg_s7_prepare_cluster.html#xd_583...
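If the home directory is missing, the usual fix is a short sketch like this, run with HDFS superuser rights (the username 'test' follows this thread; substitute your own):

# Create the user's HDFS home directory and hand ownership to them.
# Run as the HDFS superuser, e.g. via 'sudo -u hdfs ...' on an unsecured
# cluster, or with an hdfs Kerberos ticket on a secured one.
hadoop fs -mkdir /user/test
hadoop fs -chown test:test /user/test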

> But I have created a directory /user/test for the user 'test'.

Does the error repeat after this? If yes, could you share the full stack trace log lines containing the AccessControlException?

Re: Permission denied when importing from Oracle to Hive using Sqoop

New Contributor

Thank you for the reply; it is really helpful!

 

I fixed the problem.

It was caused by a case mismatch: the AD account is 'Test' (upper case), but the Cloudera user account is 'test' (lower case).

So I fixed it by changing the principal in MIT Kerberos from 'Test@EXAMPLE.COM' to 'test@EXAMPLE.COM'.
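For anyone hitting the same mismatch, a minimal sketch of that rename with MIT Kerberos (the admin principal is a placeholder; the other principal names follow this thread):

# Check which principal the client currently holds a ticket for:
klist

# Optionally confirm how Hadoop maps a principal to a short user name:
hadoop org.apache.hadoop.security.HadoopKerberosName Test@EXAMPLE.COM

# Rename the principal to match the lower-case cluster account:
kadmin -p admin/admin@EXAMPLE.COM -q "rename_principal Test@EXAMPLE.COM test@EXAMPLE.COM"

# Re-obtain a ticket as the corrected principal:
kinit test@EXAMPLE.COM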