09-09-2018 04:44 PM - last edited on 09-10-2018 11:22 AM by cjervis
I am trying to import a table from Oracle to Hive using Sqoop. The following is my script:
sqoop import \
  --connect jdbc:oracle:thin:@//myhost:1521/myservice \
  --username myuser \
  --password mypassword \
  --table mytable \
  -m 1 \
  --hive-import \
  --hive-overwrite \
  --hive-drop-import-delims \
  --hive-table mytable \
  --hive-database mydb \
  --target-dir /data/test/mytable
I am running it as the user 'test', and I got the following error:
18/09/10 10:11:12 WARN security.UserGroupInformation: PriviledgedActionException as:test@TEST.COM (auth:KERBEROS) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=test, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
So it is saying that the user 'test' doesn't have write permission under /user, which is true: the owner of the /user directory is hdfs.
hadoop fs -ls /
drwxr-xr-x   - hdfs supergroup          0 2018-09-10 11:09 /user
But I have created a directory /user/test for the user 'test'.
hadoop fs -ls /user
drwxr-xr-x - test test 0 2018-09-10 11:09 /user/test
And in my script, I set the target dir to /data/test/mytable.
My question is: why is Sqoop trying to write to inode="/user" rather than to my target dir or to the user's home directory /user/test? How can I change this?
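For context, when --target-dir is omitted (or not honored for the staging step), Sqoop resolves the import directory relative to the calling user's HDFS home, so a missing or unwritable /user/&lt;name&gt; produces exactly this AccessControlException. A minimal sketch of that default path, assuming the usual /user/&lt;username&gt;/&lt;table&gt; convention (the variable values here are just the ones from this thread):

```shell
# Hedged sketch: without an absolute --target-dir, Sqoop resolves the
# import directory relative to the caller's HDFS home directory.
user="test"        # HDFS short name derived from the Kerberos principal
table="mytable"
staging_dir="/user/${user}/${table}"
echo "$staging_dir"
```

If the short name Hadoop derives from the Kerberos principal does not match an existing /user/&lt;name&gt; directory, the write falls through to /user itself and fails with the permission error above.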
09-10-2018 02:34 PM
Thank you for the reply, it is really helpful!
I fixed the problem: the AD account is 'Test' (upper case), but the Cloudera user account is 'test' (lower case). Changing the principal in MIT Kerberos from 'Test@EXAMPLE.COM' to 'test@EXAMPLE.COM' resolved it.
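For reference, Kerberos principals are case-sensitive, so 'Test@EXAMPLE.COM' and 'test@EXAMPLE.COM' are different identities, and by default the short name Hadoop derives keeps the principal's original case. An alternative to recreating the principal is to lowercase short names in Hadoop's hadoop.security.auth_to_local rules (the /L flag, where supported). The snippet below is only an illustration of that transformation, not a cluster command; the principal value is the hypothetical one from this thread:

```shell
# Hypothetical principal matching the AD account's upper-case spelling
principal="Test@EXAMPLE.COM"
# Strip the realm to get the short name, as Hadoop's auth_to_local does
short_name="${principal%@*}"
# Lowercase it, mirroring what an auth_to_local rule with /L would produce
echo "$short_name" | tr '[:upper:]' '[:lower:]'
```

With the short name normalized to 'test', the HDFS home directory /user/test matches and the permission error goes away.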