Thank you for the reply, it was really helpful! I fixed the problem: the AD account is 'Test' (upper case), but the Cloudera user account is 'test' (lower case). I resolved it by changing the principal in MIT Kerberos from 'Test@EXAMPLE.COM' to 'test@EXAMPLE.COM'.
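For anyone hitting the same case-mismatch, a sketch of how the rename can be done with kadmin's rename_principal, plus a quick check of how Hadoop maps the principal to a short user name. The admin principal `admin/admin` is a placeholder for whatever admin credentials your KDC uses:

```shell
# Sketch: rename the MIT Kerberos principal so its case matches the
# Cloudera/Linux account name ('test'). Run with your KDC admin principal.
kadmin -p admin/admin -q "rename_principal Test@EXAMPLE.COM test@EXAMPLE.COM"

# Check which short user name Hadoop derives from the principal
# (this is what shows up as user= in the AccessControlException):
hadoop org.apache.hadoop.security.HadoopKerberosName test@EXAMPLE.COM
```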
I am trying to import a table from Oracle to Hive using Sqoop. The following is my script:
sqoop import \
--connect jdbc:oracle:thin:@//myhost:1521/myservice \
--password mypassword \
--table mytable \
-m 1 \
--hive-import \
--hive-table mytable
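For reference: when no --target-dir is given, Sqoop first stages the imported data under the running user's HDFS home directory (e.g. /user/test/mytable) before loading it into Hive, which is why write access there matters. A sketch with the staging directory made explicit (the path is a placeholder):

```shell
# Sketch: same import, but with an explicit HDFS staging directory so the
# write location does not depend on the user's home directory existing.
sqoop import \
  --connect jdbc:oracle:thin:@//myhost:1521/myservice \
  --password mypassword \
  --table mytable \
  -m 1 \
  --hive-import \
  --hive-table mytable \
  --target-dir /user/test/staging/mytable
```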
I am using the user 'test', and I got the following error:
18/09/10 10:11:12 WARN security.UserGroupInformation: PriviledgedActionException as:test@TEST.COM (auth:KERBEROS) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=test, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
So it is saying that the user 'test' doesn't have write permission under /user, which is true. The owner of the /user directory is hdfs.
hadoop fs -ls /
drwxr-xr-x - hdfs supergroup 0 2018-09-10 11:09 /user
But I have created a directory /user/test for the user 'test'.
hadoop fs -ls /user
drwxr-xr-x - test test 0 2018-09-10 11:09 /user/test
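For completeness, this is the usual way such a home directory gets created: the HDFS superuser makes it and hands ownership to the user (a sketch, assuming sudo access to the hdfs account):

```shell
# Create the user's HDFS home directory as the HDFS superuser
# and transfer ownership to 'test'.
sudo -u hdfs hadoop fs -mkdir -p /user/test
sudo -u hdfs hadoop fs -chown test:test /user/test

# Verify: /user/test should now be listed with owner 'test'.
hadoop fs -ls /user
```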
And in my script, I set the target dir to
My question is: why is Sqoop trying to write to inode="/user" rather than to my target dir or to the user's home directory /user/test? How can I change this?