I am using Spark SQL and have turned impersonation ON by setting these properties:
hive.server2.enable.doAs=true
spark.jars=/usr/hdp/current/spark-thriftserver/lib/datanucleus-api-jdo-3.2.6.jar,/usr/hdp/current/spark-thriftserver/lib/datanucleus-core-3.2.10.jar,/usr/hdp/current/spark-thriftserver/lib/datanucleus-rdbms-3.2.9.jar
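These are set for the Spark Thrift Server (managed by Ambari in my setup). As a rough sketch, the equivalent manual start and the beeline connection I use look like this (the sbin path, host, and port are placeholders/assumptions based on the HDP layout, not verified exactly):

# Sketch: manual start of the Spark Thrift Server with impersonation enabled
# (normally Ambari manages this; the sbin path is an assumption)
/usr/hdp/current/spark-thriftserver/sbin/start-thriftserver.sh \
  --hiveconf hive.server2.enable.doAs=true \
  --conf spark.jars=/usr/hdp/current/spark-thriftserver/lib/datanucleus-api-jdo-3.2.6.jar,/usr/hdp/current/spark-thriftserver/lib/datanucleus-core-3.2.10.jar,/usr/hdp/current/spark-thriftserver/lib/datanucleus-rdbms-3.2.9.jar

# Connecting as clsadmin from beeline (host/port are placeholders)
beeline -u "jdbc:hive2://<thrift-server-host>:<port>/default" -n clsadmin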
Versions:
Connected to: Spark SQL (version 2.3.0.2.6.5.0-292)
Driver: Hive JDBC (version 1.2.1000.2.6.5.0-292)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 1.2.1000.2.6.5.0-292 by Apache Hive
After that I can create a table and even run a LOAD command, and it works.
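For context, the steps that succeed look roughly like this (the schemas and the source path are illustrative, not my exact DDL):

-- Sketch of the steps that work under impersonation (illustrative schemas/paths)
CREATE TABLE temp_drivers (col_value STRING);
LOAD DATA INPATH '/tmp/drivers.csv' OVERWRITE INTO TABLE temp_drivers;
-- target table for the failing insert below
CREATE TABLE t2 (c INT);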
But when I try to run:

insert overwrite table t2
select 1
from temp_drivers;

it fails with an access error. The log shows:
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=hive, access=WRITE, inode="/apps/hive/warehouse/temp_drivers/.hive-staging_hive_2018-07-11_09-12-26_095_5506831009661495310-7/-ext-10000/_temporary/0/_temporary/attempt_20180711091226_0006_m_000000_0/part-00000-ac36f69b-b3af-48b6-bd3a-318efb2ad0fb-c000":clsadmin:hadoop:drwxr-xr-x
clsadmin is the user who is invoking the commands from the beeline command prompt.
Also, please confirm: when impersonation is ON, what should the ACLs on the table directory and its files look like? In my case they appear like this:
hadoop fs -ls /apps/hive/warehouse
Found 1 items
drwxrwxrwx - hive hadoop 0 2018-07-11 10:02 /apps/hive/warehouse/temp_drivers
[root@chs-xrv-474-mn001 log]# hadoop fs -ls /apps/hive/warehouse/temp_drivers
Found 1 items
-rwxrwxrwx 3 clsadmin biusers 2043 2018-07-11 04:58 /apps/hive/warehouse/temp_drivers/drivers.csv
To me it seems impersonation is not being applied properly, since /apps/hive/warehouse/temp_drivers has hive as its owner.
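If it helps, I can also pull the extended ACLs on the table directory, e.g. (assuming standard HDFS ACL tooling is enabled on the cluster):

# Inspect ownership and any extended ACLs on the table directory
hdfs dfs -ls -d /apps/hive/warehouse/temp_drivers
hdfs dfs -getfacl /apps/hive/warehouse/temp_drivers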