
Access control exception for hive tables created using pyspark


In Hive I have a table that was created using pyspark. I created the table like below:

df = sqlContext.read.format("jdbc") \
    .option("url", "{}:{}/{}".format(domain, port, mysqldb)) \
    .option("driver", "com.mysql.jdbc.Driver") \
    .option("dbtable", table) \
    .option("user", username) \
    .option("password", password) \
    .load()

df.registerTempTable('mytempTable')

sqlContext.sql("create table {}.`{}` stored as parquet as select * from mytempTable".format(hivedb,table))
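For concreteness, here is what the CTAS statement resolves to with illustrative values (the `hivedb` and `table` values below are placeholders; the real ones come from the script's variables):

```python
# Placeholder values for illustration only.
hivedb = "testing"
table = "table_name"

# Same string formatting as in the script above.
sql = "create table {}.`{}` stored as parquet as select * from mytempTable".format(hivedb, table)
print(sql)
```

The backticks around the table name let Hive accept identifiers that would otherwise clash with reserved words.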


I am able to query the records and I am getting the expected results.

Now I am also able to append to the Hive table like below:

#Get last value from hive table
lastval = sqlContext.sql("select nvl(max(id),0) as maxval from {}.{}".format(hivedb,table)).collect()[0].asDict()['maxval']
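The nvl(max(id),0) guard makes an empty table yield 0 instead of NULL, so the very first incremental run imports every row. In plain Python terms, the logic is roughly:

```python
def last_val(ids):
    # Mirrors select nvl(max(id), 0): fall back to 0 when there are no rows.
    return max(ids) if ids else 0

print(last_val([3, 7, 5]))  # highest id loaded so far
print(last_val([]))         # empty table -> 0, so all rows get imported
```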
    
#Connect to FDS and import records greater than the lastval
df = sqlContext.read.format("jdbc") \
    .option("url", "{}:{}/{}".format(domain, port, mysqldb)) \
    .option("driver", "com.mysql.jdbc.Driver") \
    .option("dbtable", "(select * from {} where id > {}) as {}".format(table, lastval, table)) \
    .option("user", username) \
    .option("password", password) \
    .load()
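Because dbtable is written as a parenthesized subquery, the id filter is pushed down to MySQL and only the new rows are transferred. With illustrative values (the `table` and `lastval` values below are placeholders), the option resolves to:

```python
table = "table_name"   # placeholder table name
lastval = 100          # placeholder last-loaded id
dbtable = "(select * from {} where id > {}) as {}".format(table, lastval, table)
print(dbtable)
```

The trailing alias is required because JDBC sources treat the dbtable value as a table expression in a FROM clause.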

#register dataframe as temptable
df.registerTempTable("mytempTable")

#Insert into hive table
sqlContext.sql("insert into table {}.{} select * from mytempTable".format(hivedb,table))

I am getting the expected results, and everything is fine up to this point.

 

But when another user tries to update the same `hive` table using the same script, we get an `access control exception`.

The error message is below:

org.apache.hadoop.security.AccessControlException: Permission denied: user=xxxxx, access=WRITE, path="/user/hive/warehouse/testing.db/table_name/_temporary/0"
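Reading the exception closely: user xxxxx needs WRITE access on the job's _temporary staging directory under the table path, which is owned by whichever user created the table. As a diagnostic sketch (not part of the job itself), the relevant fields can be pulled out of the message like this:

```python
import re

msg = ('org.apache.hadoop.security.AccessControlException: Permission denied: '
       'user=xxxxx, access=WRITE, path="/user/hive/warehouse/testing.db/table_name/_temporary/0"')

m = re.search(r'user=(\S+), access=(\w+), path="([^"]+)"', msg)
user, access, path = m.groups()
print(user)    # the account running the second copy of the script
print(access)  # the permission that account lacks
print(path)    # the staging directory Spark writes to before committing output
```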

Why is this happening, and what should I do to avoid this type of error?