Support Questions

How can I change the Hive warehouse directory from /apps/hive/warehouse/ to another directory that I have permission to write to?


I added spark.sql.warehouse.dir in the configuration, but Spark is still not using it; it still points to /apps/hive/warehouse. I even changed the setting in hive-site.xml, but I am still having the same issue. I keep getting this exception:

org.apache.hadoop.ipc.RemoteException: Permission denied: user=user, access=WRITE, inode="/apps/hive/warehouse/position_parquet_test/_temporary/0":hive:hdfs:drwxr-xr-x

Update:

Once I start the Spark shell, the startup info shows that the warehouse was set to the right path:

17/09/27 13:16:46 INFO HiveClientImpl: Warehouse location for Hive client (version 1.2.1) is /uat/06295/app/XTA0/hivedb

However, when I try to create a table, I get the exception mentioned above.
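
For reference, this is how I confirm what the running session actually resolved (a PySpark sketch, using the spark binding the shell provides):

# Print the warehouse path the active session is using.
print(spark.conf.get("spark.sql.warehouse.dir"))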

11 Replies


The property you want to change is "hive.metastore.warehouse.dir".


I am using Spark 2.1.1, and hive.metastore.warehouse.dir is deprecated in that version.


Have you given that a try, i.e. setting hive.metastore.warehouse.dir in the hive-site.xml under Spark's conf directory?
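
Something along these lines in that file (a sketch; the value is the path from your logs, and the exact conf directory depends on the install):

<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/uat/06295/app/XTA0/hivedb</value>
</property>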


Yes, I did. Please look at the log below:

17/09/27 14:30:49 INFO SharedState: spark.sql.warehouse.dir is not set, but hive.metastore.warehouse.dir is set. Setting spark.sql.warehouse.dir to the value of hive.metastore.warehouse.dir ('/uat/06295/app/XTA0/hivedb').
17/09/27 14:30:49 INFO SharedState: Warehouse path is '/uat/06295/app/XTA0/hivedb'.
17/09/27 14:30:49 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3d3238e4{/SQL,null,AVAILABLE,@Spark}
17/09/27 14:30:49 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@53cfe9e5{/SQL/json,null,AVAILABLE,@Spark}
17/09/27 14:30:49 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@75437b4b{/SQL/execution,null,AVAILABLE,@Spark}
17/09/27 14:30:49 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7e94c1d5{/SQL/execution/json,null,AVAILABLE,@Spark}
17/09/27 14:30:49 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@51b0451f{/static/sql,null,AVAILABLE,@Spark}
17/09/27 14:30:49 INFO HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
17/09/27 14:30:49 INFO HiveClientImpl: Attempting to login to Kerberos using principal: and keytab:
17/09/27 14:30:49 INFO UserGroupInformation: Login successful for user using keytab file
17/09/27 14:30:50 INFO metastore: Trying to connect to metastore with URI
17/09/27 14:30:50 INFO metastore: Connected to metastore.
17/09/27 14:30:53 WARN DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
17/09/27 14:30:53 INFO SessionState: Created local directory: /tmp/369e4278-1d5e-46e6-bc5d-b7143e48c525_resources
17/09/27 14:30:53 INFO SessionState: Created HDFS directory: /tmp/hive/
17/09/27 14:30:53 INFO SessionState: Created local directory: /tmp/
17/09/27 14:30:53 INFO SessionState: Created HDFS directory: /tmp/hive/
17/09/27 14:30:53 INFO HiveClientImpl: Warehouse location for Hive client (version 1.2.1) is /uat/06295/app/XTA0/hivedb
17/09/27 14:30:53 INFO Main: Created Spark session with Hive support


Hi @Akrem Latiwesh,

Try setting spark.sql.warehouse.dir in the Spark conf.
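
For example (a PySpark sketch; this property is read when the session starts, so it must be set before the first SparkSession is created):

from pyspark.sql import SparkSession

# Point managed tables at the writable path before any session exists;
# changing spark.sql.warehouse.dir afterwards has no effect.
spark = (SparkSession.builder
         .config("spark.sql.warehouse.dir", "/uat/06295/app/XTA0/hivedb")
         .enableHiveSupport()
         .getOrCreate())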

Thanks,

Aditya


Hi @Aditya Sirna

I am including it in the Spark conf as follows:

spark.sql.warehouse.dir=hdfs:///uat/06295/app/XTA0/hivedb

spark.yarn.security.credentials.hive.enabled=true

But Spark does not use it as the Hive warehouse.

Thanks

Akrem


Hi @Akrem Latiwesh,

Can you try giving "/app/XTA0/hivedb" instead of the hdfs:/// URI? It will pick up the defaultFS from Hadoop's core-site.xml.
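
For illustration (the namenode address here is made up): with fs.defaultFS set to hdfs://namenode.example.com:8020 in core-site.xml, a schemeless value like /uat/06295/app/XTA0/hivedb resolves to hdfs://namenode.example.com:8020/uat/06295/app/XTA0/hivedb.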

Thanks,

Aditya


Thanks @Aditya Sirna

I tried that, and I am still getting the same exception:

Caused by: org.apache.hadoop.ipc.RemoteException: Permission denied: user=USER, access=WRITE, inode="/apps/hive/warehouse/position_parquet_test/_temporary/0":hive:hdfs:drwxr-xr-x


The issue has been resolved. I was creating the table without specifying which database to create it in, so Hive was trying to create it in the default database, which still lives under /apps/hive/warehouse (that is why the exception kept pointing there) and to which I do not have permission to write.

All I needed to do was create a new database and run "USE database_name" before actually creating my table.

Thanks for the help, guys. I wasted too much time on configuration while the fix was a simple query.
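
For anyone who hits the same thing, a minimal sketch of the fix in PySpark (my_db and the column list are placeholders; the LOCATION is the writable path from above):

# Create the table inside a database the user can write to, instead of
# letting it land in Hive's default database under /apps/hive/warehouse.
spark.sql("CREATE DATABASE IF NOT EXISTS my_db LOCATION '/uat/06295/app/XTA0/hivedb/my_db.db'")
spark.sql("USE my_db")
spark.sql("CREATE TABLE position_parquet_test (id INT) STORED AS PARQUET")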