Member since: 09-26-2017
Posts: 7
Kudos Received: 0
Solutions: 0
10-04-2017
07:08 PM
The issue has been resolved. I was creating a table without specifying which database to create it in, so Hive was trying to create the table in the default database, which I do not have write permissions to. All I needed to do was create a new database and run "USE database_name" before creating my table. Thanks for the help, guys; I wasted too much time on configuration while the solution was a simple query.
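For anyone who hits the same thing, a minimal sketch of the fix from spark-shell (the database and table names here are placeholders, not the ones I actually used):

// Create a database in a location you can write to, switch to it, then create the table there.
spark.sql("CREATE DATABASE IF NOT EXISTS mydb")
spark.sql("USE mydb")
spark.sql("CREATE TABLE my_table (id INT, name STRING) STORED AS PARQUET")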
09-27-2017
06:34 PM
Yes, I did. Please look at the log below:
17/09/27 14:30:49 INFO SharedState: spark.sql.warehouse.dir is not set, but hive.metastore.warehouse.dir is set. Setting spark.sql.warehouse.dir to the value of hive.metastore.warehouse.dir ('/uat/06295/app/XTA0/hivedb').
17/09/27 14:30:49 INFO SharedState: Warehouse path is '/uat/06295/app/XTA0/hivedb'.
17/09/27 14:30:49 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3d3238e4{/SQL,null,AVAILABLE,@Spark}
17/09/27 14:30:49 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@53cfe9e5{/SQL/json,null,AVAILABLE,@Spark}
17/09/27 14:30:49 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@75437b4b{/SQL/execution,null,AVAILABLE,@Spark}
17/09/27 14:30:49 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7e94c1d5{/SQL/execution/json,null,AVAILABLE,@Spark}
17/09/27 14:30:49 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@51b0451f{/static/sql,null,AVAILABLE,@Spark}
17/09/27 14:30:49 INFO HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
17/09/27 14:30:49 INFO HiveClientImpl: Attempting to login to Kerberos using principal: and keytab:
17/09/27 14:30:49 INFO UserGroupInformation: Login successful for user using keytab file
17/09/27 14:30:50 INFO metastore: Trying to connect to metastore with URI
17/09/27 14:30:50 INFO metastore: Connected to metastore.
17/09/27 14:30:53 WARN DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
17/09/27 14:30:53 INFO SessionState: Created local directory: /tmp/369e4278-1d5e-46e6-bc5d-b7143e48c525_resources
17/09/27 14:30:53 INFO SessionState: Created HDFS directory: /tmp/hive/
17/09/27 14:30:53 INFO SessionState: Created local directory: /tmp/
17/09/27 14:30:53 INFO SessionState: Created HDFS directory: /tmp/hive/
17/09/27 14:30:53 INFO HiveClientImpl: Warehouse location for Hive client (version 1.2.1) is /uat/06295/app/XTA0/hivedb
17/09/27 14:30:53 INFO Main: Created Spark session with Hive support
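For reference, the session is created roughly like this (a sketch only; the app name is a placeholder and the actual code was not posted):

import org.apache.spark.sql.SparkSession

// A Hive-enabled session. spark.sql.warehouse.dir is not set explicitly here,
// so (per the SharedState log line above) Spark falls back to
// hive.metastore.warehouse.dir from hive-site.xml ('/uat/06295/app/XTA0/hivedb').
val spark = SparkSession.builder()
  .appName("MyApp")        // placeholder app name
  .enableHiveSupport()
  .getOrCreate()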
09-26-2017
08:10 PM
Thanks @Aditya Sirna. I tried that and I am still getting the same exception:
Caused by: org.apache.hadoop.ipc.RemoteException: Permission denied: user=USER, access=WRITE, inode="/apps/hive/warehouse/position_parquet_test/_temporary/0":hive:hdfs:drwxr-xr-x
09-26-2017
07:00 PM
Hi @Aditya Sirna, I am including it in the Spark conf as follows:
spark.sql.warehouse.dir=hdfs:///uat/06295/app/XTA0/hivedb
spark.yarn.security.credentials.hive.enabled=true
But Spark does not use it as the Hive warehouse.
Thanks, Akrem
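One way to check what the running session actually resolved (a sketch; run in spark-shell once the session is up):

// Print the warehouse directory the session is really using.
println(spark.conf.get("spark.sql.warehouse.dir"))
// If this still prints /apps/hive/warehouse, the property set above never reached the session.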
09-26-2017
05:35 PM
I am using Spark 2.1.1, and hive.metastore.warehouse.dir is deprecated in that version (it was replaced by spark.sql.warehouse.dir).
09-26-2017
01:46 PM
I added spark.sql.warehouse.dir to the configuration, but Spark is still not using it; it still points to /apps/hive/warehouse. I even changed the setting in hive-site.xml, but I am still having the same issue. I keep getting this exception:
org.apache.hadoop.ipc.RemoteException: Permission denied: user=user, access=WRITE, inode="/apps/hive/warehouse/position_parquet_test/_temporary/0":hive:hdfs:drwxr-xr-x
Update: once I start spark-shell, the INFO output shows that the warehouse was set to the right path:
17/09/27 13:16:46 INFO HiveClientImpl: Warehouse location for Hive client (version 1.2.1) is /uat/06295/app/XTA0/hivedb
However, when I try to create a table I get the exception mentioned above.
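For context, the write that triggers the exception has this shape (a sketch; "df" stands for whatever DataFrame is being saved, and "mydb" is a placeholder):

// An unqualified table name resolves against the current database (default),
// whose location is /apps/hive/warehouse, where this user has no write access.
df.write.format("parquet").saveAsTable("position_parquet_test")

// Qualifying the name with a database whose location the user can write to avoids that:
df.write.format("parquet").saveAsTable("mydb.position_parquet_test")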
Labels:
- Apache Spark