Created on 05-02-2018 11:19 AM - edited 09-16-2022 06:10 AM
Hi, I am creating a Druid-backed table in Hive (LLAP) from an existing table, as follows:

CREATE TABLE test_druid
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
TBLPROPERTIES (
  "druid.datasource" = "test_druid",
  "druid.segment.granularity" = "MONTH",
  "druid.query.granularity" = "DAY")
AS SELECT cast(trans_date as timestamp) as `__time`, col1, col2, col3
FROM testdb.test_hive_Table
WHERE to_date(trans_Date) >= '2018-01-01';

I am getting the following error. Nowhere did I mention "/druid/segments" as the Druid storage location, so I don't know where it is picking that up. In Ambari, I have set druid.storage.storageDirectory=/user/druid/data. Not sure what is causing the issue. Please help.

Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Permission denied: user=hive, access=WRITE, inode="/druid/segments/ff7e385a0fbf4d84973be64d88f31c02":hdfs:hdfs:drwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
    at org.apache.hadoop.hdfs.server.namenode.FS
Created 05-02-2018 02:21 PM
The path is coming from the Hive configuration as a default. You can set the following property from Ambari to make it global for all sessions:
hive.druid.storage.storageDirectory
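
As a minimal sketch of the fix, the property can also be set per-session before running the CTAS. The /user/druid/data value below is taken from the question's Ambari setting; adjust it to wherever your Druid deep storage actually lives, and make sure the hive user can write there:

```sql
-- Point Hive's Druid storage handler at the same HDFS directory that
-- Druid's druid.storage.storageDirectory uses, instead of the default
-- /druid/segments (which the hive user could not write to).
SET hive.druid.storage.storageDirectory=/user/druid/data;

-- Then re-run the CTAS from the question; segments are now staged
-- under /user/druid/data.
```

Setting it in Ambari (in the Hive configs) makes the change permanent for all sessions, while the SET statement above only affects the current session.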
Created 05-03-2018 12:34 PM
Thanks Slim, that worked.