Druid on Hive LLAP - HDP2.6.1

Explorer
Hi,
I am creating a table in Hive (LLAP) from an existing table, as follows:

CREATE TABLE test_druid
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
TBLPROPERTIES (
  "druid.datasource" = "test_druid",
  "druid.segment.granularity" = "MONTH",
  "druid.query.granularity" = "DAY")
AS
SELECT CAST(trans_date AS timestamp) AS `__time`, col1, col2, col3
FROM testdb.test_hive_Table
WHERE to_date(trans_Date) >= '2018-01-01';

I am getting the following error. Nowhere did I mention the Druid storage directory as "/druid/segments", so I don't know where it is being picked up from.

In Ambari, I have set druid.storage.storageDirectory=/user/druid/data. Not sure what is causing the issue.
Please help.

	Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Permission denied: user=hive, access=WRITE, inode="/druid/segments/ff7e385a0fbf4d84973be64d88f31c02":hdfs:hdfs:drwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
        at org.apache.hadoop.hdfs.server.namenode.FS


2 REPLIES

Expert Contributor

It is coming from the Hive configuration default. You can set the following property in Ambari to make it global for all sessions:

hive.druid.storage.storageDirectory

https://github.com/b-slim/hive/blob/f8bc4868eced2ca83113579b626e279bbe6d5b13/common/src/java/org/apa...
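For a quick test before making the change global in Ambari, the same property can usually be overridden per session. A minimal sketch, assuming Beeline against HiveServer2 Interactive (LLAP), that /user/druid/data (the directory from the question) is writable by the hive user, and that the property is not blocked by the SET whitelist (hive.security.authorization.sqlstd.confwhitelist):

-- Per-session override of the directory where the Druid storage handler
-- stages segments; without this, the HiveConf default /druid/segments is used.
SET hive.druid.storage.storageDirectory=/user/druid/data;

-- Then re-run the CREATE TABLE ... STORED BY
-- 'org.apache.hadoop.hive.druid.DruidStorageHandler' statement from the question;
-- the segments should now be written under the directory set above.

Either way, the value must point at an HDFS path the hive user can write to; the AccessControlException above is exactly that permission check failing on the default /druid/segments.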

Explorer

Thanks Slim, that worked.