
Stuck at Hive-Druid Integration

New Contributor

Hi All,

I am working on a cluster running HDP 2.6, with Hive version 2.1.

I logged into HiveServer2 Interactive from Beeline and tried to create a Druid table from Hive, but encountered an issue.

Below are the steps I ran:

SET hive.druid.broker.address.default=<Broker_IP>:8082;

CREATE TABLE druid_table_1
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
TBLPROPERTIES ("druid.datasource" = "DatasourceName");

It failed; below is the trace from the Hive logs:

2017-11-23T12:30:25,543 INFO [HiveServer2-Handler-Pool: Thread-61] conf.HiveConf: Using the default value passed in for log id: 23c7f013-ef18-4ac1-be64-9903695a115d
2017-11-23T12:30:25,543 INFO [HiveServer2-Handler-Pool: Thread-61] session.SessionState: Updating thread name to 23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61
2017-11-23T12:30:25,543 INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] session.HiveSessionImpl: We are setting the hadoop caller context to 23c7f013-ef18-4ac1-be64-9903695a115d for thread 23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61
2017-11-23T12:30:25,544 INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] ql.Driver: Compiling command(queryId=hive_20171123123025_738ad9fa-6b25-447d-bade-5d873d570b02): CREATE TABLE druid_table_1
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
TBLPROPERTIES ("druid.datasource" = "MSCS3KA")
2017-11-23T12:30:25,545 INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] ql.Driver: We are setting the hadoop caller context from HIVE_SSN_ID:23c7f013-ef18-4ac1-be64-9903695a115d to hive_20171123123025_738ad9fa-6b25-447d-bade-5d873d570b02
2017-11-23T12:30:25,546 INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] parse.CalcitePlanner: Starting Semantic Analysis
2017-11-23T12:30:25,546 INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] parse.CalcitePlanner: Creating table default.druid_table_1 position=13
2017-11-23T12:30:25,547 INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] metastore.HiveMetaStore: 1: get_database: default
2017-11-23T12:30:25,547 INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] HiveMetaStore.audit: ugi=hive ip=unknown-ip-addr cmd=get_database: default
2017-11-23T12:30:25,555 INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] ql.Driver: Semantic Analysis Completed
2017-11-23T12:30:25,555 INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] ql.Driver: Returning Hive schema: Schema(fieldSchemas:null, properties:null)
2017-11-23T12:30:25,556 INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] ql.Driver: Completed compiling command(queryId=hive_20171123123025_738ad9fa-6b25-447d-bade-5d873d570b02); Time taken: 0.011 seconds
2017-11-23T12:30:25,556 INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] ql.Driver: We are resetting the hadoop caller context to HIVE_SSN_ID:23c7f013-ef18-4ac1-be64-9903695a115d
2017-11-23T12:30:25,556 INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] conf.HiveConf: Using the default value passed in for log id: 23c7f013-ef18-4ac1-be64-9903695a115d
2017-11-23T12:30:25,556 INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] session.SessionState: Resetting thread name to HiveServer2-Handler-Pool: Thread-61
2017-11-23T12:30:25,557 INFO [HiveServer2-Handler-Pool: Thread-61] session.HiveSessionImpl: We are resetting the hadoop caller context for thread HiveServer2-Handler-Pool: Thread-61
2017-11-23T12:30:25,557 INFO [HiveServer2-Background-Pool: Thread-112] ql.Driver: Concurrency mode is disabled, not creating a lock manager
2017-11-23T12:30:25,558 INFO [HiveServer2-Background-Pool: Thread-112] ql.Driver: Setting caller context to query id hive_20171123123025_738ad9fa-6b25-447d-bade-5d873d570b02
2017-11-23T12:30:25,558 INFO [HiveServer2-Background-Pool: Thread-112] ql.Driver: Executing command(queryId=hive_20171123123025_738ad9fa-6b25-447d-bade-5d873d570b02): CREATE TABLE druid_table_1
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
TBLPROPERTIES ("druid.datasource" = "MSCS3KA")
2017-11-23T12:30:25,559 INFO [HiveServer2-Background-Pool: Thread-112] hooks.ATSHook: Created ATS Hook
2017-11-23T12:30:25,559 INFO [HiveServer2-Background-Pool: Thread-112] ql.Driver: Starting task [Stage-0:DDL] in serial mode
2017-11-23T12:30:25,560 ERROR [HiveServer2-Background-Pool: Thread-112] exec.DDLTask: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:132)
at org.apache.hadoop.hive.ql.metadata.HiveUtils.getStorageHandler(HiveUtils.java:306)
at org.apache.hadoop.hive.ql.metadata.Table.getStorageHandler(Table.java:290)
at org.apache.hadoop.hive.ql.plan.CreateTableDesc.toTable(CreateTableDesc.java:703)
at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4234)
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:350)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1987)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1667)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1414)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1211)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1204)
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:242)
at org.apache.hive.service.cli.operation.SQLOperation.access$800(SQLOperation.java:91)
at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:336)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:350)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalStateException: Unknown metadata storage type [derby]
at org.apache.hadoop.hive.druid.DruidStorageHandler.<init>(DruidStorageHandler.java:170)
... 31 more

2017-11-23T12:30:25,560 INFO [HiveServer2-Background-Pool: Thread-112] hooks.ATSHook: Created ATS Hook
2017-11-23T12:30:25,561 ERROR [HiveServer2-Background-Pool: Thread-112] ql.Driver: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. null
2017-11-23T12:30:25,561 INFO [HiveServer2-Background-Pool: Thread-112] ql.Driver: Resetting the caller context to HIVE_SSN_ID:23c7f013-ef18-4ac1-be64-9903695a115d
2017-11-23T12:30:25,561 INFO [HiveServer2-Background-Pool: Thread-112] ql.Driver: Completed executing command(queryId=hive_20171123123025_738ad9fa-6b25-447d-bade-5d873d570b02); Time taken: 0.003 seconds
2017-11-23T12:30:25,567 ERROR [HiveServer2-Background-Pool: Thread-112] operation.Operation: Error running hive query:
org.apache.hive.service.cli.HiveSQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. null
at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:376) ~[hive-service-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:244) ~[hive-service-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hive.service.cli.operation.SQLOperation.access$800(SQLOperation.java:91) ~[hive-service-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:336) [hive-service-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_60]
at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_60]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866) [hadoop-common-2.7.3.2.6.3.0-235.jar:?]
at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:350) [hive-service-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_60]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_60]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_60]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_60]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_60]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_60]
at java.lang.Thread.run(Thread.java:745) [?:1.8.0_60]
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_60]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_60]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_60]
at java.lang.reflect.Constructor.newInstance(Constructor.java:422) ~[?:1.8.0_60]
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:132) ~[hadoop-common-2.7.3.2.6.3.0-235.jar:?]
at org.apache.hadoop.hive.ql.metadata.HiveUtils.getStorageHandler(HiveUtils.java:306) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hadoop.hive.ql.metadata.Table.getStorageHandler(Table.java:290) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hadoop.hive.ql.plan.CreateTableDesc.toTable(CreateTableDesc.java:703) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4234) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:350) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1987) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1667) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1414) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1211) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1204) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:242) ~[hive-service-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
... 13 more
Caused by: java.lang.IllegalStateException: Unknown metadata storage type [derby]
at org.apache.hadoop.hive.druid.DruidStorageHandler.<init>(DruidStorageHandler.java:170) ~[hive-druid-handler-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_60]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_60]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_60]
at java.lang.reflect.Constructor.newInstance(Constructor.java:422) ~[?:1.8.0_60]
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:132) ~[hadoop-common-2.7.3.2.6.3.0-235.jar:?]
at org.apache.hadoop.hive.ql.metadata.HiveUtils.getStorageHandler(HiveUtils.java:306) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hadoop.hive.ql.metadata.Table.getStorageHandler(Table.java:290) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hadoop.hive.ql.plan.CreateTableDesc.toTable(CreateTableDesc.java:703) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4234) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:350) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1987) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1667) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1414) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1211) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1204) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:242) ~[hive-service-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]
... 13 more
2017-11-23T12:30:25,571 INFO [HiveServer2-Handler-Pool: Thread-61] conf.HiveConf: Using the default value passed in for log id: 23c7f013-ef18-4ac1-be64-9903695a115d
2017-11-23T12:30:25,571 INFO [HiveServer2-Handler-Pool: Thread-61] session.SessionState: Updating thread name to 23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61
2017-11-23T12:30:25,571 INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] session.HiveSessionImpl: We are setting the hadoop caller context to 23c7f013-ef18-4ac1-be64-9903695a115d for thread 23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61
2017-11-23T12:30:25,572 INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] conf.HiveConf: Using the default value passed in for log id: 23c7f013-ef18-4ac1-be64-9903695a115d
2017-11-23T12:30:25,572 INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] session.SessionState: Resetting thread name to HiveServer2-Handler-Pool: Thread-61
2017-11-23T12:30:25,572 INFO [HiveServer2-Handler-Pool: Thread-61] session.HiveSessionImpl: We are resetting the hadoop caller context for thread HiveServer2-Handler-Pool: Thread-61

If you have any ideas, please let me know.

1 ACCEPTED SOLUTION

Contributor

> Unknown metadata storage type [derby]

It seems you're using Derby for the Druid metadata store. The Druid storage handler only supports MySQL and PostgreSQL at the moment.


5 REPLIES

Contributor

> Unknown metadata storage type [derby]

It seems you're using Derby for the Druid metadata store. The Druid storage handler only supports MySQL and PostgreSQL at the moment.
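
For reference, the Druid metadata store is configured in Druid's common.runtime.properties, not in Hive. A minimal sketch for switching it to MySQL follows; the host, database, user, and password are placeholders, and the mysql-metadata-storage extension must also be available to Druid:

```properties
# Hypothetical common.runtime.properties fragment -- all values are placeholders
druid.metadata.storage.type=mysql
druid.metadata.storage.connector.connectURI=jdbc:mysql://<MySQL_Host>:3306/druid
druid.metadata.storage.connector.user=druid
druid.metadata.storage.connector.password=druid_password
```

After changing this, the Druid services need a restart (on HDP, via Ambari) before Hive will pick up the new storage type.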

Expert Contributor

FYI: Derby is a local, single-instance database intended only for testing. For production, please use MySQL or PostgreSQL.

New Contributor

Thanks @ghagleitner and @Slim for your responses.

Actually, I was able to identify that and resolve the issue.

I am stuck at a different point now.

I have created a data source through a batch-loading script (written in JSON).

I am able to query that data source through a JSON query script and can see all the required columns.
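
For context, a Druid "select" query of the kind mentioned above might look like the sketch below; the datasource name matches this thread, but the interval is a placeholder:

```json
{
  "queryType": "select",
  "dataSource": "MSCS3KA",
  "intervals": ["2017-01-01/2018-01-01"],
  "granularity": "all",
  "dimensions": [],
  "metrics": [],
  "pagingSpec": { "pagingIdentifiers": {}, "threshold": 5 }
}
```

This would be posted to the broker, e.g. `curl -X POST http://<Broker_IP>:8082/druid/v2/?pretty -H 'Content-Type: application/json' -d @select_query.json`.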

But then I created an external table in Hive on top of that Druid data source using the syntax below:

CREATE EXTERNAL TABLE druid_table_1
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
TBLPROPERTIES ("druid.datasource" = "MSCS3KA");

Here I can see that the table has been created with the proper structure. But when I select data from it, only the '__time' column returns values; all the other columns come back NULL. Do you have any idea about this issue?

Please let me know if you have any clue.

Expert Contributor

Can you create a new issue and attach the logs? It is hard to see what is going on without them. One way to check this is to first issue an EXPLAIN query, which will show you the Druid query Hive generates; you can then copy that query and try it yourself via a curl command.
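
The check suggested above can be sketched as follows; the table name is from this thread, and the broker address is a placeholder:

```sql
-- Show the plan, including the Druid JSON query that Hive generates
EXPLAIN SELECT * FROM druid_table_1;

-- Copy the JSON query out of the plan into a file (e.g. q.json) and replay it
-- directly against the broker, bypassing Hive:
--   curl -X POST http://<Broker_IP>:8082/druid/v2/?pretty \
--        -H 'Content-Type: application/json' -d @q.json
```

If the direct Druid query returns the columns but Hive still shows NULLs, the problem likely lies in how the columns are mapped on the Hive side rather than in Druid itself.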

Contributor

Hive applies the schema at read time, but it requires that all files residing in the HDFS location for your table have the same layout.