<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Stuck as Hive- DruidIntegration in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Stuck-as-Hive-DruidIntegration/m-p/208148#M170107</link>
    <description>&lt;P style="margin-left: 40px;"&gt;Thanks &lt;A rel="user" href="https://community.cloudera.com/users/294/ghagleitner.html" nodeid="294"&gt;@ghagleitner&lt;/A&gt; and &lt;A rel="user" href="https://community.cloudera.com/users/12341/sbouguerra.html" nodeid="12341"&gt;@Slim&lt;/A&gt; for your responses.&lt;/P&gt;&lt;P style="margin-left: 40px;"&gt;Actually I was able to identify that and resolve the issue.&lt;/P&gt;&lt;P style="margin-left: 40px;"&gt;I am stuck at a different point now.&lt;/P&gt;&lt;P style="margin-left: 40px;"&gt;I have created a data source through batch loading script (written in Json). &lt;/P&gt;&lt;P style="margin-left: 40px;"&gt;I am able to query that data source through Json Query Script and could see all required columns.&lt;/P&gt;&lt;P style="margin-left: 40px;"&gt;But then I created an External table in Hive on top of that Druid data source using below syntax :&lt;/P&gt;&lt;P style="margin-left: 40px;"&gt;&lt;/P&gt;&lt;P style="margin-left: 40px;"&gt;&lt;CODE&gt;CREATE EXTERNAL TABLE druid_table_1&lt;/CODE&gt;
&lt;CODE&gt;STORED BY &lt;/CODE&gt;&lt;CODE&gt;'org.apache.hadoop.hive.druid.DruidStorageHandler'&lt;/CODE&gt;
&lt;CODE&gt;TBLPROPERTIES (&lt;/CODE&gt;&lt;CODE&gt;"druid.datasource"&lt;/CODE&gt; &lt;CODE&gt;= &lt;/CODE&gt;&lt;CODE&gt;"MSCS3KA"&lt;/CODE&gt;&lt;CODE&gt;);&lt;/CODE&gt;&lt;/P&gt;&lt;P style="margin-left: 40px;"&gt;Here I can see the table has been created with proper structure.. But when I tried selecting data from it, it is showing values for only the '__time' column.. Rest all columns are showing NULL values ! Do you have any idea about this issue?&lt;/P&gt;&lt;P style="margin-left: 40px;"&gt;Please let me know if you have any clue.&lt;/P&gt;</description>
    <pubDate>Mon, 04 Dec 2017 14:58:46 GMT</pubDate>
    <dc:creator>deysandip81</dc:creator>
    <dc:date>2017-12-04T14:58:46Z</dc:date>
    <item>
      <title>Stuck as Hive- DruidIntegration</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Stuck-as-Hive-DruidIntegration/m-p/208145#M170104</link>
      <description>&lt;P&gt;Hi All,&lt;/P&gt;&lt;P&gt;I am working on a cluster running HDP 2.6 with Hive version 2.1.&lt;/P&gt;&lt;P&gt;I logged into HiveServer2 Interactive from Beeline, tried to create a Druid table from Hive, and encountered an issue.&lt;/P&gt;&lt;P&gt;Below are the steps performed:&lt;/P&gt;&lt;P&gt;SET hive.druid.broker.address.default=&amp;lt;Broker_IP&amp;gt;:8082;&lt;/P&gt;&lt;P&gt;CREATE TABLE druid_table_1&lt;BR /&gt;STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'&lt;BR /&gt;TBLPROPERTIES ("druid.datasource" = "DatasourceName");&lt;/P&gt;&lt;P&gt;It failed; below is the trace from the Hive logs:&lt;/P&gt;&lt;P&gt;2017-11-23T12:30:25,543  INFO [HiveServer2-Handler-Pool: Thread-61] conf.HiveConf: Using the default value passed in for log id: 23c7f013-ef18-4ac1-be64-9903695a115d&lt;BR /&gt;2017-11-23T12:30:25,543  INFO [HiveServer2-Handler-Pool: Thread-61] session.SessionState: Updating thread name to 23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61&lt;BR /&gt;2017-11-23T12:30:25,543  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] session.HiveSessionImpl: We are setting the hadoop caller context to 23c7f013-ef18-4ac1-be64-9903695a115d for thread 23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61&lt;BR /&gt;2017-11-23T12:30:25,544  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] ql.Driver: Compiling command(queryId=hive_20171123123025_738ad9fa-6b25-447d-bade-5d873d570b02): CREATE TABLE druid_table_1&lt;BR /&gt;STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'&lt;BR /&gt;TBLPROPERTIES ("druid.datasource" = "MSCS3KA")&lt;BR /&gt;2017-11-23T12:30:25,545  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] ql.Driver: We are setting the hadoop caller context from HIVE_SSN_ID:23c7f013-ef18-4ac1-be64-9903695a115d to 
hive_20171123123025_738ad9fa-6b25-447d-bade-5d873d570b02&lt;BR /&gt;2017-11-23T12:30:25,546  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] parse.CalcitePlanner: Starting Semantic Analysis&lt;BR /&gt;2017-11-23T12:30:25,546  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] parse.CalcitePlanner: Creating table default.druid_table_1 position=13&lt;BR /&gt;2017-11-23T12:30:25,547  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] metastore.HiveMetaStore: 1: get_database: default&lt;BR /&gt;2017-11-23T12:30:25,547  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] HiveMetaStore.audit: ugi=hive ip=unknown-ip-addr  cmd=get_database: default&lt;BR /&gt;2017-11-23T12:30:25,555  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] ql.Driver: Semantic Analysis Completed&lt;BR /&gt;2017-11-23T12:30:25,555  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] ql.Driver: Returning Hive schema: Schema(fieldSchemas:null, properties:null)&lt;BR /&gt;2017-11-23T12:30:25,556  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] ql.Driver: Completed compiling command(queryId=hive_20171123123025_738ad9fa-6b25-447d-bade-5d873d570b02); Time taken: 0.011 seconds&lt;BR /&gt;2017-11-23T12:30:25,556  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] ql.Driver: We are resetting the hadoop caller context to HIVE_SSN_ID:23c7f013-ef18-4ac1-be64-9903695a115d&lt;BR /&gt;2017-11-23T12:30:25,556  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] conf.HiveConf: Using the default value passed in for log id: 23c7f013-ef18-4ac1-be64-9903695a115d&lt;BR /&gt;2017-11-23T12:30:25,556  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] session.SessionState: Resetting thread name to  HiveServer2-Handler-Pool: 
Thread-61&lt;BR /&gt;2017-11-23T12:30:25,557  INFO [HiveServer2-Handler-Pool: Thread-61] session.HiveSessionImpl: We are resetting the hadoop caller context for thread HiveServer2-Handler-Pool: Thread-61&lt;BR /&gt;2017-11-23T12:30:25,557  INFO [HiveServer2-Background-Pool: Thread-112] ql.Driver: Concurrency mode is disabled, not creating a lock manager&lt;BR /&gt;2017-11-23T12:30:25,558  INFO [HiveServer2-Background-Pool: Thread-112] ql.Driver: Setting caller context to query id hive_20171123123025_738ad9fa-6b25-447d-bade-5d873d570b02&lt;BR /&gt;2017-11-23T12:30:25,558  INFO [HiveServer2-Background-Pool: Thread-112] ql.Driver: Executing command(queryId=hive_20171123123025_738ad9fa-6b25-447d-bade-5d873d570b02): CREATE TABLE druid_table_1&lt;BR /&gt;STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'&lt;BR /&gt;TBLPROPERTIES ("druid.datasource" = "MSCS3KA")&lt;BR /&gt;2017-11-23T12:30:25,559  INFO [HiveServer2-Background-Pool: Thread-112] hooks.ATSHook: Created ATS Hook&lt;BR /&gt;2017-11-23T12:30:25,559  INFO [HiveServer2-Background-Pool: Thread-112] ql.Driver: Starting task [Stage-0:DDL] in serial mode&lt;BR /&gt;2017-11-23T12:30:25,560 ERROR [HiveServer2-Background-Pool: Thread-112] exec.DDLTask: java.lang.reflect.InvocationTargetException&lt;BR /&gt;  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)&lt;BR /&gt;  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)&lt;BR /&gt;  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)&lt;BR /&gt;  at java.lang.reflect.Constructor.newInstance(Constructor.java:422)&lt;BR /&gt;  at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:132)&lt;BR /&gt;  at org.apache.hadoop.hive.ql.metadata.HiveUtils.getStorageHandler(HiveUtils.java:306)&lt;BR /&gt;  at org.apache.hadoop.hive.ql.metadata.Table.getStorageHandler(Table.java:290)&lt;BR /&gt;  at 
org.apache.hadoop.hive.ql.plan.CreateTableDesc.toTable(CreateTableDesc.java:703)&lt;BR /&gt;  at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4234)&lt;BR /&gt;  at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:350)&lt;BR /&gt;  at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199)&lt;BR /&gt;  at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)&lt;BR /&gt;  at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1987)&lt;BR /&gt;  at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1667)&lt;BR /&gt;  at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1414)&lt;BR /&gt;  at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1211)&lt;BR /&gt;  at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1204)&lt;BR /&gt;  at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:242)&lt;BR /&gt;  at org.apache.hive.service.cli.operation.SQLOperation.access$800(SQLOperation.java:91)&lt;BR /&gt;  at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:336)&lt;BR /&gt;  at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;  at javax.security.auth.Subject.doAs(Subject.java:422)&lt;BR /&gt;  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)&lt;BR /&gt;  at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:350)&lt;BR /&gt;  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)&lt;BR /&gt;  at java.util.concurrent.FutureTask.run(FutureTask.java:266)&lt;BR /&gt;  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)&lt;BR /&gt;  at java.util.concurrent.FutureTask.run(FutureTask.java:266)&lt;BR /&gt;  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)&lt;BR /&gt;  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)&lt;BR /&gt;  at 
java.lang.Thread.run(Thread.java:745)&lt;BR /&gt;Caused by: java.lang.IllegalStateException: Unknown metadata storage type [derby]&lt;BR /&gt;  at org.apache.hadoop.hive.druid.DruidStorageHandler.&amp;lt;init&amp;gt;(DruidStorageHandler.java:170)&lt;BR /&gt;  ... 31 more&lt;BR /&gt;&lt;BR /&gt;2017-11-23T12:30:25,560  INFO [HiveServer2-Background-Pool: Thread-112] hooks.ATSHook: Created ATS Hook&lt;BR /&gt;2017-11-23T12:30:25,561 ERROR [HiveServer2-Background-Pool: Thread-112] ql.Driver: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. null&lt;BR /&gt;2017-11-23T12:30:25,561  INFO [HiveServer2-Background-Pool: Thread-112] ql.Driver: Resetting the caller context to HIVE_SSN_ID:23c7f013-ef18-4ac1-be64-9903695a115d&lt;BR /&gt;2017-11-23T12:30:25,561  INFO [HiveServer2-Background-Pool: Thread-112] ql.Driver: Completed executing command(queryId=hive_20171123123025_738ad9fa-6b25-447d-bade-5d873d570b02); Time taken: 0.003 seconds&lt;BR /&gt;2017-11-23T12:30:25,567 ERROR [HiveServer2-Background-Pool: Thread-112] operation.Operation: Error running hive query:&lt;BR /&gt;org.apache.hive.service.cli.HiveSQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. 
null&lt;BR /&gt;  at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:376) ~[hive-service-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:244) ~[hive-service-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hive.service.cli.operation.SQLOperation.access$800(SQLOperation.java:91) ~[hive-service-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:336) [hive-service-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_60]&lt;BR /&gt;  at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_60]&lt;BR /&gt;  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866) [hadoop-common-2.7.3.2.6.3.0-235.jar:?]&lt;BR /&gt;  at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:350) [hive-service-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_60]&lt;BR /&gt;  at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_60]&lt;BR /&gt;  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_60]&lt;BR /&gt;  at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_60]&lt;BR /&gt;  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_60]&lt;BR /&gt;  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_60]&lt;BR /&gt;  at java.lang.Thread.run(Thread.java:745) [?:1.8.0_60]&lt;BR /&gt;Caused by: java.lang.reflect.InvocationTargetException&lt;BR /&gt;  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_60]&lt;BR /&gt;  at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_60]&lt;BR /&gt;  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_60]&lt;BR /&gt;  at java.lang.reflect.Constructor.newInstance(Constructor.java:422) ~[?:1.8.0_60]&lt;BR /&gt;  at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:132) ~[hadoop-common-2.7.3.2.6.3.0-235.jar:?]&lt;BR /&gt;  at org.apache.hadoop.hive.ql.metadata.HiveUtils.getStorageHandler(HiveUtils.java:306) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hadoop.hive.ql.metadata.Table.getStorageHandler(Table.java:290) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hadoop.hive.ql.plan.CreateTableDesc.toTable(CreateTableDesc.java:703) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4234) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:350) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1987) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1667) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1414) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1211) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at 
org.apache.hadoop.hive.ql.Driver.run(Driver.java:1204) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:242) ~[hive-service-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  ... 13 more&lt;BR /&gt;Caused by: java.lang.IllegalStateException: Unknown metadata storage type [derby]&lt;BR /&gt;  at org.apache.hadoop.hive.druid.DruidStorageHandler.&amp;lt;init&amp;gt;(DruidStorageHandler.java:170) ~[hive-druid-handler-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_60]&lt;BR /&gt;  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_60]&lt;BR /&gt;  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_60]&lt;BR /&gt;  at java.lang.reflect.Constructor.newInstance(Constructor.java:422) ~[?:1.8.0_60]&lt;BR /&gt;  at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:132) ~[hadoop-common-2.7.3.2.6.3.0-235.jar:?]&lt;BR /&gt;  at org.apache.hadoop.hive.ql.metadata.HiveUtils.getStorageHandler(HiveUtils.java:306) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hadoop.hive.ql.metadata.Table.getStorageHandler(Table.java:290) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hadoop.hive.ql.plan.CreateTableDesc.toTable(CreateTableDesc.java:703) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4234) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:350) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR 
/&gt;  at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1987) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1667) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1414) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1211) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1204) ~[hive-exec-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:242) ~[hive-service-2.1.0.2.6.3.0-235.jar:2.1.0.2.6.3.0-235]&lt;BR /&gt;  ... 13 more&lt;BR /&gt;2017-11-23T12:30:25,571  INFO [HiveServer2-Handler-Pool: Thread-61] conf.HiveConf: Using the default value passed in for log id: 23c7f013-ef18-4ac1-be64-9903695a115d&lt;BR /&gt;2017-11-23T12:30:25,571  INFO [HiveServer2-Handler-Pool: Thread-61] session.SessionState: Updating thread name to 23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61&lt;BR /&gt;2017-11-23T12:30:25,571  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] session.HiveSessionImpl: We are setting the hadoop caller context to 23c7f013-ef18-4ac1-be64-9903695a115d for thread 23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61&lt;BR /&gt;2017-11-23T12:30:25,572  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] conf.HiveConf: Using the default value passed in for log id: 23c7f013-ef18-4ac1-be64-9903695a115d&lt;BR /&gt;2017-11-23T12:30:25,572  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] session.SessionState: 
Resetting thread name to  HiveServer2-Handler-Pool: Thread-61&lt;BR /&gt;2017-11-23T12:30:25,572  INFO [HiveServer2-Handler-Pool: Thread-61] session.HiveSessionImpl: We are resetting the hadoop caller context for thread HiveServer2-Handler-Pool: Thread-61&lt;BR /&gt;2017-11-23T12:30:25,573  INFO [HiveServer2-Handler-Pool: Thread-61] conf.HiveConf: Using the default value passed in for log id: 23c7f013-ef18-4ac1-be64-9903695a115d&lt;BR /&gt;2017-11-23T12:30:25,573  INFO [HiveServer2-Handler-Pool: Thread-61] session.SessionState: Updating thread name to 23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61&lt;BR /&gt;2017-11-23T12:30:25,573  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] session.HiveSessionImpl: We are setting the hadoop caller context to 23c7f013-ef18-4ac1-be64-9903695a115d for thread 23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61&lt;BR /&gt;2017-11-23T12:30:25,573  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] conf.HiveConf: Using the default value passed in for log id: 23c7f013-ef18-4ac1-be64-9903695a115d&lt;BR /&gt;2017-11-23T12:30:25,573  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] session.SessionState: Resetting thread name to  HiveServer2-Handler-Pool: Thread-61&lt;BR /&gt;2017-11-23T12:30:25,573  INFO [HiveServer2-Handler-Pool: Thread-61] session.HiveSessionImpl: We are resetting the hadoop caller context for thread HiveServer2-Handler-Pool: Thread-61&lt;BR /&gt;2017-11-23T12:30:25,573  INFO [HiveServer2-Handler-Pool: Thread-61] conf.HiveConf: Using the default value passed in for log id: 23c7f013-ef18-4ac1-be64-9903695a115d&lt;BR /&gt;2017-11-23T12:30:25,573  INFO [HiveServer2-Handler-Pool: Thread-61] session.SessionState: Updating thread name to 23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61&lt;BR /&gt;2017-11-23T12:30:25,573  INFO [23c7f013-ef18-4ac1-be64-9903695a115d 
HiveServer2-Handler-Pool: Thread-61] session.HiveSessionImpl: We are setting the hadoop caller context to 23c7f013-ef18-4ac1-be64-9903695a115d for thread 23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61&lt;BR /&gt;2017-11-23T12:30:25,574  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] conf.HiveConf: Using the default value passed in for log id: 23c7f013-ef18-4ac1-be64-9903695a115d&lt;BR /&gt;2017-11-23T12:30:25,574  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] session.SessionState: Resetting thread name to  HiveServer2-Handler-Pool: Thread-61&lt;BR /&gt;2017-11-23T12:30:25,574  INFO [HiveServer2-Handler-Pool: Thread-61] session.HiveSessionImpl: We are resetting the hadoop caller context for thread HiveServer2-Handler-Pool: Thread-61&lt;BR /&gt;2017-11-23T12:30:25,574  INFO [HiveServer2-Handler-Pool: Thread-61] conf.HiveConf: Using the default value passed in for log id: 23c7f013-ef18-4ac1-be64-9903695a115d&lt;BR /&gt;2017-11-23T12:30:25,574  INFO [HiveServer2-Handler-Pool: Thread-61] session.SessionState: Updating thread name to 23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61&lt;BR /&gt;2017-11-23T12:30:25,574  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] session.HiveSessionImpl: We are setting the hadoop caller context to 23c7f013-ef18-4ac1-be64-9903695a115d for thread 23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61&lt;BR /&gt;2017-11-23T12:30:25,574  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] conf.HiveConf: Using the default value passed in for log id: 23c7f013-ef18-4ac1-be64-9903695a115d&lt;BR /&gt;2017-11-23T12:30:25,575  INFO [23c7f013-ef18-4ac1-be64-9903695a115d HiveServer2-Handler-Pool: Thread-61] session.SessionState: Resetting thread name to  HiveServer2-Handler-Pool: Thread-61&lt;BR /&gt;2017-11-23T12:30:25,575  INFO [HiveServer2-Handler-Pool: 
Thread-61] session.HiveSessionImpl: We are resetting the hadoop caller context for thread HiveServer2-Handler-Pool: Thread-61 &lt;/P&gt;&lt;P&gt;If you have any idea, please let me know&lt;/P&gt;</description>
      <pubDate>Fri, 16 Sep 2022 12:33:56 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Stuck-as-Hive-DruidIntegration/m-p/208145#M170104</guid>
      <dc:creator>deysandip81</dc:creator>
      <dc:date>2022-09-16T12:33:56Z</dc:date>
    </item>
    <item>
      <title>Re: Stuck as Hive- DruidIntegration</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Stuck-as-Hive-DruidIntegration/m-p/208146#M170105</link>
      <description>&lt;P&gt;&amp;gt; Unknown metadata storage type [derby]&lt;/P&gt;&lt;P&gt;It seems you're using Derby for the Druid metadata store. The Druid storage handler only supports MySQL and PostgreSQL at the moment.&lt;/P&gt;</description>
      <pubDate>Fri, 01 Dec 2017 01:38:45 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Stuck-as-Hive-DruidIntegration/m-p/208146#M170105</guid>
      <dc:creator>ghagleitner</dc:creator>
      <dc:date>2017-12-01T01:38:45Z</dc:date>
    </item>
    <item>
      <title>Re: Stuck as Hive- DruidIntegration</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Stuck-as-Hive-DruidIntegration/m-p/208147#M170106</link>
      <description>&lt;P&gt;FYI, Derby is a local, single-instance database intended only for testing. For production, please use MySQL or PostgreSQL.&lt;/P&gt;</description>
      <pubDate>Fri, 01 Dec 2017 02:00:44 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Stuck-as-Hive-DruidIntegration/m-p/208147#M170106</guid>
      <dc:creator>sbouguerra</dc:creator>
      <dc:date>2017-12-01T02:00:44Z</dc:date>
    </item>
    <item>
      <title>Re: Stuck as Hive- DruidIntegration</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Stuck-as-Hive-DruidIntegration/m-p/208148#M170107</link>
      <description>&lt;P style="margin-left: 40px;"&gt;Thanks &lt;A rel="user" href="https://community.cloudera.com/users/294/ghagleitner.html" nodeid="294"&gt;@ghagleitner&lt;/A&gt; and &lt;A rel="user" href="https://community.cloudera.com/users/12341/sbouguerra.html" nodeid="12341"&gt;@Slim&lt;/A&gt; for your responses.&lt;/P&gt;&lt;P style="margin-left: 40px;"&gt;Actually I was able to identify that and resolve the issue.&lt;/P&gt;&lt;P style="margin-left: 40px;"&gt;I am stuck at a different point now.&lt;/P&gt;&lt;P style="margin-left: 40px;"&gt;I have created a data source through batch loading script (written in Json). &lt;/P&gt;&lt;P style="margin-left: 40px;"&gt;I am able to query that data source through Json Query Script and could see all required columns.&lt;/P&gt;&lt;P style="margin-left: 40px;"&gt;But then I created an External table in Hive on top of that Druid data source using below syntax :&lt;/P&gt;&lt;P style="margin-left: 40px;"&gt;&lt;/P&gt;&lt;P style="margin-left: 40px;"&gt;&lt;CODE&gt;CREATE EXTERNAL TABLE druid_table_1&lt;/CODE&gt;
&lt;CODE&gt;STORED BY &lt;/CODE&gt;&lt;CODE&gt;'org.apache.hadoop.hive.druid.DruidStorageHandler'&lt;/CODE&gt;
&lt;CODE&gt;TBLPROPERTIES (&lt;/CODE&gt;&lt;CODE&gt;"druid.datasource"&lt;/CODE&gt; &lt;CODE&gt;= &lt;/CODE&gt;&lt;CODE&gt;"MSCS3KA"&lt;/CODE&gt;&lt;CODE&gt;);&lt;/CODE&gt;&lt;/P&gt;&lt;P style="margin-left: 40px;"&gt;Here I can see the table has been created with proper structure.. But when I tried selecting data from it, it is showing values for only the '__time' column.. Rest all columns are showing NULL values ! Do you have any idea about this issue?&lt;/P&gt;&lt;P style="margin-left: 40px;"&gt;Please let me know if you have any clue.&lt;/P&gt;</description>
      <pubDate>Mon, 04 Dec 2017 14:58:46 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Stuck-as-Hive-DruidIntegration/m-p/208148#M170107</guid>
      <dc:creator>deysandip81</dc:creator>
      <dc:date>2017-12-04T14:58:46Z</dc:date>
    </item>
    <item>
      <title>Re: Stuck as Hive- DruidIntegration</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Stuck-as-Hive-DruidIntegration/m-p/208149#M170108</link>
      <description>&lt;P&gt;Can you create a new issue and attach the logs? It is hard to see what is going on without them. One way to check this is to first run an EXPLAIN on the query, which will show you the generated Druid query; you can then copy that query and try it yourself via a curl command.&lt;/P&gt;</description>
      <pubDate>Wed, 06 Dec 2017 06:50:33 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Stuck-as-Hive-DruidIntegration/m-p/208149#M170108</guid>
      <dc:creator>sbouguerra</dc:creator>
      <dc:date>2017-12-06T06:50:33Z</dc:date>
    </item>
    <item>
      <title>Re: Stuck as Hive- DruidIntegration</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Stuck-as-Hive-DruidIntegration/m-p/208150#M170109</link>
      <description>&lt;P&gt;Note that Hive requires all files residing in the HDFS location for your table to have the same layout.&lt;/P&gt;</description>
      <pubDate>Wed, 13 Mar 2019 14:49:47 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Stuck-as-Hive-DruidIntegration/m-p/208150#M170109</guid>
      <dc:creator>samuel_peeters</dc:creator>
      <dc:date>2019-03-13T14:49:47Z</dc:date>
    </item>
  </channel>
</rss>

