
Not able to create a table from Hive on Druid data sources.


2: jdbc:hive2://devhcdl2.azure.ril.com:2181,d> CREATE EXTERNAL TABLE druid_table STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler' TBLPROPERTIES ("druid.datasource" = "csvexample");
INFO : Compiling command(queryId=hive_20190124083056_e115ce4b-8009-41f0-8a9d-5f140e9292bc): CREATE EXTERNAL TABLE druid_table STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler' TBLPROPERTIES ("druid.datasource" = "csvexample")
INFO : Semantic Analysis Completed (retrial = false)
INFO : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
INFO : Completed compiling command(queryId=hive_20190124083056_e115ce4b-8009-41f0-8a9d-5f140e9292bc); Time taken: 0.043 seconds
INFO : Executing command(queryId=hive_20190124083056_e115ce4b-8009-41f0-8a9d-5f140e9292bc): CREATE EXTERNAL TABLE druid_table STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler' TBLPROPERTIES ("druid.datasource" = "csvexample")
INFO : Starting task [Stage-0:DDL] in serial mode
ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
java.lang.RuntimeException: MetaException(message:org.apache.hadoop.hive.serde2.SerDeException org.apache.hive.druid.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot deserialize instance of `java.util.ArrayList` out of START_OBJECT token
 at [Source: (org.apache.hive.druid.com.metamx.http.client.io.AppendableByteArrayInputStream); line: -1, column: 4]
	at org.apache.hive.druid.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
	at org.apache.hive.druid.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1342)
	at org.apache.hive.druid.com.fasterxml.jackson.databind.DeserializationContext.handleUnexpectedToken(DeserializationContext.java:1138)
	at org.apache.hive.druid.com.fasterxml.jackson.databind.DeserializationContext.handleUnexpectedToken(DeserializationContext.java:1092)
	at org.apache.hive.druid.com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.handleNonArray(CollectionDeserializer.java:332)
	at org.apache.hive.druid.com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:265)
	at org.apache.hive.druid.com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:245)
	at org.apache.hive.druid.com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:27)
	at org.apache.hive.druid.com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4001)
	at org.apache.hive.druid.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3065)
	at org.apache.hadoop.hive.druid.serde.DruidSerDe.submitMetadataRequest(DruidSerDe.java:266)
	at org.apache.hadoop.hive.druid.serde.DruidSerDe.initFromMetaDataQuery(DruidSerDe.java:164)
	at org.apache.hadoop.hive.druid.serde.DruidSerDe.initialize(DruidSerDe.java:130)
	at org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:54)
	at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:540)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreUtils.getDeserializer(HiveMetaStoreUtils.java:90)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreUtils.getDeserializer(HiveMetaStoreUtils.java:77)
	at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:294)
	at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:276)
	at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:974)
	at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:1017)
	at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4964)
	at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:395)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:210)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2701)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2372)
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2048)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1746)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1740)
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157)
	at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:226)
	at org.apache.hive.service.cli.operation.SQLOperation.access$700(SQLOperation.java:87)
	at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:318)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
	at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:331)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
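Reading the top of the trace: Hive's DruidSerDe (`submitMetadataRequest`) sends a segmentMetadata query to the Druid broker and deserializes the reply as a `java.util.ArrayList`, i.e. it requires the response to be a JSON array. Jackson's "Cannot deserialize instance of `java.util.ArrayList` out of START_OBJECT token" means the broker answered with a JSON object instead, which typically happens when the broker returns an error body or an incompatible response shape. A minimal Python sketch of that mismatch; the query body follows the Druid segmentMetadata query format, and both sample replies are illustrative assumptions, not captured output:

```python
import json

# Roughly what Hive posts to the broker for a datasource named "csvexample"
# (the datasource from the session above). Field names follow the Druid
# segmentMetadata query format; this is a sketch, not Hive's actual payload.
query = {
    "queryType": "segmentMetadata",
    "dataSource": "csvexample",
    "merge": True,
}

# A healthy broker answers with a JSON array of segment analyses (assumed shape):
ok_reply = '[{"id": "csvexample_2019-01-01", "columns": {}}]'

# An error reply is a JSON object, e.g. {"error": "..."} -- exactly the
# START_OBJECT token that Jackson rejects when it expects an ArrayList.
bad_reply = '{"error": "Unknown exception"}'

def looks_like_segment_metadata(body: str) -> bool:
    """Return True only when the reply is the JSON array Hive's SerDe expects."""
    return isinstance(json.loads(body), list)

print(looks_like_segment_metadata(ok_reply))   # array  -> Hive can deserialize it
print(looks_like_segment_metadata(bad_reply))  # object -> MismatchedInputException
```

If the broker's raw reply to that query is an object rather than an array, the problem is on the Druid side (error response, wrong endpoint, or a Hive/Druid version mismatch) rather than in the DDL statement itself.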