Druid Hive Integration

Explorer

Getting the error below when creating a table from Hive with Druid integration:

/usr/hdp/2.6.3.0-235/hive/bin/hive.distro: line 106: [: /usr/hdp/2.6.3.0-235/hive/lib/hive-metastore-1.2.1000.2.6.3.0-235.jar: binary operator expected
log4j:WARN No such property [maxFileSize] in org.apache.log4j.DailyRollingFileAppender.
Logging initialized using configuration in file:/etc/hive/2.6.3.0-235/0/hive-log4j.properties

hive> CREATE TABLE druid_table_1
    > (`time` TIMESTAMP, `dimension1` STRING)
    > STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler';
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.hive.ql.session.SessionState$LogHelper.<init>(Lorg/slf4j/Logger;)V
    at org.apache.hadoop.hive.druid.DruidStorageHandler.<clinit>(DruidStorageHandler.java:93)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.hadoop.hive.ql.parse.ParseUtils.ensureClassExists(ParseUtils.java:230)
    at org.apache.hadoop.hive.ql.parse.StorageFormat.fillStorageFormat(StorageFormat.java:64)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeCreateTable(SemanticAnalyzer.java:11210)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:10391)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10477)
    at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:219)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:238)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:465)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:321)
    at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1224)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1265)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1161)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1151)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:217)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:169)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:380)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:740)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:685)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:233)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:148)

Please help--


Explorer

Can somebody please help resolve this issue?

Rising Star
@Sateesh Battu

Please check whether the hive.druid.broker.address.default property is set in the Hive configs.
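A quick way to confirm from a shell that the property made it into the client-side config; a minimal sketch, assuming the usual HDP conf paths (adjust them for your cluster):

# check the regular Hive client config
grep -B 1 -A 2 'hive.druid.broker.address.default' /etc/hive/conf/hive-site.xml

# check the Hive 2 / interactive config as well, if it exists
grep -B 1 -A 2 'hive.druid.broker.address.default' /etc/hive2/conf/hive-site.xml 2>/dev/null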

Explorer

@DILEEP KUMAR CHIGURUVADA

Hi Dileep, this property was already set, but the issue still exists.

Explorer

@Dileep Kumar Chiguruvada When I set this property from the Hive client, it throws the error below. In the Ambari Hive configs, however, this property is already set.

SET hive.druid.broker.address.default=localhost:8082;

/usr/hdp/2.6.3.0-235/hive/bin/hive.distro: line 106: [: /usr/hdp/2.6.3.0-235/hive/lib/hive-metastore-1.2.1000.2.6.3.0-235.jar: binary operator expected
log4j:WARN No such property [maxFileSize] in org.apache.log4j.DailyRollingFileAppender.
Logging initialized using configuration in file:/etc/hive/2.6.3.0-235/0/hive-log4j.properties

hive> SET hive.druid.broker.address.default=localhost:8082;

Query returned non-zero code: 1, cause: hive configuration hive.druid.broker.address.default does not exists.

hive>


Please help--

Explorer

@Sateesh Battu,

It looks like you have a conflict between your logging libraries. The line below is the main cause of the failure:

java.lang.NoSuchMethodError: org.apache.hadoop.hive.ql.session.SessionState$LogHelper.<init>(Lorg/slf4j/Logger;)V
    at org.apache.hadoop.hive.druid.DruidStorageHandler.<clinit>(DruidStorageHandler.java:93)

Make sure you are connecting to HiveServer2 and have the correct logging jars on the classpath.
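If it helps, here is a minimal sketch of retrying the DDL through HiveServer2 with Beeline instead of the old hive CLI. The JDBC URL, port, and user are placeholders (on HDP 2.6 the Hive Interactive / Hive 2.x server often listens on 10500 rather than 10000), so substitute your own endpoint:

# connect through HiveServer2 rather than the local hive CLI
beeline -u "jdbc:hive2://localhost:10000/default" -n hive

-- at the beeline prompt, retry the DDL from the original post
CREATE TABLE druid_table_1 (`time` TIMESTAMP, `dimension1` STRING)
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler';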

Explorer

@Nishant Bangarwa, can you let me know which library should be retained? Is it the hive-exec jar? Which other jars might contain this method and should be removed?

In short, how do we identify which jars are conflicting?

We are connecting to hive-server2.
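One way to narrow this down is to list every jar on the Hive classpath that actually bundles the SessionState$LogHelper class. This is a sketch, assuming the HDP 2.6.3 lib layout shown in your logs; adjust the path to your install:

# print every jar under the Hive lib dir that contains the conflicting class
for j in /usr/hdp/2.6.3.0-235/hive/lib/*.jar; do
  if unzip -l "$j" 2>/dev/null | grep -q 'org/apache/hadoop/hive/ql/session/SessionState\$LogHelper.class'; then
    echo "$j"
  fi
done

If more than one jar reports the class, or an unexpectedly old Hive jar does, that duplicate is the usual suspect for a NoSuchMethodError like this one.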
