
Hive CLI not launching


SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/spark/lib/spark-assembly-1.4.1.2.3.2.0-2950-hadoop2.7.1.2.3.2.0-2950.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
WARNING: Use "yarn jar" to launch YARN applications.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/spark/lib/spark-assembly-1.4.1.2.3.2.0-2950-hadoop2.7.1.2.3.2.0-2950.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

Logging initialized using configuration in file:/etc/hive/2.3.2.0-2950/0/hive-log4j.properties

1 ACCEPTED SOLUTION

Super Guru
@ravi kumar

First, check whether the HDFS and YARN services are in a healthy state in the cluster. If everything looks fine, try enabling Hive debug logging and review the console output for any possible issue:

hive --hiveconf hive.root.logger=DEBUG,console
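
For reference, here is a minimal sketch of those checks from a shell on a cluster node (standard HDFS/YARN client commands; adjust users and paths to your environment):

# Verify HDFS is healthy and not stuck in safe mode:
hdfs dfsadmin -report
hdfs dfsadmin -safemode get
# Verify YARN has live NodeManagers:
yarn node -list
# Then relaunch the Hive CLI with debug logging on the console:
hive --hiveconf hive.root.logger=DEBUG,console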


REPLIES



It's working, but there is still a lot of output on the console while I run a simple query:

java.io.FileNotFoundException: /etc/security/clientKeys/all.jks (No such file or directory)
	at java.io.FileInputStream.open(Native Method)
	at java.io.FileInputStream.<init>(FileInputStream.java:146)
	at org.apache.hadoop.security.ssl.ReloadingX509TrustManager.loadTrustManager(ReloadingX509TrustManager.java:164)
	at org.apache.hadoop.security.ssl.ReloadingX509TrustManager.<init>(ReloadingX509TrustManager.java:81)
	at org.apache.hadoop.security.ssl.FileBasedKeyStoresFactory.init(FileBasedKeyStoresFactory.java:209)
	at org.apache.hadoop.security.ssl.SSLFactory.init(SSLFactory.java:131)
	at org.apache.atlas.security.SecureClientUtils.newSslConnConfigurator(SecureClientUtils.java:151)
	at org.apache.atlas.security.SecureClientUtils.newConnConfigurator(SecureClientUtils.java:137)
	at org.apache.atlas.security.SecureClientUtils.getClientConnectionHandler(SecureClientUtils.java:70)
	at org.apache.atlas.AtlasClient.<init>(AtlasClient.java:103)
	at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.<init>(HiveMetaStoreBridge.java:81)
	at org.apache.atlas.hive.hook.HiveHook.fireAndForget(HiveHook.java:179)
	at org.apache.atlas.hive.hook.HiveHook.run(HiveHook.java:160)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1522)
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1195)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:213)
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:736)
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
16/06/16 11:43:41 [main]: INFO security.SecureClientUtils: Real User: root (auth:SIMPLE), is from ticket cache? false
16/06/16 11:43:41 [main]: INFO security.SecureClientUtils: doAsUser: root
16/06/16 11:43:42 [main]: INFO log.PerfLogger: </PERFLOG method=PostHook.org.apache.atlas.hive.hook.HiveHook start=1466077421528 end=1466077422039 duration=511 from=org.apache.hadoop.hive.ql.Driver>
16/06/16 11:43:42 [main]: INFO metadata.Hive: Dumping metastore api call timing information for : execution phase
16/06/16 11:43:42 [main]: DEBUG metadata.Hive: Total time spent in each metastore function (ms): {}
16/06/16 11:43:42 [main]: INFO log.PerfLogger: </PERFLOG method=Driver.execute start=1466077421390 end=1466077422040 duration=650 from=org.apache.hadoop.hive.ql.Driver>
OK
16/06/16 11:43:42 [main]: INFO ql.Driver: OK
16/06/16 11:43:42 [main]: INFO log.PerfLogger: <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
16/06/16 11:43:42 [main]: DEBUG lockmgr.DbLockManager: Unlocking lockid:44
16/06/16 11:43:42 [main]: DEBUG lockmgr.DbLockManager: Removed a lock true
16/06/16 11:43:42 [main]: INFO log.PerfLogger: </PERFLOG method=releaseLocks start=1466077422040 end=1466077422071 duration=31 from=org.apache.hadoop.hive.ql.Driver>
16/06/16 11:43:42 [main]: INFO log.PerfLogger: </PERFLOG method=Driver.run start=1466077420594 end=1466077422071 duration=1477 from=org.apache.hadoop.hive.ql.Driver>
16/06/16 11:43:42 [main]: DEBUG mapred.FileInputFormat: Time taken to get FileStatuses: 2
16/06/16 11:43:42 [main]: INFO mapred.FileInputFormat: Total input paths to process : 1
16/06/16 11:43:42 [main]: DEBUG mapred.FileInputFormat: Total # of splits generated by getSplits: 1, TimeTaken: 3
16/06/16 11:43:42 [main]: DEBUG exec.FetchOperator: Creating fetchTask with deserializer typeinfo: struct<col_name:string,data_type:string,comment:string>
16/06/16 11:43:42 [main]: DEBUG exec.FetchOperator: deserializer properties:
	table properties: {columns=col_name,data_type,comment, serialization.null.format= , serialization.lib=org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, serialization.format=9, columns.types=string:string:string}
	partition properties: {columns=col_name,data_type,comment, serialization.null.format= , serialization.lib=org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, serialization.format=9, columns.types=string:string:string}
code        	string
description 	string
total_emp   	int
salary      	int
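
For readers hitting the same trace: the FileNotFoundException above comes from the Atlas Hive hook (note the SecureClientUtils and FileBasedKeyStoresFactory frames) trying to load an SSL truststore at /etc/security/clientKeys/all.jks. It is not fatal here, since the query still returns OK, but if you want to chase it down, here is a quick sketch of checks, assuming the standard HDP client configuration directory /etc/hadoop/conf (adjust paths to your cluster):

# Does the truststore the hook is looking for actually exist?
ls -l /etc/security/clientKeys/all.jks
# If it exists, confirm it is a readable JKS store (you will be prompted for the store password):
keytool -list -keystore /etc/security/clientKeys/all.jks
# See which truststore the Hadoop SSL client configuration points at:
grep -A2 truststore /etc/hadoop/conf/ssl-client.xml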


After a restart it's working. Thanks a lot, jithu.

Explorer

Hi Ravi - can you tell me what you restarted? I see the same issue when I'm trying to launch Hive.

Cannot load customized ssl related configuration. Fallback to system-generic settings java.io.FileNotFoundException: /etc/security/clientKeys/all.jks (No such file or directory)


Launch the Hive shell in a new terminal.
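
For anyone wondering what to restart: relaunching the Hive shell in a new terminal, as suggested above, was enough in the earlier reply. If you prefer to restart the Hive service itself on an Ambari-managed HDP cluster, one way is the Ambari REST API; the host, credentials, and cluster name below are placeholders, not values from this thread:

# Stop the Hive service (AMBARI_HOST, admin:admin, and "mycluster" are examples only):
curl -u admin:admin -H "X-Requested-By: ambari" -X PUT \
  -d '{"RequestInfo":{"context":"Stop HIVE"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' \
  http://AMBARI_HOST:8080/api/v1/clusters/mycluster/services/HIVE
# Start it again:
curl -u admin:admin -H "X-Requested-By: ambari" -X PUT \
  -d '{"RequestInfo":{"context":"Start HIVE"},"Body":{"ServiceInfo":{"state":"STARTED"}}}' \
  http://AMBARI_HOST:8080/api/v1/clusters/mycluster/services/HIVE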

Explorer

I see the same issue - do you have any ideas on how to fix it?

Cannot load customized ssl related configuration. Fallback to system-generic settings java.io.FileNotFoundException: /etc/security/clientKeys/all.jks (No such file or directory)


Launch the Hive shell in a new terminal.
