
hplsql Not connecting to Hive Server


I am using HDP 2.5.3

hplsql location : /usr/hdp/current/hive-server2-hive2/bin/hplsql

Edited hplsql-site.xml:

<property>
  <name>hplsql.conn.hive2conn</name>
  <value>org.apache.hive.jdbc.HiveDriver;jdbc:hive2://xx.xx.xx.xx:10000</value>
  <description>HiveServer2 JDBC connection</description>
</property>
<property>
  <name>hplsql.conn.init.hive2conn</name>
  <value>
     set hive.execution.engine=tez;
     use default;
  </value>
  <description>Statements to execute after connecting to the database</description>
</property>
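
For reference, the same driver class and URL can be exercised outside hplsql with a small standalone JDBC program, which helps tell a driver/server version mismatch apart from an hplsql configuration problem. A minimal sketch (the class name Hive2ConnCheck and the SHOW DATABASES probe are just illustrative; it assumes the Hive JDBC driver jars from the same hive2 installation are on the classpath, and the xx.xx.xx.xx placeholder stands for the real HiveServer2 host):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class Hive2ConnCheck {
    public static void main(String[] args) throws Exception {
        // Same driver class and URL as in hplsql.conn.hive2conn above
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        String url = "jdbc:hive2://xx.xx.xx.xx:10000/default"; // placeholder host, as above
        try (Connection conn = DriverManager.getConnection(url, "", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SHOW DATABASES")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}

If a test like this fails with the same 'client_protocol' error, the driver jars and HiveServer2 are likely speaking different Thrift protocol versions; if it succeeds, the problem is more likely in the hplsql setup itself.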

ERROR

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.5.3.0-37/hive2/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.5.3.0-37/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Unhandled exception in HPL/SQL
java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://xxxx.xxxx.xx:10000: Could not establish connection to jdbc:hive2://xxxx.xxxx.xx:10000: Required field 'client_protocol' is unset! Struct:TOpenSessionReq(client_protocol:null, configuration:{use:database=default})
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:209)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at org.apache.hive.hplsql.Conn.openConnection(Conn.java:209)
at org.apache.hive.hplsql.Conn.getConnection(Conn.java:162)
at org.apache.hive.hplsql.Conn.executeQuery(Conn.java:58)
at org.apache.hive.hplsql.Exec.executeQuery(Exec.java:556)
at org.apache.hive.hplsql.Exec.executeQuery(Exec.java:565)
at org.apache.hive.hplsql.Select.select(Select.java:75)
at org.apache.hive.hplsql.Exec.visitSelect_stmt(Exec.java:1002)
at org.apache.hive.hplsql.Exec.visitSelect_stmt(Exec.java:52)
at org.apache.hive.hplsql.HplsqlParser$Select_stmtContext.accept(HplsqlParser.java:14768)
at org.antlr.v4.runtime.tree.AbstractParseTreeVisitor.visitChildren(AbstractParseTreeVisitor.java:70)
at org.apache.hive.hplsql.Exec.visitStmt(Exec.java:994)
at org.apache.hive.hplsql.Exec.visitStmt(Exec.java:52)
at org.apache.hive.hplsql.HplsqlParser$StmtContext.accept(HplsqlParser.java:1012)
at org.antlr.v4.runtime.tree.AbstractParseTreeVisitor.visitChildren(AbstractParseTreeVisitor.java:70)
at org.apache.hive.hplsql.HplsqlBaseVisitor.visitBlock(HplsqlBaseVisitor.java:28)
at org.apache.hive.hplsql.HplsqlParser$BlockContext.accept(HplsqlParser.java:446)
at org.antlr.v4.runtime.tree.AbstractParseTreeVisitor.visitChildren(AbstractParseTreeVisitor.java:70)
at org.apache.hive.hplsql.Exec.visitProgram(Exec.java:901)
at org.apache.hive.hplsql.Exec.visitProgram(Exec.java:52)
at org.apache.hive.hplsql.HplsqlParser$ProgramContext.accept(HplsqlParser.java:389)
at org.antlr.v4.runtime.tree.AbstractParseTreeVisitor.visit(AbstractParseTreeVisitor.java:42)
at org.apache.hive.hplsql.Exec.run(Exec.java:760)
at org.apache.hive.hplsql.Exec.run(Exec.java:736)
at org.apache.hive.hplsql.Hplsql.main(Hplsql.java:23)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:233)
at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
Caused by: java.sql.SQLException: Could not establish connection to jdbc:hive2://xxxx.xxxx.xx:10000: Required field 'client_protocol' is unset! Struct:TOpenSessionReq(client_protocol:null, configuration:{use:database=default})
at org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:587)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:186)
... 33 more
Caused by: org.apache.thrift.TApplicationException: Required field 'client_protocol' is unset! Struct:TOpenSessionReq(client_protocol:null, configuration:{use:database=default})
at org.apache.thrift.TApplicationException.read(TApplicationException.java:111)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:79)
at org.apache.hive.service.rpc.thrift.TCLIService$Client.recv_OpenSession(TCLIService.java:168)
at org.apache.hive.service.rpc.thrift.TCLIService$Client.OpenSession(TCLIService.java:155)
at org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:576)
... 34 more

Questions:

1/ This error suggests a mismatch in the hive-jdbc driver. How do I solve this issue?

2/ In HDP there are many folders and config files.

Why are there so many copies, which one should I change, and from where should I take HADOOP_HOME and HIVE_HOME?

For example, hive-site.xml appears in all of these locations (see also the sketch after the list):

/etc/hive/2.5.3.0-37/0/hive-site.xml 
/etc/hive/conf.backup/hive-site.xml 
/etc/hive-hcatalog/2.5.3.0-37/0/proto-hive-site.xml 
/etc/hive-hcatalog/conf.backup/proto-hive-site.xml 
/etc/hive2/2.5.3.0-37/0/hive-site.xml 
/etc/hive2/conf.backup/hive-site.xml 
/etc/spark/2.5.3.0-37/0/hive-site.xml 
/etc/spark2/2.5.3.0-37/0/hive-site.xml 
/etc/spark2/conf.backup/hive-site.xml 
/etc/zeppelin/2.5.3.0-37/0/hive-site.xml 
/etc/zeppelin/conf.backup/hive-site.xml 
/usr/hdp/2.5.3.0-37/etc/hive/conf.dist/hive-site.xml 
/usr/hdp/2.5.3.0-37/etc/hive-hcatalog/conf.dist/proto-hive-site.xml 
/usr/hdp/2.5.3.0-37/etc/hive2/conf.dist/hive-site.xml 
/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/configuration/hive-site.xml 
/var/lib/ambari-agent/cache/stacks/HDP/2.0.6.GlusterFS/services/HIVE/configuration/hive-site.xml 
/var/lib/ambari-agent/cache/stacks/HDP/2.1/services/HIVE/configuration/hive-site.xml 
/var/lib/ambari-agent/cache/stacks/HDP/2.1.GlusterFS/services/HIVE/configuration/hive-site.xml 
/var/lib/ambari-agent/cache/stacks/HDP/2.2/services/HIVE/configuration/hive-site.xml 
/var/lib/ambari-agent/cache/stacks/HDP/2.3/services/HIVE/configuration/hive-site.xml 
/var/lib/ambari-agent/cache/stacks/HDP/2.3.GlusterFS/services/HIVE/configuration/hive-site.xml 
/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HIVE/configuration/hive-site.xml 
/var/lib/ambari-server/resources/common-services/HIVE/0.12.0.2.0/configuration/hive-site.xml 
/var/lib/ambari-server/resources/stacks/HDP/2.0.6.GlusterFS/services/HIVE/configuration/hive-site.xml 
/var/lib/ambari-server/resources/stacks/HDP/2.1/services/HIVE/configuration/hive-site.xml 
/var/lib/ambari-server/resources/stacks/HDP/2.1.GlusterFS/services/HIVE/configuration/hive-site.xml 
/var/lib/ambari-server/resources/stacks/HDP/2.2/services/HIVE/configuration/hive-site.xml 
/var/lib/ambari-server/resources/stacks/HDP/2.3/services/HIVE/configuration/hive-site.xml 
/var/lib/ambari-server/resources/stacks/HDP/2.3.GlusterFS/services/HIVE/configuration/hive-site.xml 
/var/lib/ambari-server/resources/stacks/HDP/2.5/services/HIVE/configuration/hive-site.xml   
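
To see which of these copies a client JVM actually picks up, one option is to ask the classloader directly; a minimal sketch (the class name FindHiveSite is just illustrative, and it needs to be run with the same classpath as the Hive/HPL-SQL client, e.g. packaged in a jar and launched with hadoop jar):

import java.net.URL;
import java.util.Enumeration;

public class FindHiveSite {
    public static void main(String[] args) throws Exception {
        // List every hive-site.xml visible on this JVM's classpath;
        // the first entry is the one the client would normally load.
        Enumeration<URL> copies = Thread.currentThread()
                .getContextClassLoader()
                .getResources("hive-site.xml");
        while (copies.hasMoreElements()) {
            System.out.println(copies.nextElement());
        }
    }
}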

3/ There are more jars and config files under the /usr/hdp folder:

a) current

b) 2.5.

c) share

Which one holds the current properties (current)? Is there any HDP documentation I can refer to for further use?