HIVE2 JDBC connection: java.lang.NoSuchFieldError: org/apache/hadoop/util/PerformanceAdvisory.LOG
Labels: Apache Hive
Created ‎08-13-2018 06:33 PM
As part of our project, I'm trying to connect to Hive tables via JDBC.
Code Snippet:
Connection con = DriverManager.getConnection("jdbc:hive2://hpchd1-zk-1.abc.xxxx.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2", "hive", "");
JARS USED:
hive-jdbc-1.2.1000.2.6.5.0-292-standalone.jar
hadoop-common-2.7.3.jar
When I try to establish a connection, I get the following exception. I'm not sure what's causing it. Please shed some light on it.
Exception:
Caused by: java.lang.NoSuchFieldError: org/apache/hadoop/util/PerformanceAdvisory.LOG
	at org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.<init>(JniBasedUnixGroupsMappingWithFallback.java:41) ~[hadoop-common-2.7.3.jar:?]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:88) ~[?:1.8.0]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:57) ~[?:1.8.0]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:437) ~[?:2.6 (10-13-2016)]
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:132) ~[hive-jdbc-1.2.1000.2.6.5.0-292-standalone.jar:?]
	at org.apache.hadoop.security.Groups.<init>(Groups.java:100) ~[hive-jdbc-1.2.1000.2.6.5.0-292-standalone.jar:?]
	at org.apache.hadoop.security.Groups.<init>(Groups.java:95) ~[hive-jdbc-1.2.1000.2.6.5.0-292-standalone.jar:?]
	at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:420) ~[hive-jdbc-1.2.1000.2.6.5.0-292-standalone.jar:?]
	at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:324) ~[hive-jdbc-1.2.1000.2.6.5.0-292-standalone.jar:?]
	at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:291) ~[hive-jdbc-1.2.1000.2.6.5.0-292-standalone.jar:?]
	at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:846) ~[hive-jdbc-1.2.1000.2.6.5.0-292-standalone.jar:?]
	at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:816) ~[hive-jdbc-1.2.1000.2.6.5.0-292-standalone.jar:?]
	at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge.createClientWithConf(HadoopThriftAuthBridge.java:87) ~[hive-jdbc-1.2.1000.2.6.5.0-292-standalone.jar:?]
	at org.apache.hive.service.auth.KerberosSaslHelper.getKerberosTransport(KerberosSaslHelper.java:55) ~[hive-jdbc-1.2.1000.2.6.5.0-292-standalone.jar:?]
	at org.apache.hive.jdbc.HiveConnection.createBinaryTransport(HiveConnection.java:484) ~[hive-jdbc-1.2.1000.2.6.5.0-292-standalone.jar:?]
	at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:242) ~[hive-jdbc-1.2.1000.2.6.5.0-292-standalone.jar:?]
	at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:206) ~[hive-jdbc-1.2.1000.2.6.5.0-292-standalone.jar:?]
	at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105) ~[hive-jdbc-1.2.1000.2.6.5.0-292-standalone.jar:?]
	at java.sql.DriverManager.getConnection(DriverManager.java:675) ~[?:1.8.0]
	at java.sql.DriverManager.getConnection(DriverManager.java:258) ~[?:1.8.0]
	at com.pega.pegarules.data.internal.store.rdbms.JdbcConnectionManagerImpl.getJDBCUrlConnectionWithStatus(JdbcConnectionManagerImpl.java:2082) ~[prprivate.jar:?]
	at com.pega.pegarules.data.internal.store.rdbms.JdbcConnectionManagerImpl.getJDBCUrlConnection(JdbcConnectionManagerImpl.java:2031) ~[prprivate.jar:?]
	at com.pega.pegarules.data.internal.access.DatabaseImpl.testConnection(DatabaseImpl.java:3806) ~[prprivate.jar:?]
	at com.pega.pegarules.data.internal.access.DatabaseImpl.testConnection(DatabaseImpl.java:3777) ~[prprivate.jar:?]
	at com.pegarules.generated.activity.ra_action_testdbconnection_547349b48da96b477106433930f17c4d.step3_circum0(ra_action_testdbconnection_547349b48da96b477106433930f17c4d.java:329) ~[?:?]
	at com.pegarules.generated.activity.ra_action_testdbconnection_547349b48da96b477106433930f17c4d.perform(ra_action_testdbconnection_547349b48da96b477106433930f17c4d.java:113) ~[?:?]
	at com.pega.pegarules.session.internal.mgmt.Executable.doActivity(Executable.java:3597) ~[prprivate.jar:?]
	at com.pega.pegarules.session.internal.mgmt.base.ThreadRunner.runActivitiesAlt(ThreadRunner.java:646) ~[prprivate.jar:?]
	... 51 more
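For reference, here is a fuller, self-contained sketch of the client I am running. The host name, namespace, and credentials are the ones from the snippet above and the `SHOW TABLES` query is just an example; adjust all of them for your cluster.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

// Minimal HiveServer2 JDBC client using ZooKeeper service discovery.
public class HiveZkClient {
    public static final String URL =
        "jdbc:hive2://hpchd1-zk-1.abc.xxxx.com:2181/"
        + ";serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2";

    public static void main(String[] args) {
        try {
            // Explicitly load the driver; newer drivers are auto-discovered,
            // but older standalone jars sometimes still need this call.
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection con = DriverManager.getConnection(URL, "hive", "");
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery("SHOW TABLES")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        } catch (ClassNotFoundException e) {
            System.err.println("Hive JDBC driver not on classpath: " + e);
        } catch (SQLException e) {
            System.err.println("Connection failed: " + e.getMessage());
        }
    }
}
```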
Created ‎08-16-2018 07:36 AM
When using hive-jdbc-standalone*.jar, apart from hadoop-common*.jar, the following dependent jars are also required:
libthrift-0.9.0.jar
httpclient-4.2.5.jar
httpcore-4.2.5.jar
commons-logging-1.1.3.jar
hive-common.jar
slf4j-api-1.7.5.jar
hive-metastore.jar
hive-service.jar
hadoop-common.jar
hive-jdbc.jar
guava-11.0.2.jar
Please add these jars to the classpath on the client and try again.
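To quickly confirm those jars are actually visible to the client JVM, you can probe one representative class from each. This is a rough sketch; the class names below are the usual ones shipped in these libraries, so verify them against the exact jar versions you deploy.

```java
// Probe the classpath for one well-known class per dependency jar.
public class ClasspathCheck {
    static final String[] PROBES = {
        "org.apache.hive.jdbc.HiveDriver",        // hive-jdbc
        "org.apache.thrift.transport.TTransport", // libthrift
        "org.apache.http.client.HttpClient",      // httpclient
        "org.apache.commons.logging.Log",         // commons-logging
        "org.slf4j.Logger",                       // slf4j-api
        "com.google.common.base.Preconditions",   // guava
        "org.apache.hadoop.conf.Configuration"    // hadoop-common
    };

    public static boolean isPresent(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        for (String probe : PROBES) {
            System.out.println((isPresent(probe) ? "FOUND   " : "MISSING ") + probe);
        }
    }
}
```

Any line printed as MISSING points at a jar that still needs to be added to the client classpath.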
Created ‎08-17-2018 11:46 AM
Thanks Sindhu. The issue was caused by the libhadoop.so directory missing from java.library.path. After adding it, that exception is gone.
try {
    org.apache.hadoop.conf.Configuration conf = new org.apache.hadoop.conf.Configuration();
    conf.set("hadoop.security.authentication", "Kerberos");
    org.apache.hadoop.security.UserGroupInformation.setConfiguration(conf);
    org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab("hive/dspprxd@ABC.XXX.COM", "/tmp/dspprxd.keytab");
    // load the driver
    Class.forName("org.apache.hive.jdbc.HiveDriver");
} catch (Exception e) {
    e.printStackTrace();
}
try {
    Connection con = DriverManager.getConnection("jdbc:hive2://hpchdd2-zk-1.hpc.xxx.com:2181,hpchdd2-zk-2.hpc.xxx.com:2181,hpchdd2-zk-3.hpc.xxx.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;principal=dspprxd@ABC.XXX.COM");
} catch (SQLException e) {
    e.printStackTrace();
}
Currently I'm getting the following exception, and I'm not sure what's causing it:
java.sql.SQLException: Could not create secure connection to jdbc:hive2://hpchdd2x.hpc.xxx.com:10000/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;principal=hive/dspprxd@ABC.XXX.COM?transportMode=http;httpPath=cliservice: Failed to open client transport
at org.apache.hive.jdbc.HiveConnection.createBinaryTransport(HiveConnection.java:529)
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:242)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:206)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
at java.sql.DriverManager.getConnection(DriverManager.java:675)
at java.sql.DriverManager.getConnection(DriverManager.java:281)
at HiveJDBCKerberosClient.main(HiveJDBCKerberosClient.java:39)
Caused by: javax.security.sasl.SaslException: Failed to open client transport [Caused by java.io.IOException: Could not instantiate SASL transport]
at org.apache.hive.service.auth.KerberosSaslHelper.getKerberosTransport(KerberosSaslHelper.java:60)
at org.apache.hive.jdbc.HiveConnection.createBinaryTransport(HiveConnection.java:484)
... 6 more
Caused by: java.io.IOException: Could not instantiate SASL transport
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Client.createClientTransport(HadoopThriftAuthBridge.java:223)
at org.apache.hive.service.auth.KerberosSaslHelper.getKerberosTransport(KerberosSaslHelper.java:56)
... 7 more
Caused by: javax.security.sasl.SaslException: Failure to initialize security context [Caused by org.ietf.jgss.GSSException, major code: 13, minor code: 0
major string: Invalid credentials
minor string: SubjectCredFinder: no JAAS Subject]
at com.ibm.security.sasl.gsskerb.GssKrb5Client.<init>(GssKrb5Client.java:161)
at com.ibm.security.sasl.gsskerb.FactoryImpl.createSaslClient(FactoryImpl.java:79)
at javax.security.sasl.Sasl.createSaslClient(Sasl.java:400)
at org.apache.hive.org.apache.thrift.transport.TSaslClientTransport.<init>(TSaslClientTransport.java:72)
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Client.createClientTransport(HadoopThriftAuthBridge.java:216)
... 8 more
Caused by: org.ietf.jgss.GSSException, major code: 13, minor code: 0
major string: Invalid credentials
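For anyone else hitting the original NoSuchFieldError: here is a small sketch I use to confirm the libhadoop.so directory is actually on java.library.path (the fix that worked for me). It only inspects the JVM property, so it makes no assumptions beyond the standard library name on Linux.

```java
import java.io.File;

// Check whether any java.library.path entry contains the native Hadoop library.
public class NativeLibCheck {
    public static boolean hasLibHadoop(String libraryPath) {
        for (String dir : libraryPath.split(File.pathSeparator)) {
            if (new File(dir, "libhadoop.so").isFile()) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        String path = System.getProperty("java.library.path", "");
        System.out.println("java.library.path = " + path);
        System.out.println("libhadoop.so found: " + hasLibHadoop(path));
    }
}
```

If it prints false, start the JVM with -Djava.library.path pointing at the directory that holds libhadoop.so.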
Created ‎08-19-2018 01:48 PM
Hi,
Please check the HiveServer2 port. If you want to connect to HS2 in HTTP mode, the default port is 10001, not 10000. Make sure you are connecting to an HS2 instance that is actually running in HTTP mode.
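For clarity, the binary-mode and HTTP-mode connection URLs differ in both the port and the session parameters. A sketch of the two shapes; the host and principal below are placeholders based on this thread and must match your cluster:

```java
// Build the two common HiveServer2 JDBC URL shapes for a Kerberized cluster.
public class Hs2Urls {
    // Binary (Thrift) transport: default port 10000.
    public static String binaryUrl(String host, String principal) {
        return "jdbc:hive2://" + host + ":10000/default;principal=" + principal;
    }

    // HTTP transport: default port 10001, with transportMode and httpPath
    // given as semicolon-separated session parameters.
    public static String httpUrl(String host, String principal) {
        return "jdbc:hive2://" + host + ":10001/default;principal=" + principal
             + ";transportMode=http;httpPath=cliservice";
    }

    public static void main(String[] args) {
        System.out.println(binaryUrl("hpchdd2x.hpc.xxx.com", "hive/_HOST@ABC.XXX.COM"));
        System.out.println(httpUrl("hpchdd2x.hpc.xxx.com", "hive/_HOST@ABC.XXX.COM"));
    }
}
```

Note that in your stack trace the URL mixes the two (port 10000 with `transportMode=http` appended after a `?`), which would also fail against an HTTP-mode server.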
