Support Questions

Hive Interpreters not working after HDP upgrade to 2.6.3


New Contributor

I recently performed a rolling upgrade of HDP from 2.6.0 to 2.6.3. After the upgrade, the Zeppelin Hive interpreter is not working, even though I have made all the recommended changes. The error is below:

org.apache.zeppelin.interpreter.InterpreterException: Error in doAs
    at org.apache.zeppelin.jdbc.JDBCInterpreter.getConnection(JDBCInterpreter.java:426)
    at org.apache.zeppelin.jdbc.JDBCInterpreter.executeSql(JDBCInterpreter.java:644)
    at org.apache.zeppelin.jdbc.JDBCInterpreter.interpret(JDBCInterpreter.java:763)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:101)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:502)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
    at org.apache.zeppelin.scheduler.ParallelScheduler$JobRunner.run(ParallelScheduler.java:162)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.reflect.UndeclaredThrowableException
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1713)
    at org.apache.zeppelin.jdbc.JDBCInterpreter.getConnection(JDBCInterpreter.java:418)
    ... 13 more
Caused by: java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://<hostname>:10000/default;principal=hive/<hostname>@<realm>: GSS initiate failed
    at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:231)
    at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)
    at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:208)
    at org.apache.commons.dbcp2.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:79)
    at org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:205)
    at org.apache.commons.pool2.impl.GenericObjectPool.create(GenericObjectPool.java:861)
    at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:435)
    at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:363)
    at org.apache.commons.dbcp2.PoolingDriver.connect(PoolingDriver.java:129)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:270)
    at org.apache.zeppelin.jdbc.JDBCInterpreter.getConnectionFromPool(JDBCInterpreter.java:373)
    at org.apache.zeppelin.jdbc.JDBCInterpreter.access$000(JDBCInterpreter.java:91)
    at org.apache.zeppelin.jdbc.JDBCInterpreter$1.run(JDBCInterpreter.java:421)
    at org.apache.zeppelin.jdbc.JDBCInterpreter$1.run(JDBCInterpreter.java:418)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
    ... 14 more
Caused by: org.apache.thrift.transport.TTransportException: GSS initiate failed
    at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
    at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:316)
    at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
    at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204)
    ... 33 more

Note: I have already added the parameter below as suggested, but it is still not working.

hive.proxy.user.property = hive.server2.proxy.user
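For reference, the same kerberized connection can be exercised outside Zeppelin with a small standalone program, which helps tell a Zeppelin-side problem from a general Kerberos/HiveServer2 one. This is only a minimal sketch, assuming the interpreter's keytab and principal, and the placeholder host and realm from the error above:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class HiveKerberosConnectTest {
    public static void main(String[] args) throws Exception {
        // Tell the Hadoop security layer to use Kerberos before logging in
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Same keytab and principal the Zeppelin JDBC interpreter is configured with
        // (placeholder values, substitute the real ones)
        UserGroupInformation.loginUserFromKeytab(
                "zeppelin@EXAMPLE.COM",
                "/etc/security/keytabs/zeppelin.server.kerberos.keytab");

        // Same kerberized URL as the interpreter's hive.url property
        String url = "jdbc:hive2://<hostname>:10000/default;"
                + "principal=hive/<hostname>@<realm>";

        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection con = DriverManager.getConnection(url);
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("show databases")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}

If this small test fails with the same "GSS initiate failed" message, the problem lies with the keytab, the principal, or the HiveServer2 principal in the URL rather than with Zeppelin itself.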
4 REPLIES

Re: Hive Interpreters not working after HDP upgrade to 2.6.3

Mentor

@Sarfaraj Ahmad


Can you check your JDBC URL for Hive? The error is: SQLException: Could not open client transport with JDBC Uri:

jdbc:hive2://<hostname>:10000/default;principal=hive/<hostname>@<realm>: 

Look for the Hive interpreter property hive.url.

Looks like your environment is kerberized.

Please revert


Re: Hive Interpreters not working after HDP upgrade to 2.6.3

New Contributor

Hi Geoffery, the cluster is kerberized and the URL works fine when used from beeline and from third-party tools like DbVisualizer and RStudio via JDBC connectors. This was working fine before the upgrade to the latest HDP release, 2.6.3.

Re: Hive Interpreters not working after HDP upgrade to 2.6.3

@Sarfaraj Ahmad,

Please make sure these 2 params are set for the interpreter and try running the query again:

zeppelin.jdbc.keytab.location=/etc/security/keytabs/zeppelin.server.kerberos.keytab
zeppelin.jdbc.principal=zeppelin@EXAMPLE.COM
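In case it helps to isolate a bad path or principal, a quick sanity check is to try the same keytab login outside Zeppelin. A minimal sketch, with the two values above as placeholders:

import java.io.File;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class ZeppelinKeytabLoginCheck {
    public static void main(String[] args) throws Exception {
        // Values copied from the two interpreter settings above (placeholders)
        String principal = "zeppelin@EXAMPLE.COM";
        String keytab = "/etc/security/keytabs/zeppelin.server.kerberos.keytab";

        // The keytab must exist and be readable by the OS user that runs the
        // Zeppelin interpreter process, otherwise the login below will fail
        File f = new File(keytab);
        System.out.println("keytab exists=" + f.exists() + ", readable=" + f.canRead());

        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // If this throws, the GSS error in Zeppelin is very likely the same problem
        UserGroupInformation.loginUserFromKeytab(principal, keytab);
        System.out.println("Logged in as: " + UserGroupInformation.getLoginUser());
    }
}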

Thanks,

Aditya

Re: Hive Interpreters not working after HDP upgrade to 2.6.3

New Contributor

Hi Aditya,

Here are my interpreter settings. I do not really think that I have missed any settings recommended by Zeppelin/Hortonworks.

The issue is specific to Zeppelin version 0.7.3; the same settings work fine with Zeppelin version 0.7.0.

common.max_count =
default.completer.schemaFilters =
default.driver = org.apache.hive.jdbc.HiveDriver
default.password =
default.precode =
default.splitQueries = false
hive.url = jdbc:hive2://<hiveserver2 hostname>:10000/default;principal=hive/<hiveserver2 hostname>@<realm>
default.user = zeppelin
hive.proxy.user.property = hive.server2.proxy.user
zeppelin.interpreter.localRepo = /usr/hdp/current/zeppelin-server/local-repo/2CZHJ3YHC
zeppelin.interpreter.output.limit =
zeppelin.jdbc.auth.type = KERBEROS
zeppelin.jdbc.concurrent.max_connection =
zeppelin.jdbc.concurrent.use = true
zeppelin.jdbc.keytab.location = /etc/security/keytabs/zeppelin.server.kerberos.keytab
zeppelin.jdbc.principal = zeppelin-<clustername>@<realm>

Dependencies (artifact / exclude):
org.apache.hive:hive-jdbc:1.2.1
org.apache.hadoop:hadoop-common:2.7.3
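On hive.proxy.user.property: my understanding (not verified against the 0.7.3 source) is that the interpreter uses it to pass the logged-in notebook user to HiveServer2 as the hive.server2.proxy.user session variable, so the connection authenticates as the zeppelin principal but the query runs as the notebook user. Roughly, the URL the interpreter ends up building would look like the one printed by this sketch (the user name is hypothetical):

public class ProxyUserUrlExample {
    public static void main(String[] args) {
        // Hypothetical notebook login; in Zeppelin this would be the user who
        // runs the paragraph, passed along because hive.proxy.user.property
        // is set to hive.server2.proxy.user
        String notebookUser = "someuser";

        // HiveServer2 then executes the query as notebookUser while the
        // connection itself authenticates as the zeppelin Kerberos principal
        String url = "jdbc:hive2://<hiveserver2 hostname>:10000/default;"
                + "principal=hive/<hiveserver2 hostname>@<realm>;"
                + "hive.server2.proxy.user=" + notebookUser;

        System.out.println(url);
    }
}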

Regards

Sarfaraj
