Support Questions
Problem with Zeppelin using Spark and Livy

New Contributor

Hi,

I have a problem with Zeppelin using the Livy interpreter with Spark2 2.3.2 and Hive 3.1.0.

The cluster runs Ambari 2.7.3 and HDP 3.1.0, with Kerberos enabled.

Normal Spark without Livy works fine with LLAP. No problem there at all.

But when I try to execute a job like this

%livy2.spark 
val hive = com.hortonworks.spark.sql.hive.llap.HiveWarehouseBuilder.session(spark).build 
hive.execute("select count (*) from <table_name>").show

I get the following error

java.lang.RuntimeException: java.sql.SQLException: Cannot create PoolableConnectionFactory (Could not open client transport for any of the Server URI's in ZooKeeper: Could not establish connection to jdbc:hive2://<edge_hostname>:10001/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;auth=delegationToken: HTTP Response code: 401)
  at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.execute(HiveWarehouseSessionImpl.java:70)
  ... 50 elided
Caused by: java.sql.SQLException: Cannot create PoolableConnectionFactory (Could not open client transport for any of the Server URI's in ZooKeeper: Could not establish connection to jdbc:hive2://<edge_hostname>:10001/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;auth=delegationToken: HTTP Response code: 401)
  at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:2291)
  at org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:2038)
  at org.apache.commons.dbcp2.BasicDataSource.getLogWriter(BasicDataSource.java:1588)
  at org.apache.commons.dbcp2.BasicDataSourceFactory.createDataSource(BasicDataSourceFactory.java:588)
  at com.hortonworks.spark.sql.hive.llap.JDBCWrapper.getConnector(HS2JDBCWrapper.scala:333)
  at com.hortonworks.spark.sql.hive.llap.JDBCWrapper.getConnector(HS2JDBCWrapper.scala:340)
  at com.hortonworks.spark.sql.hive.llap.DefaultJDBCWrapper.getConnector(HS2JDBCWrapper.scala)
  at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.lambda$new$0(HiveWarehouseSessionImpl.java:48)
  at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.execute(HiveWarehouseSessionImpl.java:66)
  ... 50 more
Caused by: java.sql.SQLException: Could not open client transport for any of the Server URI's in ZooKeeper: Could not establish connection to jdbc:hive2://<edge_hostname>:10001/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;auth=delegationToken: HTTP Response code: 401
  at shadehive.org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:333)
  at shadehive.org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
  at org.apache.commons.dbcp2.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:39)
  at org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:256)
  at org.apache.commons.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:2301)
  at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:2287)
  ... 58 more
Caused by: java.sql.SQLException: Could not establish connection to jdbc:hive2://<edge_hostname>:10001/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;auth=delegationToken: HTTP Response code: 401
  at shadehive.org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:815)
  at shadehive.org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:305)
  ... 63 more
Caused by: org.apache.thrift.transport.TTransportException: HTTP Response code: 401
  at org.apache.thrift.transport.THttpClient.flushUsingHttpClient(THttpClient.java:262)
  at org.apache.thrift.transport.THttpClient.flush(THttpClient.java:313)
  at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:73)
  at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:62)
  at shadehive.org.apache.hive.service.rpc.thrift.TCLIService$Client.send_OpenSession(TCLIService.java:170)
  at shadehive.org.apache.hive.service.rpc.thrift.TCLIService$Client.OpenSession(TCLIService.java:162)
  at shadehive.org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:796)
  ... 64 more

These are the settings for the Livy interpreter

livy.spark.hadoop.hive.llap.daemon.service.hosts	@llap0
livy.spark.hadoop.hive.zookeeper.quorum			<zk_host1>:2181;<zk_host2>:2181;<zk_host3>:2181
livy.spark.security.credentials.hiveserver2.enabled	true
livy.spark.sql.hive.hiveserver2.jdbc.url		jdbc:hive2://<zk_host1>:2181,<zk_host2>:2181,<zk_host3>:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2
livy.spark.sql.hive.hiveserver2.jdbc.url.principal	hive/_<kerberos_principal>

I have checked all the configuration settings many times, and they match the documentation.

Can someone help me?

Thanks

4 REPLIES

Re: Problem with Zeppelin using Spark and Livy

Mentor

@Marco Caron

The problem you are encountering is a misconfiguration of the JDBC connection. An HTTP 401 response code indicates that the request sent by the client could not be authenticated. Have a look at this HCC document on HS2 connect strings.
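As a quick check outside Zeppelin, you can test whether the HS2 endpoint accepts your Kerberos credentials directly with beeline. This is only a sketch: the hostnames, user, and realm are placeholders, and the ZooKeeper discovery URL is taken from your interpreter settings.

```shell
# Obtain a Kerberos ticket for the user that runs the Zeppelin/Livy session
kinit <user>@<REALM>

# Connect with the same ZooKeeper discovery URL the Livy interpreter uses.
# If this also fails with a 401, the problem is Kerberos/SPNEGO authentication
# against HiveServer2 itself, not Zeppelin or Livy.
beeline -u "jdbc:hive2://<zk_host1>:2181,<zk_host2>:2181,<zk_host3>:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2" -e "select 1"
```

If beeline succeeds but the HWC call from Livy still fails, that points at the delegation-token handoff between Livy and HiveServer2 rather than at HS2 itself.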

Caused by: java.sql.SQLException: Could not establish connection to jdbc:hive2://<edge_hostname>:10001/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;auth=delegationToken: HTTP Response code: 401

Under Advanced hive-interactive-site, the HiveServer2 Interactive port is typically 10500. Can you check the values on your cluster?
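Also worth checking: on HDP 3.x, the Hive Warehouse Connector with LLAP usually has to point at HiveServer2 Interactive, whose ZooKeeper namespace is typically hiveserver2-interactive rather than hiveserver2. A sketch of what that interpreter setting could look like (placeholders kept from your post; verify the actual namespace value under Advanced hive-interactive-site on your cluster):

```
livy.spark.sql.hive.hiveserver2.jdbc.url	jdbc:hive2://<zk_host1>:2181,<zk_host2>:2181,<zk_host3>:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2-interactive
```

With the plain hiveserver2 namespace, discovery can resolve to the regular (non-LLAP) HiveServer2 instance, which would match the port 10001 endpoint appearing in your stack trace.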

HTH

Re: Problem with Zeppelin using Spark and Livy

New Contributor

Hi,

Under Advanced hive-interactive-site, the HiveServer2 port is set to 10500.

Re: Problem with Zeppelin using Spark and Livy

Mentor

@Marco Caron

Any updates on this issue? Do you still need help resolving the problem?
