Support Questions

Spark Submit Job Failing in cluster mode while loading data to hive


Hi,

We have an HDP 3.1.5 cluster where a user is running Spark jobs that ingest data from a 3rd-party tool and load it into Hive. We are using HS2, and below is the spark-submit command.

*********************************************

/usr/hdp/current/spark2-client/bin/spark-submit \
  --master yarn \
  --queue udif \
  --driver-memory ${driver_memory} \
  --num-executors ${num_executors} \
  --executor-memory ${executor_memory} \
  --executor-cores 4 \
  --conf spark.port.maxRetries=50 \
  --conf spark.network.timeout=600s \
  --conf spark.executor.heartbeatInterval=200s \
  --class com.reliance.cpds.refinitiv.ingestion.DataIngestionController \
  --conf spark.security.credentials.hiveserver2.enabled=true \
  --conf spark.sql.hive.hiveserver2.jdbc.url="hive-jdbc string" \
  --jars custom-jars

 

ERROR:

Caused by: java.sql.SQLException: Could not open client transport for any of the Server URI's in ZooKeeper: Could not establish connection to jdbc:hive2://hive-server2-host:10001/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;auth=delegationToken: HTTP Response code: 401
        at shadehive.org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:344)
        at shadehive.org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
        at org.apache.commons.dbcp2.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:53)
        at org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:291)
        at org.apache.commons.pool2.impl.GenericObjectPool.create(GenericObjectPool.java:883)
        at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:436)
        at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:365)
        at org.apache.commons.dbcp2.PoolingDataSource.getConnection(PoolingDataSource.java:134)
        at org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:1563)
        at com.hortonworks.spark.sql.hive.llap.JDBCWrapper.getConnector(HS2JDBCWrapper.scala:424)
        at com.hortonworks.spark.sql.hive.llap.JDBCWrapper.getConnector(HS2JDBCWrapper.scala:453)
        at com.hortonworks.spark.sql.hive.llap.DefaultJDBCWrapper.getConnector(HS2JDBCWrapper.scala)
        at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.lambda$new$0(HiveWarehouseSessionImpl.java:85)
        at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.executeUpdate(HiveWarehouseSessionImpl.java:205)

*********************************************
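For context, the JDBC URL in the error shows auth=delegationToken and the failure is an HTTP 401 (unauthorized), which points at the Kerberos credential path rather than networking. Below is a hedged sketch of the extra spark-submit options that are commonly needed when the Hive Warehouse Connector talks to a Kerberized HiveServer2 from YARN cluster mode; the principal, keytab path, and realm shown here are placeholders and not values from this cluster.

```shell
# Sketch only: placeholder principal/keytab/realm values, not from the post.
/usr/hdp/current/spark2-client/bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --principal user@EXAMPLE.REALM \
  --keytab /path/to/user.keytab \
  # Required for HWC delegation tokens in cluster mode (already set above):
  --conf spark.security.credentials.hiveserver2.enabled=true \
  --conf spark.sql.hive.hiveserver2.jdbc.url="hive-jdbc string" \
  # HS2 Kerberos principal used when fetching the delegation token:
  --conf spark.sql.hive.hiveserver2.jdbc.url.principal=hive/_HOST@EXAMPLE.REALM \
  --class com.reliance.cpds.refinitiv.ingestion.DataIngestionController \
  --jars custom-jars
```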

 

This issue occurs intermittently.

@gulshad_ansari 

Thanks
ASIF.
