Support Questions
Find answers, ask questions, and share your expertise

Connection to the Hive metastore from a Spark Job (on a Kerberos environment)



New Contributor


I need to create and write into Hive tables from a Spark job. I instantiate a HiveContext and configure it with the following code:

val sparkConf = new SparkConf(true)
implicit val sc = new SparkContext(sparkConf)
implicit val sqlContext = new HiveContext(sc)
// Allow dynamic partitioning when writing to partitioned tables
sqlContext.setConf("hive.exec.dynamic.partition", "true")
sqlContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")
// Point the client at the Hive metastore Thrift service
sqlContext.setConf("hive.metastore.uris", "thrift://xxxxxxx:9083")
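On a Kerberized cluster, the metastore client also needs the SASL-related settings; without them it falls back to the plain `set_ugi()` handshake seen in the warning below. As a sketch (the principal is a placeholder; in practice these values are usually picked up from /etc/spark/conf/hive-site.xml rather than set programmatically, since the metastore client may be initialized before `setConf` takes effect):

```scala
// Sketch only: Kerberos-related metastore settings.
// "hive/_HOST@EXAMPLE.COM" is a placeholder for your cluster's
// actual Hive service principal.
sqlContext.setConf("hive.metastore.sasl.enabled", "true")
sqlContext.setConf("hive.metastore.kerberos.principal", "hive/_HOST@EXAMPLE.COM")
```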

The program stops at the statement that creates the table.

The code works on a non-Kerberized environment, but since we moved to a new, Kerberized environment it no longer works.

Here is the error stack trace:

17/12/06 16:29:21 ERROR Driver: FAILED: IllegalStateException Unxpected Exception thrown: Unable to fetch table xxxx. null
java.lang.IllegalStateException: Unxpected Exception thrown: Unable to fetch table xxxx. null
	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeCreateTable(
	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(
	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(
	at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(
	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(
	at org.apache.hadoop.hive.ql.Driver.compile(
	at org.apache.hadoop.hive.ql.Driver.compile(
	at org.apache.hadoop.hive.ql.Driver.compileInternal(
	at org.apache.hadoop.hive.ql.Driver.runInternal(

We have also noticed this warn message:

17/12/06 16:29:20 WARN metastore: set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
	at org.apache.thrift.transport.TTransport.readAll(
	at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(
	at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(
	at org.apache.thrift.TServiceClient.receiveBase(
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(
	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(
	at java.lang.reflect.Constructor.newInstance(
	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(
	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(
	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(
	at org.apache.hadoop.hive.ql.metadata.Hive.getTable(

Can anyone help me, please?


Re: Connection to the Hive metastore from a Spark Job (on a Kerberos environment)

Expert Contributor
@Joffrey C

17/12/06 16:29:20 WARN metastore: set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.

This mostly happens when Spark is using the wrong hive-site.xml. Note that /etc/spark/conf has its own hive-site.xml, which is not the same file as /etc/hive/conf/hive-site.xml. If, during an upgrade, /etc/spark/conf/hive-site.xml was replaced with /etc/hive/conf/hive-site.xml, this kind of issue can occur.
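A quick way to check whether the two copies diverge, and what the Spark copy says about metastore security (a sketch, assuming the standard config paths mentioned above):

```shell
# Compare the Spark and Hive copies of hive-site.xml
diff /etc/spark/conf/hive-site.xml /etc/hive/conf/hive-site.xml

# In particular, inspect the metastore connection and security settings
# in the copy that Spark actually reads
grep -A1 -E "hive.metastore.(uris|sasl.enabled|kerberos.principal)" \
    /etc/spark/conf/hive-site.xml
```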


Re: Connection to the Hive metastore from a Spark Job (on a Kerberos environment)

New Contributor

Thank you.

Unfortunately it's still not working :( . Do you have any idea why, or another solution?
