Using Cloudera Livy Client in Kerberized HDP 2.5

Expert Contributor

I found this library on Cloudera's GitHub repository: https://github.com/cloudera/livy

I also found this Java example here: https://github.com/cloudera/livy#using-the-programmatic-api
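
For context, the jar in question follows the README's PiJob example: the programmatic API works by submitting implementations of Livy's Job interface, which run inside the Spark context that Livy manages. Simplified, the job looks roughly like this (details may differ from the repo version):

import java.util.ArrayList;
import java.util.List;

import com.cloudera.livy.Job;
import com.cloudera.livy.JobContext;

public class PiJob implements Job<Double> {
    private final int samples;

    public PiJob(int samples) {
        this.samples = samples;
    }

    // Runs remotely, inside the Spark context managed by Livy.
    @Override
    public Double call(JobContext jc) throws Exception {
        List<Integer> sampleList = new ArrayList<>();
        for (int i = 0; i < samples; i++) {
            sampleList.add(i);
        }
        // Monte Carlo estimate of pi: count random points inside the unit circle.
        long inside = jc.sc().parallelize(sampleList, 10)
                .filter(i -> {
                    double x = Math.random();
                    double y = Math.random();
                    return x * x + y * y <= 1;
                })
                .count();
        return 4.0 * inside / samples;
    }
}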

That example works perfectly on an unsecured HDP cluster. On the Kerberized HDP cluster, building the client still works (so authentication seems fine?), but the upload call does not:

import java.io.File;
import java.net.URI;

import com.cloudera.livy.LivyClient;
import com.cloudera.livy.LivyClientBuilder;

// This line seems to work!
LivyClient client = new LivyClientBuilder().setURI(new URI(livyUrl)).build();

// This call does not work and throws the exception below:
System.err.printf("Uploading %s to the Spark context...\n", piJar);
client.uploadJar(new File(piJar)).get();
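
For reference, the complete flow from the README that I am trying to reproduce looks roughly like this (PiJob is the example job from the linked repo, the sample count is arbitrary, and error handling is omitted):

try {
    System.err.printf("Uploading %s to the Spark context...\n", piJar);
    client.uploadJar(new File(piJar)).get();

    // Submit the job and block until the result comes back.
    double pi = client.submit(new PiJob(10000)).get();
    System.out.println("Pi is roughly " + pi);
} finally {
    client.stop(true); // true also shuts down the remote Spark context
}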

When I run this against the Kerberized cluster, I get the following exception:

Uploading C:/livy/pijob.jar to the Spark context...
java.util.concurrent.ExecutionException: java.io.IOException: Internal Server Error: "java.util.concurrent.ExecutionException: java.io.IOException: RSCClient instance stopped."
	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
	at cloudera.Main.main(Main.java:32)
Caused by: java.io.IOException: Internal Server Error: "java.util.concurrent.ExecutionException: java.io.IOException: RSCClient instance stopped."
	at cloudera.LivyConnection.sendRequest(LivyConnection.java:236)
	at cloudera.LivyConnection.post(LivyConnection.java:199)
	at cloudera.HttpClient$2.call(HttpClient.java:153)
	at cloudera.HttpClient$2.call(HttpClient.java:1)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

And my log files say the following (the full log is very large; this is only a snippet):

17/04/24 14:14:57 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
17/04/24 14:14:57 INFO ObjectStore: Initialized ObjectStore
17/04/24 14:14:57 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
17/04/24 14:14:57 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
17/04/24 14:14:57 INFO HiveMetaStore: Added admin role in metastore
17/04/24 14:14:57 INFO HiveMetaStore: Added public role in metastore
17/04/24 14:14:57 INFO HiveMetaStore: No user is added in admin role, since config is empty
17/04/24 14:14:57 INFO HiveMetaStore: 0: get_all_databases
17/04/24 14:14:57 INFO audit: ugi=livy     ip=unknown-ip-addr      cmd=get_all_databases
17/04/24 14:14:57 INFO HiveMetaStore: 0: get_functions: db=default pat=*
17/04/24 14:14:57 INFO audit: ugi=livy     ip=unknown-ip-addr      cmd=get_functions: db=default pat=*
17/04/24 14:14:57 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
17/04/24 14:14:57 INFO SessionState: Created local directory: /tmp/c000b152-4dfe-4716-9b64-ff7bcf0558f4_resources
17/04/24 14:14:57 INFO SessionState: Created HDFS directory: /tmp/hive/livy/c000b152-4dfe-4716-9b64-ff7bcf0558f4
17/04/24 14:14:57 INFO SessionState: Created local directory: /tmp/livy/c000b152-4dfe-4716-9b64-ff7bcf0558f4
17/04/24 14:14:57 INFO SessionState: Created HDFS directory: /tmp/hive/livy/c000b152-4dfe-4716-9b64-ff7bcf0558f4/_tmp_space.db
17/04/24 14:14:57 INFO HiveContext: default warehouse location is /user/hive/warehouse
17/04/24 14:14:57 INFO HiveContext: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
17/04/24 14:14:57 INFO ClientWrapper: Inspected Hadoop version: 2.7.3.2.5.0.0-1245
17/04/24 14:14:57 INFO ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.7.3.2.5.0.0-1245
17/04/24 14:14:58 INFO metastore: Trying to connect to metastore with URI thrift://had-job.my-server-url.de:9083
17/04/24 14:14:58 ERROR TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
	at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
	at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
	at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
	at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:420)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
	at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
	at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
	at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
	at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:204)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:249)
	at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:345)
	at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:255)
	at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:459)
	at org.apache.spark.sql.hive.HiveContext.defaultOverrides(HiveContext.scala:233)
	at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:236)
	at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
	at com.cloudera.livy.repl.SparkInterpreter$$anonfun$start$1.apply(SparkInterpreter.scala:95)
	at com.cloudera.livy.repl.SparkInterpreter$$anonfun$start$1.apply(SparkInterpreter.scala:82)
	at com.cloudera.livy.repl.SparkInterpreter.restoreContextClassLoader(SparkInterpreter.scala:305)
	at com.cloudera.livy.repl.SparkInterpreter.start(SparkInterpreter.scala:82)
	at com.cloudera.livy.repl.Session$$anonfun$1.apply(Session.scala:59)
	at com.cloudera.livy.repl.Session$$anonfun$1.apply(Session.scala:57)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
	at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
	at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
	at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
	at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
	at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
	at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
	at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
	... 49 more
17/04/24 14:14:58 WARN metastore: Failed to connect to the MetaStore Server...
17/04/24 14:14:58 INFO metastore: Waiting 5 seconds before next connection attempt.
17/04/24 14:15:03 INFO metastore: Trying to connect to metastore with URI thrift://had-job.my-server-url.de:9083
17/04/24 14:15:03 ERROR TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
	at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
	at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
	at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
	at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:420)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

...


17/04/24 14:15:28 ERROR SessionServlet$: internal error
java.util.concurrent.ExecutionException: java.io.IOException: RSCClient instance stopped.
        at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
        at com.cloudera.livy.rsc.JobHandleImpl.get(JobHandleImpl.java:60)
        at com.cloudera.livy.server.interactive.InteractiveSession.addJar(InteractiveSession.scala:230)
        at com.cloudera.livy.server.interactive.InteractiveSession.addJar(InteractiveSession.scala:220)
        at com.cloudera.livy.server.interactive.InteractiveSessionServlet$$anonfun$17$$anonfun$apply$12.apply(InteractiveSessionServlet.scala:178)
        at com.cloudera.livy.server.interactive.InteractiveSessionServlet$$anonfun$17$$anonfun$apply$12.apply(InteractiveSessionServlet.scala:175)
        at com.cloudera.livy.server.SessionServlet.doWithSession(SessionServlet.scala:198)
        at com.cloudera.livy.server.SessionServlet.withSession(SessionServlet.scala:191)
        at com.cloudera.livy.server.interactive.InteractiveSessionServlet$$anonfun$17.apply(InteractiveSessionServlet.scala:175)
        at org.scalatra.ScalatraBase$class.org$scalatra$ScalatraBase$$liftAction(ScalatraBase.scala:270)
        at org.scalatra.ScalatraBase$$anonfun$invoke$1.apply(ScalatraBase.scala:265)
        at org.scalatra.ScalatraBase$$anonfun$invoke$1.apply(ScalatraBase.scala:265)
        at org.scalatra.ApiFormats$class.withRouteMultiParams(ApiFormats.scala:178)
        at com.cloudera.livy.server.JsonServlet.withRouteMultiParams(JsonServlet.scala:39)
        at org.scalatra.ScalatraBase$class.invoke(ScalatraBase.scala:264)
        at org.scalatra.ScalatraServlet.invoke(ScalatraServlet.scala:49)
        at org.scalatra.ScalatraBase$$anonfun$runRoutes$1$$anonfun$apply$8.apply(ScalatraBase.scala:240)
        at org.scalatra.ScalatraBase$$anonfun$runRoutes$1$$anonfun$apply$8.apply(ScalatraBase.scala:238)
        at scala.Option.flatMap(Option.scala:170)
        at org.scalatra.ScalatraBase$$anonfun$runRoutes$1.apply(ScalatraBase.scala:238)
        at org.scalatra.ScalatraBase$$anonfun$runRoutes$1.apply(ScalatraBase.scala:237)
        at scala.collection.immutable.Stream.flatMap(Stream.scala:446)
        at org.scalatra.ScalatraBase$class.runRoutes(ScalatraBase.scala:237)
        at org.scalatra.ScalatraServlet.runRoutes(ScalatraServlet.scala:49)
        at org.scalatra.ScalatraBase$class.runActions$1(ScalatraBase.scala:163)
        at org.scalatra.ScalatraBase$$anonfun$executeRoutes$1.apply$mcV$sp(ScalatraBase.scala:175)
        at org.scalatra.ScalatraBase$$anonfun$executeRoutes$1.apply(ScalatraBase.scala:175)
        at org.scalatra.ScalatraBase$$anonfun$executeRoutes$1.apply(ScalatraBase.scala:175)
        at org.scalatra.ScalatraBase$class.org$scalatra$ScalatraBase$$cradleHalt(ScalatraBase.scala:193)
        at org.scalatra.ScalatraBase$class.executeRoutes(ScalatraBase.scala:175)
        at org.scalatra.ScalatraServlet.executeRoutes(ScalatraServlet.scala:49)
        at org.scalatra.ScalatraBase$$anonfun$handle$1.apply$mcV$sp(ScalatraBase.scala:113)
        at org.scalatra.ScalatraBase$$anonfun$handle$1.apply(ScalatraBase.scala:113)
        at org.scalatra.ScalatraBase$$anonfun$handle$1.apply(ScalatraBase.scala:113)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
        at org.scalatra.DynamicScope$class.withResponse(DynamicScope.scala:80)
        at org.scalatra.ScalatraServlet.withResponse(ScalatraServlet.scala:49)
        at org.scalatra.DynamicScope$$anonfun$withRequestResponse$1.apply(DynamicScope.scala:60)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
        at org.scalatra.DynamicScope$class.withRequest(DynamicScope.scala:71)
        at org.scalatra.ScalatraServlet.withRequest(ScalatraServlet.scala:49)
        at org.scalatra.DynamicScope$class.withRequestResponse(DynamicScope.scala:59)
        at org.scalatra.ScalatraServlet.withRequestResponse(ScalatraServlet.scala:49)
        at org.scalatra.ScalatraBase$class.handle(ScalatraBase.scala:111)
        at org.scalatra.ScalatraServlet.org$scalatra$servlet$ServletBase$$super$handle(ScalatraServlet.scala:49)
        at org.scalatra.servlet.ServletBase$class.handle(ServletBase.scala:43)
        at com.cloudera.livy.server.SessionServlet.org$scalatra$MethodOverride$$super$handle(SessionServlet.scala:40)
        at org.scalatra.MethodOverride$class.handle(MethodOverride.scala:28)
        at com.cloudera.livy.server.interactive.InteractiveSessionServlet.org$scalatra$servlet$FileUploadSupport$$super$handle(InteractiveSessionServlet.scala:40)
        at org.scalatra.servlet.FileUploadSupport$class.handle(FileUploadSupport.scala:93)
        at com.cloudera.livy.server.interactive.InteractiveSessionServlet.handle(InteractiveSessionServlet.scala:40)
        at org.scalatra.ScalatraServlet.service(ScalatraServlet.scala:54)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:812)
        at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1669)
        at com.cloudera.livy.server.CsrfFilter.doFilter(CsrfFilter.scala:44)
        at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:614)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:573)
        at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1652)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:585)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:110)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
        at org.eclipse.jetty.server.Server.handle(Server.java:499)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:311)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:257)
        at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:544)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: RSCClient instance stopped.
        at com.cloudera.livy.rsc.RSCClient.stop(RSCClient.java:232)
        at com.cloudera.livy.rsc.RSCClient$2$1.onSuccess(RSCClient.java:120)
        at com.cloudera.livy.rsc.RSCClient$2$1.onSuccess(RSCClient.java:115)
        at com.cloudera.livy.rsc.Utils$2.operationComplete(Utils.java:108)
        at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:680)
        at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:567)
        at io.netty.util.concurrent.DefaultPromise.trySuccess(DefaultPromise.java:406)
        at io.netty.channel.DefaultChannelPromise.trySuccess(DefaultChannelPromise.java:82)
        at io.netty.channel.AbstractChannel$CloseFuture.setClosed(AbstractChannel.java:870)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.close(AbstractChannel.java:550)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.closeOnRead(AbstractNioByteChannel.java:71)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:157)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
        ... 1 more

I set the following Kerberos-related system properties on the client:

System.setProperty("java.security.auth.login.config", "C:/kerberos/jaasA.conf");
System.setProperty("java.security.krb5.conf", "C:/kerberos/krb5.conf");
System.setProperty("sun.security.krb5.debug", "true");
// This is needed to get Kerberos credentials from the environment, instead of
// requiring the application to manually obtain the credentials.
System.setProperty("javax.security.auth.useSubjectCredsOnly", "false");

Can someone give an example of how to use this Livy client against a secured HDP 2.5 cluster? Thank you!

1 REPLY

Rising Star

I ran into this same issue a few weeks ago using Zeppelin to run Livy. Make sure that you copy the Hive hive-site.xml into the spark/conf directory on every node in the cluster; more often than not, this resolves the inability to connect to the Hive Metastore. Please let me know the status after you try this, or if you have already done that, so we can continue troubleshooting.
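
On HDP that usually means something like the following on each node (these are the usual HDP default config paths; yours may differ):

cp /etc/hive/conf/hive-site.xml /etc/spark/conf/hive-site.xml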