Support Questions


Got GSSException using Spark Igniter in a secure cluster on CDH5.3.0

New Contributor

Hi,

I'm using CDH 5.3.0, and I enabled the Spark app in Hue by changing Hue's app blacklist. I also built the Spark Job Server and started it. Uploading a test jar file to Spark through Hue works fine, but when I try to run 'spark.jobserver.WordCountExample' I get an exception saying it failed to find a Kerberos TGT.
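For anyone wondering what "changing the blacklist of Hue" refers to: the Spark app is hidden via Hue's `app_blacklist` setting in hue.ini (or the equivalent safety valve in Cloudera Manager). A minimal sketch of the change, assuming Spark was the only app being removed from the blacklist:

```ini
# hue.ini -- the [desktop] section controls which apps Hue hides.
[desktop]
# Comma-separated list of apps to hide from the UI.
# Leaving "spark" out of this list (or leaving it empty) enables the Spark app.
app_blacklist=
```

After changing this, Hue needs to be restarted for the app to appear.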

I've enabled Kerberos security for my cluster, and the ticket renewer service is working fine. All other functions in Hue are working, except for Spark Igniter.

What kind of configuration change should I make in order to make Spark Igniter work in a secure cluster environment?

I would appreciate it if anyone could share any thoughts on this. Thanks a lot!


Here is the error log:

{ "status": "ERROR",
"result": { "errorClass": "java.lang.RuntimeException",
"cause": "Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: \"XXX.com/192.168.10.66\"; destination host is: \"XXX.com\":8032; ",
"stack": ["org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)",
"org.apache.hadoop.ipc.Client.call(Client.java:1415)",
"org.apache.hadoop.ipc.Client.call(Client.java:1364)",
"org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)",
"com.sun.proxy.$Proxy8.getClusterMetrics(Unknown Source)",
"org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getClusterMetrics(ApplicationClientProtocolPBClientImpl.java:178)",
"sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)",
"sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)",
"sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)",
"java.lang.reflect.Method.invoke(Method.java:483)",
"org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)",
"org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)",
"com.sun.proxy.$Proxy9.getClusterMetrics(Unknown Source)",
"org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getYarnClusterMetrics(YarnClientImpl.java:427)",
"org.apache.spark.deploy.yarn.Client$$anonfun$submitApplication$1.apply(Client.scala:69)",
"org.apache.spark.deploy.yarn.Client$$anonfun$submitApplication$1.apply(Client.scala:69)",
"org.apache.spark.Logging$class.logInfo(Logging.scala:59)",
"org.apache.spark.deploy.yarn.Client.logInfo(Client.scala:35)",
"org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:68)",
"org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)",
"org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:140)",
"org.apache.spark.SparkContext.<init>(SparkContext.scala:335)",
"spark.jobserver.util.DefaultSparkContextFactory.makeContext(SparkContextFactory.scala:34)",
"spark.jobserver.JobManagerActor.createContextFromConfig(JobManagerActor.scala:251)",
"spark.jobserver.JobManagerActor$$anonfun$wrappedReceive$1.applyOrElse(JobManagerActor.scala:103)",
"scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)",
"scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)",
"scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)",
"ooyala.common.akka.ActorStack$$anonfun$receive$1.applyOrElse(ActorStack.scala:33)",
"scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)",
"scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)",
"scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)",
"ooyala.common.akka.Slf4jLogging$$anonfun$receive$1$$anonfun$applyOrElse$1.apply$mcV$sp(Slf4jLogging.scala:26)",
"ooyala.common.akka.Slf4jLogging$class.ooyala$common$akka$Slf4jLogging$$withAkkaSourceLogging(Slf4jLogging.scala:35)",
"ooyala.common.akka.Slf4jLogging$$anonfun$receive$1.applyOrElse(Slf4jLogging.scala:25)",
"scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)",
"scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)",
"scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)",
"ooyala.common.akka.ActorMetrics$$anonfun$receive$1.applyOrElse(ActorMetrics.scala:24)",
"akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)",
"akka.actor.ActorCell.invoke(ActorCell.scala:456)",
"akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)",
"akka.dispatch.Mailbox.run(Mailbox.scala:219)",
"akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:385)",
"scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)",
"scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)",
"scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)",
"scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)"], "causingClass": "java.io.IOException",
"message": "java.io.IOException: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: \"XXX.com/192.168.10.66\"; destination host is: \"XXX.com\":8032; " } } (error 500)
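A note on what the log above is saying: the "Failed to find any Kerberos tgt" GSSException is raised when the JVM making the RPC call to the ResourceManager (port 8032) has no valid ticket in its credential cache. Here that JVM is the Spark Job Server process itself, not Hue, so Hue's working Kerberos setup does not help it. A quick way to check, run as the OS user the Job Server runs under (the principal and keytab path below are hypothetical placeholders, not taken from this thread):

```shell
# Show the current credential cache; "No credentials cache found"
# or an expired ticket means the process has no usable TGT.
klist

# Obtain a TGT from a keytab before starting the service
# (principal and keytab path are example values).
kinit -kt /path/to/user.keytab user@EXAMPLE.COM

# Confirm the ticket was acquired and note its expiry time.
klist
```

Whether the Job Server can actually use such a ticket depends on the app itself supporting Kerberos, which is the subject of the accepted answer below.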

1 ACCEPTED SOLUTION

Super Guru

There is no support for Kerberos in this app.

However, a better app is coming soon, and security is targeted for mid-year.

Romain


2 REPLIES


New Contributor

Thank you so much for your quick explanation. Good to know something awesome is coming soon. 🙂