
With CDH-6.3.2 Getting: org.apache.hadoop.security.KerberosAuthException: failure to login: for principal... Unable to obtain password from user


Testing CDH 6.3.2 with Analytic Server and Kerberos enabled, the following exception is now encountered (previous releases of CDH do not exhibit this behavior):

 

20/03/18 12:39:05 ERROR spark.SparkContext: Error initializing SparkContext.
org.apache.hadoop.security.KerberosAuthException: failure to login: for principal: admin@EXAMPLE.COM javax.security.auth.login.LoginException: Unable to obtain password from user

at org.apache.hadoop.security.UserGroupInformation.doSubjectLogin(UserGroupInformation.java:1992)
at org.apache.hadoop.security.UserGroupInformation.getUGIFromTicketCache(UserGroupInformation.java:649)
at org.apache.spark.deploy.security.HadoopDelegationTokenManager.doLogin(HadoopDelegationTokenManager.scala:276)
at org.apache.spark.deploy.security.HadoopDelegationTokenManager.obtainDelegationTokens(HadoopDelegationTokenManager.scala:140)
at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend$$anonfun$start$1.apply(CoarseGrainedSchedulerBackend.scala:407)
at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend$$anonfun$start$1.apply(CoarseGrainedSchedulerBackend.scala:401)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend.start(CoarseGrainedSchedulerBackend.scala:401)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:46)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:186)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:511)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:72)
at com.spss.ae.spark.multiclient.jobs.RunSparkSide$1.apply(RunSparkSide.java:449)
at com.spss.ae.spark.multiclient.jobs.RunSparkSide$1.apply(RunSparkSide.java:444)
at com.spss.utilities.classloading.ClassLoadingUtils.executeWithThreadClassLoader(ClassLoadingUtils.java:311)
at com.spss.ae.spark.multiclient.jobs.RunSparkSide.createSparkContext(RunSparkSide.java:444)
at com.spss.ae.spark.multiclient.jobs.RunSparkSide.getSparkContext(RunSparkSide.java:415)
at com.spss.ae.spark.multiclient.jobs.RunSparkSide.run(RunSparkSide.java:182)
at com.spss.ae.spark.multiclient.sparkdriver.SparkJob$1$1.apply(SparkJob.java:44)
at com.spss.ae.spark.multiclient.sparkdriver.SparkJob$1$1.apply(SparkJob.java:40)
at com.spss.utilities.thread.ThreadScope.doWith(ThreadScope.java:46)
at com.spss.ae.security.UserScope.doWith(UserScope.java:46)
at com.spss.ae.spark.multiclient.sparkdriver.SparkJob$1.apply(SparkJob.java:40)
at com.spss.ae.spark.multiclient.sparkdriver.SparkJob$1.apply(SparkJob.java:37)
at com.spss.utilities.thread.ThreadScope.doWith(ThreadScope.java:46)
at com.spss.ae.spark.multiclient.sparkdriver.JobScope.doWith(JobScope.java:45)
at com.spss.ae.spark.multiclient.sparkdriver.SparkJob.start(SparkJob.java:37)
at com.spss.ae.spark.multiclient.sparkdriver.SparkRunnerClient$3.call(SparkRunnerClient.java:375)
at com.spss.ae.spark.multiclient.sparkdriver.SparkRunnerClient$3.call(SparkRunnerClient.java:372)
at com.spss.ae.hdfs.auth.impl.HdfsAuth$7$1.run(HdfsAuth.java:623)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
at com.spss.ae.hdfs.auth.impl.HdfsAuth$7.run(HdfsAuth.java:620)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
at com.spss.ae.hdfs.auth.impl.HdfsAuth.nonImpersonatedAuth(HdfsAuth.java:604)
at com.spss.ae.hdfs.auth.impl.HdfsAuth.nonImpersonatedAuth(HdfsAuth.java:246)
at com.spss.ae.spark.multiclient.sparkdriver.SparkRunnerClient.runSparkJob(SparkRunnerClient.java:370)
at com.spss.ae.spark.multiclient.messages.SubmitJobMessage.lambda$process$0(SubmitJobMessage.java:27)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: javax.security.auth.login.LoginException: Unable to obtain password from user

at jdk.security.auth/com.sun.security.auth.module.Krb5LoginModule.promptForPass(Krb5LoginModule.java:875)
at jdk.security.auth/com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:738)
at jdk.security.auth/com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:592)
at java.base/javax.security.auth.login.LoginContext.invoke(LoginContext.java:726)
at java.base/javax.security.auth.login.LoginContext$4.run(LoginContext.java:665)
at java.base/javax.security.auth.login.LoginContext$4.run(LoginContext.java:663)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at java.base/javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:663)
at java.base/javax.security.auth.login.LoginContext.login(LoginContext.java:574)
at org.apache.hadoop.security.UserGroupInformation$HadoopLoginContext.login(UserGroupInformation.java:2070)
at org.apache.hadoop.security.UserGroupInformation.doSubjectLogin(UserGroupInformation.java:1982)
... 45 more
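For context, "Unable to obtain password from user" is what Krb5LoginModule throws when it is configured not to prompt and finds no usable credentials in the ticket cache. Hadoop's UserGroupInformation builds its login configuration programmatically rather than from a jaas.conf file, but a ticket-cache login is roughly equivalent to the JAAS entry below (the entry name is made up for illustration; if the options differ in your exact Hadoop version, check the HadoopConfiguration inner class of UserGroupInformation):

```
/* Illustrative JAAS entry approximating Hadoop's ticket-cache login
   configuration. Entry name is hypothetical. */
TicketCacheLogin {
  com.sun.security.auth.module.Krb5LoginModule required
    useTicketCache=true
    renewTGT=true
    doNotPrompt=true;
};
```

With doNotPrompt=true and no valid ticket in the cache, the module cannot fall back to asking for a password, which produces exactly the LoginException seen in the trace above.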

On the surface, this may seem like an Analytic Server coding issue; however, the following details support the theory that it is not:

 

Just prior to this exception, the Analytic Server code successfully executed these lines:

 

UserGroupInformation.loginUserFromSubject(subject);
log.debug("HdfsAuth.nonImpersonatedAuth: UGI login successfull");

 

As evidenced by the logging of the debug line:

20/03/18 12:39:04 DEBUG impl.HdfsAuth: HdfsAuth.nonImpersonatedAuth: UGI login successfull

 

Note that the UserGroupInformation.loginUserFromSubject code looks like this:

@InterfaceAudience.Public
@InterfaceStability.Evolving
public static void loginUserFromSubject(Subject subject) throws IOException {
  setLoginUser(createLoginUser(subject));
}

 

And that createLoginUser does this:

private static UserGroupInformation createLoginUser(Subject subject) throws IOException {
  UserGroupInformation realUser = doSubjectLogin(subject, null);
  UserGroupInformation loginUser = null;

 

What I'd like to point out here is that both paths eventually resolve to doSubjectLogin. The one in the Analytic Server invocation path succeeded; the one from HadoopDelegationTokenManager.doLogin (HadoopDelegationTokenManager.scala:276) did not. It seems HadoopDelegationTokenManager.doLogin assumes useTicketCache is true?
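If doLogin is indeed going through the ticket cache (the stack trace shows it calling getUGIFromTicketCache), it is worth verifying that the Analytic Server process even has a cache file to read. A minimal sketch follows; the /tmp/krb5cc_&lt;uid&gt; default is a Linux convention and an assumption here, and "1000" is a placeholder uid:

```java
import java.nio.file.Files;
import java.nio.file.Paths;

public class TicketCacheCheck {
    // Resolve the ticket cache path Krb5LoginModule would typically try:
    // KRB5CCNAME if set (optionally prefixed with "FILE:"), else the
    // conventional Linux default /tmp/krb5cc_<uid>.
    static String expectedCache(String krb5ccname, String uid) {
        if (krb5ccname != null && !krb5ccname.isEmpty()) {
            return krb5ccname.startsWith("FILE:") ? krb5ccname.substring(5) : krb5ccname;
        }
        return "/tmp/krb5cc_" + uid;
    }

    public static void main(String[] args) {
        // Substitute the service account's actual uid for "1000".
        String path = expectedCache(System.getenv("KRB5CCNAME"), "1000");
        System.out.println("expected ticket cache: " + path
                + " (exists: " + Files.exists(Paths.get(path)) + ")");
    }
}
```

Running this under the same user and environment as the Analytic Server process shows whether the ticket-cache login path has anything to work with.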

 

Additionally, the call path from org.apache.spark.deploy.security.HadoopDelegationTokenManager.obtainDelegationTokens to doLogin wasn't added until Spark 3, but we are using Spark 2.4.0, as installed with CDH 6.3.2. Is there a possible jar packaging problem?
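One way to test the jar-packaging theory is to inspect, on the same classpath the Analytic Server uses, whether HadoopDelegationTokenManager is present, whether it declares obtainDelegationTokens, and which jar it was loaded from. A hedged sketch via reflection (the class and method names are taken from the stack trace; nothing else Spark-specific is assumed):

```java
import java.lang.reflect.Method;

public class ClasspathProbe {
    // Report whether a class is on the classpath and whether it declares
    // a method with the given name (any signature).
    static String probe(String className, String methodName) {
        try {
            Class<?> c = Class.forName(className);
            for (Method m : c.getDeclaredMethods()) {
                if (m.getName().equals(methodName)) {
                    return "present";
                }
            }
            return "class found, method missing";
        } catch (ClassNotFoundException e) {
            return "class not on classpath";
        }
    }

    public static void main(String[] args) {
        String cls = "org.apache.spark.deploy.security.HadoopDelegationTokenManager";
        System.out.println(probe(cls, "obtainDelegationTokens"));
        try {
            // If found, print where it was loaded from -- useful for spotting
            // a stray newer Spark jar shadowing the CDH 2.4.0 one.
            Class<?> c = Class.forName(cls);
            System.out.println(c.getProtectionDomain().getCodeSource().getLocation());
        } catch (ClassNotFoundException ignored) {
            // Nothing to report if the class is absent.
        }
    }
}
```

Run with the exact classpath of the failing process (e.g. copied from its launch command) to see which jar actually supplies this class.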
