Support Questions


DEBUG security.UserGroupInformation: PrivilegedActionException as:root (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]

Explorer

Hey All,

I'm trying to run spark-shell for the first time on a CM / CDH 6.3 installation, but I'm getting the errors below instead.

19/08/31 11:05:24 DEBUG ipc.Client: The ping interval is 60000 ms.
19/08/31 11:05:24 DEBUG ipc.Client: Connecting to cm-r01nn02.mws.mds.xyz/192.168.0.133:8032
19/08/31 11:05:24 DEBUG security.UserGroupInformation: PrivilegedAction as:root (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:795)
19/08/31 11:05:24 DEBUG security.SaslRpcClient: Sending sasl message state: NEGOTIATE

19/08/31 11:05:24 DEBUG security.SaslRpcClient: Get token info proto:interface org.apache.hadoop.yarn.api.ApplicationClientProtocolPB info:org.apache.hadoop.yarn.security.client.ClientRMSecurityInfo$2@238c63df
19/08/31 11:05:24 DEBUG client.RMDelegationTokenSelector: Looking for a token with service 192.168.0.133:8032
19/08/31 11:05:24 DEBUG security.SaslRpcClient: tokens aren't supported for this protocol or user doesn't have one
19/08/31 11:05:24 DEBUG security.SaslRpcClient: client isn't using kerberos
19/08/31 11:05:24 DEBUG security.UserGroupInformation: PrivilegedActionException as:root (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/31 11:05:24 DEBUG security.UserGroupInformation: PrivilegedAction as:root (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:719)
19/08/31 11:05:24 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/31 11:05:24 DEBUG security.UserGroupInformation: PrivilegedActionException as:root (auth:SIMPLE) cause:java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/31 11:05:24 DEBUG ipc.Client: closing ipc connection to cm-r01nn02.mws.mds.xyz/192.168.0.133:8032: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
        at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:756)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:719)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:812)
        at org.apache.hadoop.ipc.Client$Connection.access$3600(Client.java:410)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1560)
        at org.apache.hadoop.ipc.Client.call(Client.java:1391)
        at org.apache.hadoop.ipc.Client.call(Client.java:1355)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
        at com.sun.proxy.$Proxy16.getClusterMetrics(Unknown Source)
        at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getClusterMetrics(ApplicationClientProtocolPBClientImpl.java:251)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
        at com.sun.proxy.$Proxy17.getClusterMetrics(Unknown Source)
        at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getYarnClusterMetrics(YarnClientImpl.java:604)
        at org.apache.spark.deploy.yarn.Client$$anonfun$submitApplication$1.apply(Client.scala:169)
        at org.apache.spark.deploy.yarn.Client$$anonfun$submitApplication$1.apply(Client.scala:169)
        at org.apache.spark.internal.Logging$class.logInfo(Logging.scala:57)
        at org.apache.spark.deploy.yarn.Client.logInfo(Client.scala:62)
        at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:168)
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:60)
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:186)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:511)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2549)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:944)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:935)
        at org.apache.spark.repl.Main$.createSparkSession(Main.scala:106)
        at $line3.$read$$iw$$iw.<init>(<console>:15)
        at $line3.$read$$iw.<init>(<console>:43)
        at $line3.$read.<init>(<console>:45)
        at $line3.$read$.<init>(<console>:49)
        at $line3.$read$.<clinit>(<console>)
        at $line3.$eval$.$print$lzycompute(<console>:7)
        at $line3.$eval$.$print(<console>:6)
        at $line3.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:793)
        at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1054)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:645)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:644)
        at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
        at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:644)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:576)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:572)
        at scala.tools.nsc.interpreter.IMain$$anonfun$quietRun$1.apply(IMain.scala:231)
        at scala.tools.nsc.interpreter.IMain$$anonfun$quietRun$1.apply(IMain.scala:231)
        at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:221)
        at scala.tools.nsc.interpreter.IMain.quietRun(IMain.scala:231)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:109)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:109)
        at scala.collection.immutable.List.foreach(List.scala:392)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:109)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:109)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:109)
        at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:91)
        at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:108)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply$mcV$sp(SparkILoop.scala:211)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply(SparkILoop.scala:199)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply(SparkILoop.scala:199)
        at scala.tools.nsc.interpreter.ILoop$$anonfun$mumly$1.apply(ILoop.scala:189)
        at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:221)
        at scala.tools.nsc.interpreter.ILoop.mumly(ILoop.scala:186)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1(SparkILoop.scala:199)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:267)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:247)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.withSuppressedSettings$1(SparkILoop.scala:235)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.startup$1(SparkILoop.scala:247)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:282)
        at org.apache.spark.repl.SparkILoop.runClosure(SparkILoop.scala:159)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:182)
        at org.apache.spark.repl.Main$.doMain(Main.scala:78)
        at org.apache.spark.repl.Main$.main(Main.scala:58)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:851)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:926)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:935)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
        at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:173)
        at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:390)
        at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:614)
        at org.apache.hadoop.ipc.Client$Connection.access$2300(Client.java:410)
        at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:799)
        at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:795)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:795)
        ... 95 more
19/08/31 11:05:24 DEBUG ipc.Client: IPC Client (483582792) connection to cm-r01nn02.mws.mds.xyz/192.168.0.133:8032 from root: closed
19/08/31 11:05:24 INFO retry.RetryInvocationHandler: java.io.IOException: Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host is: "cm-r01en01.mws.mds.xyz/192.168.0.140"; destination host is: "cm-r01nn02.mws.mds.xyz":8032; , while invoking ApplicationClientProtocolPBClientImpl.getClusterMetrics over null after 6 failover attempts. Trying to failover after sleeping for 19516ms.
19/08/31 11:05:24 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:25 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:26 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:27 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:28 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:29 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:30 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:31 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:32 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:33 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:34 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:35 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:36 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:37 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:38 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:39 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:40 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:41 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:42 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:43 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:43 DEBUG ipc.Client: The ping interval is 60000 ms.
19/08/31 11:05:43 DEBUG ipc.Client: Connecting to cm-r01nn02.mws.mds.xyz/192.168.0.133:8032
19/08/31 11:05:43 DEBUG security.UserGroupInformation: PrivilegedAction as:root (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:795)
19/08/31 11:05:43 DEBUG security.SaslRpcClient: Sending sasl message state: NEGOTIATE

19/08/31 11:05:43 DEBUG security.SaslRpcClient: Get token info proto:interface org.apache.hadoop.yarn.api.ApplicationClientProtocolPB info:org.apache.hadoop.yarn.security.client.ClientRMSecurityInfo$2@321558f8
19/08/31 11:05:43 DEBUG client.RMDelegationTokenSelector: Looking for a token with service 192.168.0.133:8032
19/08/31 11:05:43 DEBUG security.SaslRpcClient: tokens aren't supported for this protocol or user doesn't have one
19/08/31 11:05:43 DEBUG security.SaslRpcClient: client isn't using kerberos
19/08/31 11:05:43 DEBUG security.UserGroupInformation: PrivilegedActionException as:root (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/31 11:05:43 DEBUG security.UserGroupInformation: PrivilegedAction as:root (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:719)
19/08/31 11:05:43 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/31 11:05:43 DEBUG security.UserGroupInformation: PrivilegedActionException as:root (auth:SIMPLE) cause:java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/31 11:05:43 DEBUG ipc.Client: closing ipc connection to cm-r01nn02.mws.mds.xyz/192.168.0.133:8032: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
        at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:756)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:719)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:812)
        at org.apache.hadoop.ipc.Client$Connection.access$3600(Client.java:410)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1560)
        at org.apache.hadoop.ipc.Client.call(Client.java:1391)
        at org.apache.hadoop.ipc.Client.call(Client.java:1355)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
        at com.sun.proxy.$Proxy16.getClusterMetrics(Unknown Source)
        at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getClusterMetrics(ApplicationClientProtocolPBClientImpl.java:251)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
        at com.sun.proxy.$Proxy17.getClusterMetrics(Unknown Source)
        at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getYarnClusterMetrics(YarnClientImpl.java:604)
        at org.apache.spark.deploy.yarn.Client$$anonfun$submitApplication$1.apply(Client.scala:169)
        at org.apache.spark.deploy.yarn.Client$$anonfun$submitApplication$1.apply(Client.scala:169)
        at org.apache.spark.internal.Logging$class.logInfo(Logging.scala:57)
        at org.apache.spark.deploy.yarn.Client.logInfo(Client.scala:62)
        at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:168)
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:60)
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:186)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:511)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2549)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:944)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:935)
        at org.apache.spark.repl.Main$.createSparkSession(Main.scala:106)
        at $line3.$read$$iw$$iw.<init>(<console>:15)
        at $line3.$read$$iw.<init>(<console>:43)
        at $line3.$read.<init>(<console>:45)
        at $line3.$read$.<init>(<console>:49)
        at $line3.$read$.<clinit>(<console>)
        at $line3.$eval$.$print$lzycompute(<console>:7)
        at $line3.$eval$.$print(<console>:6)
        at $line3.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:793)
        at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1054)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:645)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:644)
        at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
        at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:644)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:576)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:572)
        at scala.tools.nsc.interpreter.IMain$$anonfun$quietRun$1.apply(IMain.scala:231)
        at scala.tools.nsc.interpreter.IMain$$anonfun$quietRun$1.apply(IMain.scala:231)
        at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:221)
        at scala.tools.nsc.interpreter.IMain.quietRun(IMain.scala:231)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:109)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:109)
        at scala.collection.immutable.List.foreach(List.scala:392)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:109)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:109)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:109)
        at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:91)
        at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:108)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply$mcV$sp(SparkILoop.scala:211)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply(SparkILoop.scala:199)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply(SparkILoop.scala:199)
        at scala.tools.nsc.interpreter.ILoop$$anonfun$mumly$1.apply(ILoop.scala:189)
        at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:221)
        at scala.tools.nsc.interpreter.ILoop.mumly(ILoop.scala:186)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1(SparkILoop.scala:199)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:267)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:247)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.withSuppressedSettings$1(SparkILoop.scala:235)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.startup$1(SparkILoop.scala:247)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:282)
        at org.apache.spark.repl.SparkILoop.runClosure(SparkILoop.scala:159)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:182)
        at org.apache.spark.repl.Main$.doMain(Main.scala:78)
        at org.apache.spark.repl.Main$.main(Main.scala:58)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:851)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:926)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:935)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
        at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:173)
        at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:390)
        at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:614)
        at org.apache.hadoop.ipc.Client$Connection.access$2300(Client.java:410)
        at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:799)
        at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:795)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:795)
        ... 95 more
19/08/31 11:05:43 DEBUG ipc.Client: IPC Client (483582792) connection to cm-r01nn02.mws.mds.xyz/192.168.0.133:8032 from root: closed
19/08/31 11:05:43 INFO retry.RetryInvocationHandler: java.io.IOException: Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host is: "cm-r01en01.mws.mds.xyz/192.168.0.140"; destination host is: "cm-r01nn02.mws.mds.xyz":8032; , while invoking ApplicationClientProtocolPBClientImpl.getClusterMetrics over null after 7 failover attempts. Trying to failover after sleeping for 33704ms.
19/08/31 11:05:44 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:45 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:46 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:47 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:48 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:49 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:50 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:51 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:52 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:53 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:54 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:55 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:56 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:57 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:58 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:05:59 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:06:00 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:06:01 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:06:02 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:06:03 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:06:04 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:06:05 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:06:06 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:06:07 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:06:08 INFO storage.DiskBlockManager: Shutdown hook called
19/08/31 11:06:08 INFO util.ShutdownHookManager: Shutdown hook called
19/08/31 11:06:08 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-9a9338cb-f16b-48e0-b0cd-7ddfcc148a13/repl-52ba4c53-3478-4ead-93e7-d20ecbd2e866
19/08/31 11:06:08 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-9a9338cb-f16b-48e0-b0cd-7ddfcc148a13/userFiles-5f218430-30bb-4a7e-87df-7ee235183578
19/08/31 11:06:08 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-9a9338cb-f16b-48e0-b0cd-7ddfcc148a13
19/08/31 11:06:08 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-79f3eecb-69d4-4b21-85dc-6746fc33f65c
19/08/31 11:06:08 DEBUG ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@19656e21
19/08/31 11:06:08 DEBUG util.ShutdownHookManager: Completed shutdown in 0.062 seconds; Timeouts: 0
19/08/31 11:06:08 DEBUG util.ShutdownHookManager: ShutdownHookManger completed shutdown.
[root@cm-r01en01 process]# dig -x 192.168.0.140

; <<>> DiG 9.9.4-RedHat-9.9.4-73.el7_6 <<>> -x 192.168.0.140
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 39821
;; flags: qr aa rd ra; QUERY: 1, ANSWER: 1, AUTHORITY: 2, ADDITIONAL: 3

;; OPT PSEUDOSECTION:
; EDNS: version: 0, flags:; udp: 4096
;; QUESTION SECTION:
;140.0.168.192.in-addr.arpa.    IN      PTR

;; ANSWER SECTION:
140.0.168.192.in-addr.arpa. 1200 IN     PTR     cm-r01en01.mws.mds.xyz.

;; AUTHORITY SECTION:
0.168.192.in-addr.arpa. 86400   IN      NS      idmipa03.mws.mds.xyz.
0.168.192.in-addr.arpa. 86400   IN      NS      idmipa04.mws.mds.xyz.

;; ADDITIONAL SECTION:
idmipa03.mws.mds.xyz.   1200    IN      A       192.168.0.154
idmipa04.mws.mds.xyz.   1200    IN      A       192.168.0.155

;; Query time: 1 msec
;; SERVER: 192.168.0.154#53(192.168.0.154)
;; WHEN: Sat Aug 31 11:06:18 EDT 2019
;; MSG SIZE  rcvd: 169

[root@cm-r01en01 process]# dig -x 192.168.0.133

; <<>> DiG 9.9.4-RedHat-9.9.4-73.el7_6 <<>> -x 192.168.0.133
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 11817
;; flags: qr aa rd ra; QUERY: 1, ANSWER: 1, AUTHORITY: 2, ADDITIONAL: 3

;; OPT PSEUDOSECTION:
; EDNS: version: 0, flags:; udp: 4096
;; QUESTION SECTION:
;133.0.168.192.in-addr.arpa.    IN      PTR

;; ANSWER SECTION:
133.0.168.192.in-addr.arpa. 1200 IN     PTR     cm-r01nn02.mws.mds.xyz.

;; AUTHORITY SECTION:
0.168.192.in-addr.arpa. 86400   IN      NS      idmipa04.mws.mds.xyz.
0.168.192.in-addr.arpa. 86400   IN      NS      idmipa03.mws.mds.xyz.

;; ADDITIONAL SECTION:
idmipa03.mws.mds.xyz.   1200    IN      A       192.168.0.154
idmipa04.mws.mds.xyz.   1200    IN      A       192.168.0.155

;; Query time: 1 msec
;; SERVER: 192.168.0.154#53(192.168.0.154)
;; WHEN: Sat Aug 31 11:26:10 EDT 2019
;; MSG SIZE  rcvd: 169

[root@cm-r01en01 process]#

I tried the same as a non-privileged AD / FreeIPA user, but with the same results:

19/08/31 11:33:07 DEBUG ipc.Client: The ping interval is 60000 ms.
19/08/31 11:33:07 DEBUG ipc.Client: Connecting to cm-r01nn02.mws.mds.xyz/192.168.0.133:8032
19/08/31 11:33:07 DEBUG security.UserGroupInformation: PrivilegedAction as:tom@mds.xyz (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:795)
19/08/31 11:33:07 DEBUG security.SaslRpcClient: Sending sasl message state: NEGOTIATE

19/08/31 11:33:07 DEBUG security.SaslRpcClient: Get token info proto:interface org.apache.hadoop.yarn.api.ApplicationClientProtocolPB info:org.apache.hadoop.yarn.security.client.ClientRMSecurityInfo$2@6d4df1d2
19/08/31 11:33:07 DEBUG client.RMDelegationTokenSelector: Looking for a token with service 192.168.0.133:8032
19/08/31 11:33:07 DEBUG security.SaslRpcClient: tokens aren't supported for this protocol or user doesn't have one
19/08/31 11:33:07 DEBUG security.SaslRpcClient: client isn't using kerberos
19/08/31 11:33:07 DEBUG security.UserGroupInformation: PrivilegedActionException as:tom@mds.xyz (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/31 11:33:07 DEBUG security.UserGroupInformation: PrivilegedAction as:tom@mds.xyz (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:719)
19/08/31 11:33:07 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/31 11:33:07 DEBUG security.UserGroupInformation: PrivilegedActionException as:tom@mds.xyz (auth:SIMPLE) cause:java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/31 11:33:07 DEBUG ipc.Client: closing ipc connection to cm-r01nn02.mws.mds.xyz/192.168.0.133:8032: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
        at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:756)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:719)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:812)
        at org.apache.hadoop.ipc.Client$Connection.access$3600(Client.java:410)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1560)
        at org.apache.hadoop.ipc.Client.call(Client.java:1391)
        at org.apache.hadoop.ipc.Client.call(Client.java:1355)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
        at com.sun.proxy.$Proxy16.getClusterMetrics(Unknown Source)
        at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getClusterMetrics(ApplicationClientProtocolPBClientImpl.java:251)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
        at com.sun.proxy.$Proxy17.getClusterMetrics(Unknown Source)
        at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getYarnClusterMetrics(YarnClientImpl.java:604)
        at org.apache.spark.deploy.yarn.Client$$anonfun$submitApplication$1.apply(Client.scala:169)
        at org.apache.spark.deploy.yarn.Client$$anonfun$submitApplication$1.apply(Client.scala:169)
        at org.apache.spark.internal.Logging$class.logInfo(Logging.scala:57)
        at org.apache.spark.deploy.yarn.Client.logInfo(Client.scala:62)
        at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:168)
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:60)
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:186)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:511)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2549)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:944)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:935)
        at org.apache.spark.repl.Main$.createSparkSession(Main.scala:106)
        at $line3.$read$$iw$$iw.<init>(<console>:15)
        at $line3.$read$$iw.<init>(<console>:43)
        at $line3.$read.<init>(<console>:45)
        at $line3.$read$.<init>(<console>:49)
        at $line3.$read$.<clinit>(<console>)
        at $line3.$eval$.$print$lzycompute(<console>:7)
        at $line3.$eval$.$print(<console>:6)
        at $line3.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:793)
        at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1054)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:645)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:644)
        at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
        at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:644)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:576)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:572)
        at scala.tools.nsc.interpreter.IMain$$anonfun$quietRun$1.apply(IMain.scala:231)
        at scala.tools.nsc.interpreter.IMain$$anonfun$quietRun$1.apply(IMain.scala:231)
        at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:221)
        at scala.tools.nsc.interpreter.IMain.quietRun(IMain.scala:231)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:109)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:109)
        at scala.collection.immutable.List.foreach(List.scala:392)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:109)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:109)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:109)
        at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:91)
        at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:108)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply$mcV$sp(SparkILoop.scala:211)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply(SparkILoop.scala:199)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply(SparkILoop.scala:199)
        at scala.tools.nsc.interpreter.ILoop$$anonfun$mumly$1.apply(ILoop.scala:189)
        at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:221)
        at scala.tools.nsc.interpreter.ILoop.mumly(ILoop.scala:186)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1(SparkILoop.scala:199)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:267)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:247)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.withSuppressedSettings$1(SparkILoop.scala:235)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.startup$1(SparkILoop.scala:247)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:282)
        at org.apache.spark.repl.SparkILoop.runClosure(SparkILoop.scala:159)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:182)
        at org.apache.spark.repl.Main$.doMain(Main.scala:78)
        at org.apache.spark.repl.Main$.main(Main.scala:58)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:851)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:926)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:935)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
        at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:173)
        at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:390)
        at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:614)
        at org.apache.hadoop.ipc.Client$Connection.access$2300(Client.java:410)
        at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:799)
        at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:795)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:795)
        ... 95 more
19/08/31 11:33:07 DEBUG ipc.Client: IPC Client (1263257405) connection to cm-r01nn02.mws.mds.xyz/192.168.0.133:8032 from tom@mds.xyz: closed
19/08/31 11:33:07 INFO retry.RetryInvocationHandler: java.io.IOException: Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host is: "cm-r01en01.mws.mds.xyz/192.168.0.140"; destination host is: "cm-r01nn02.mws.mds.xyz":8032; , while invoking ApplicationClientProtocolPBClientImpl.getClusterMetrics over null after 1 failover attempts. Trying to failover after sleeping for 17516ms.
19/08/31 11:33:07 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:33:08 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:33:09 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:33:10 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:33:11 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:33:12 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:33:13 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:33:14 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:33:15 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:33:16 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:33:17 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:33:18 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:33:19 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:33:20 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:33:21 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:33:22 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:33:23 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:33:24 INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all
19/08/31 11:33:25 DEBUG ipc.Client: The ping interval is 60000 ms.
19/08/31 11:33:25 DEBUG ipc.Client: Connecting to cm-r01nn02.mws.mds.xyz/192.168.0.133:8032
19/08/31 11:33:25 DEBUG security.UserGroupInformation: PrivilegedAction as:tom@mds.xyz (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:795)
19/08/31 11:33:25 DEBUG security.SaslRpcClient: Sending sasl message state: NEGOTIATE

Has anyone seen the same, and could you suggest what to do to move forward with this?


A few points:
1) Reverse and forward lookups work fine from the OS side.
2) Kerberos credentials generate without issue.
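
For point 2, I mean the usual kinit/klist check, roughly this (example session; tom@MDS.XYZ is the principal used later in this thread):

kinit tom@MDS.XYZ   # obtain a TGT for the user principal
klist               # confirm the ticket cache holds a valid TGT before launching spark-shell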

Cheers,
TK

15 REPLIES

Super Guru
The folder should be /user/tom, without the REALM part, and likewise the ownership should be just tom:tom, without the REALM.
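
For example, a typical sequence to set that up (a sketch; run as the HDFS superuser, and adjust the user/group to your cluster):

sudo -u hdfs hdfs dfs -mkdir -p /user/tom        # home directory under the short name
sudo -u hdfs hdfs dfs -chown tom:tom /user/tom   # ownership without the REALM suffix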

Explorer

Odd then.  This format worked:

drwxr-xr-x   - tom@MDS.XYZ tom@MDS.XYZ          0 2019-09-03 21:54 /user/tom@MDS.XYZ

Perhaps I should be looking at something in the settings?

Explorer

I've adjusted the auth_to_local rules as follows:

RULE:[2:$1@$0](HTTP@\QMWS.MDS.XYZ\E$)s/@\QMWS.MDS.XYZ\E$//
RULE:[1:$1@$0](.*@\QMWS.MDS.XYZ\E$)s/@\QMWS.MDS.XYZ\E$///L
RULE:[2:$1@$0](.*@\QMWS.MDS.XYZ\E$)s/@\QMWS.MDS.XYZ\E$///L
RULE:[2:$1@$0](HTTP@\Qmws.mds.xyz\E$)s/@\Qmws.mds.xyz\E$//
RULE:[1:$1@$0](.*@\Qmws.mds.xyz\E$)s/@\Qmws.mds.xyz\E$///L
RULE:[2:$1@$0](.*@\Qmws.mds.xyz\E$)s/@\Qmws.mds.xyz\E$///L
RULE:[2:$1@$0](HTTP@\QMDS.XYZ\E$)s/@\QMDS.XYZ\E$//
RULE:[1:$1@$0](.*@\QMDS.XYZ\E$)s/@\QMDS.XYZ\E$///L
RULE:[2:$1@$0](.*@\QMDS.XYZ\E$)s/@\QMDS.XYZ\E$///L
RULE:[2:$1@$0](HTTP@\Qmds.xyz\E$)s/@\Qmds.xyz\E$//
RULE:[1:$1@$0](.*@\Qmds.xyz\E$)s/@\Qmds.xyz\E$///L
RULE:[2:$1@$0](.*@\Qmds.xyz\E$)s/@\Qmds.xyz\E$///L
DEFAULT
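
(If anyone wants to verify similar rules: the resulting mapping can be tested without restarting anything by invoking Hadoop's KerberosName class directly, e.g.

hadoop org.apache.hadoop.security.HadoopKerberosName tom@MDS.XYZ

which prints the short name the current auth_to_local rules produce for that principal.)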

And now, when I create the folder in this manner, spark-shell starts and uses the folder below:

drwxr-xr-x - tom tom 0 2019-09-04 00:45 /user/tom

Since I have multiple users from multiple domains, collisions can occur if the same user exists in two different domains. So I would be curious whether you know how to create the folders in this manner:

/user/<DOMAIN>/<USER>

to avoid potential conflicts between identically named users in different domains.
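
The closest I've come up with myself is a domain-prefixed flat name rather than a true subdirectory, since as far as I can tell the default hadoop mapping mechanism rejects short names containing '/' or '@' as non-simple. Something along these lines (untested sketch, one rule per realm):

RULE:[1:mds.xyz_$1@$0](.*@\QMDS.XYZ\E$)s/@\QMDS.XYZ\E$//

which would map tom@MDS.XYZ to mds.xyz_tom, and hence to /user/mds.xyz_tom.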

Cheers,
TK

Explorer

Moving back to the original question:

"INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all"

My configuration is as follows:

[root@cm-r01en01 conf]# cat log4j.properties
log4j.rootLogger=${root.logger}
root.logger=INFO,console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{2}: %m%n
shell.log.level=INFO
log4j.logger.org.spark-project.jetty=WARN
log4j.logger.org.spark-project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
log4j.logger.org.apache.parquet=ERROR
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
log4j.logger.org.apache.spark.repl.Main=${shell.log.level}
log4j.logger.org.apache.spark.api.python.PythonGatewayServer=${shell.log.level}
[root@cm-r01en01 conf]#
[root@cm-r01en01 conf]#
[root@cm-r01en01 conf]# cat spark-defaults.conf
spark.authenticate=false
spark.driver.log.dfsDir=/user/spark/driverLogs
spark.driver.log.persistToDfs.enabled=true
spark.dynamicAllocation.enabled=true
spark.dynamicAllocation.executorIdleTimeout=60
spark.dynamicAllocation.minExecutors=0
spark.dynamicAllocation.schedulerBacklogTimeout=1
spark.eventLog.enabled=true
spark.io.encryption.enabled=false
spark.network.crypto.enabled=false
spark.serializer=org.apache.spark.serializer.KryoSerializer
spark.shuffle.service.enabled=true
spark.shuffle.service.port=7337
spark.ui.enabled=true
spark.ui.killEnabled=true
spark.lineage.log.dir=/var/log/spark/lineage
spark.lineage.enabled=true
spark.master=yarn
spark.submit.deployMode=client
spark.eventLog.dir=hdfs://cm-r01nn02.mws.mds.xyz:8020/user/spark/applicationHistory
spark.yarn.historyServer.address=http://cm-r01en01.mws.mds.xyz:18088
spark.yarn.jars=local:/opt/cloudera/parcels/CDH-6.3.0-1.cdh6.3.0.p0.1279813/lib/spark/jars/*,local:/opt/cloudera/parcels/CDH-6.3.0-1.cdh6.3.0.p0.1279813/lib/spark/hive/*
spark.driver.extraLibraryPath=/opt/cloudera/parcels/CDH-6.3.0-1.cdh6.3.0.p0.1279813/lib/hadoop/lib/native
spark.executor.extraLibraryPath=/opt/cloudera/parcels/CDH-6.3.0-1.cdh6.3.0.p0.1279813/lib/hadoop/lib/native
spark.yarn.am.extraLibraryPath=/opt/cloudera/parcels/CDH-6.3.0-1.cdh6.3.0.p0.1279813/lib/hadoop/lib/native
spark.yarn.config.gatewayPath=/opt/cloudera/parcels
spark.yarn.config.replacementPath={{HADOOP_COMMON_HOME}}/../../..
spark.yarn.historyServer.allowTracking=true
spark.yarn.appMasterEnv.MKL_NUM_THREADS=1
spark.executorEnv.MKL_NUM_THREADS=1
spark.yarn.appMasterEnv.OPENBLAS_NUM_THREADS=1
spark.executorEnv.OPENBLAS_NUM_THREADS=1
spark.extraListeners=com.cloudera.spark.lineage.NavigatorAppListener
spark.sql.queryExecutionListeners=com.cloudera.spark.lineage.NavigatorQueryListener
[root@cm-r01en01 conf]#

I can get into the shell; however, the console is overwhelmed with the above error messages, preventing me from doing anything useful with it.
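
One thing I could try is pointing a single spark-shell session at a quieter log4j file instead of editing the global one (sketch; /tmp/log4j-quiet.properties is a file I'd create for this):

spark-shell --driver-java-options "-Dlog4j.configuration=file:/tmp/log4j-quiet.properties"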

I'm aiming to run a few Spark commands to get started learning it.

Cheers,
TK

Explorer

Including the log file of the shell session as a link:

https://tinyurl.com/y2kfmke8

Cheers,
TK

Explorer

Not the best approach to getting rid of these messages, but it gave me what I wanted: I set the highest logging level to ERROR instead, so everything else is not printed:

tom@mds.xyz@cm-r01en01:~] 🙂 $ cat /etc/spark/conf/log4j.properties
log4j.rootLogger=${root.logger}
root.logger=ERROR,console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{2}: %m%n
shell.log.level=ERROR
log4j.logger.org.spark-project.jetty=WARN
log4j.logger.org.spark-project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=ERROR
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=ERROR
log4j.logger.org.apache.parquet=ERROR
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
log4j.logger.org.apache.spark.repl.Main=${shell.log.level}
log4j.logger.org.apache.spark.api.python.PythonGatewayServer=${shell.log.level}
tom@mds.xyz@cm-r01en01:~] 🙂 $
tom@mds.xyz@cm-r01en01:~] 🙂 $
tom@mds.xyz@cm-r01en01:~] 🙂 $ digg /etc/spark/conf/log4j.properties /etc/spark/conf/log4j.properties-original
-sh: digg: command not found
tom@mds.xyz@cm-r01en01:~] 😞 $ diff /etc/spark/conf/log4j.properties /etc/spark/conf/log4j.properties-original
2c2
< root.logger=ERROR,console
---
> root.logger=DEBUG,console
10,11c10,11
< log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=ERROR
< log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=ERROR
---
> log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
> log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
tom@mds.xyz@cm-r01en01:~] 😞 $

Now I get my spark-shell without the INFO, DEBUG, or WARN messages all over it. Still interested in a proper solution if possible; I only see it fixed in Spark 3.0.
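
A more surgical variant might be to keep the root logger at INFO and silence only the offending class. Judging by the %c{2} pattern, the "yarn.SparkRackResolver" prefix should correspond to org.apache.spark.deploy.yarn.SparkRackResolver (an assumption from the log output), so a single extra line in log4j.properties ought to do it:

log4j.logger.org.apache.spark.deploy.yarn.SparkRackResolver=ERROR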

Cheers,
TK