
kerberos: Authentication failed, status: 404, message: Not Found

Expert Contributor

I'm running a kerberized cluster.

When I try to run any Spark job, I get the following:

[spark_remote@ip-172-31-10-196 ~]$ spark-submit --class org.apache.spark.examples.SparkPi --master yarn-cluster /usr/lib/spark/examples/jars/spark-examples.jar
Warning: Master yarn-cluster is deprecated since 2.0. Please use master "yarn" with specified deploy mode instead.
18/01/31 19:42:18 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/01/31 19:42:20 INFO RMProxy: Connecting to ResourceManager at ip-172-31-10-196.us-west-2.compute.internal/172.31.10.196:8032
18/01/31 19:42:20 INFO Client: Requesting a new application from cluster with 0 NodeManagers
18/01/31 19:42:20 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (11520 MB per container)
18/01/31 19:42:20 INFO Client: Will allocate AM container, with 1408 MB memory including 384 MB overhead
18/01/31 19:42:20 INFO Client: Setting up container launch context for our AM
18/01/31 19:42:20 INFO Client: Setting up the launch environment for our AM container
18/01/31 19:42:20 INFO Client: Preparing resources for our AM container
18/01/31 19:42:20 INFO HadoopFSCredentialProvider: getting token for: hdfs://ip-172-31-10-196.us-west-2.compute.internal:8020/user/spark_remote
18/01/31 19:42:20 INFO DFSClient: Created HDFS_DELEGATION_TOKEN token 20 for spark_remote on 172.31.10.196:8020
Exception in thread "main" java.io.IOException: java.lang.reflect.UndeclaredThrowableException
	at org.apache.hadoop.crypto.key.kms.KMSClientProvider.addDelegationTokens(KMSClientProvider.java:888)
	at org.apache.hadoop.crypto.key.KeyProviderDelegationTokenExtension.addDelegationTokens(KeyProviderDelegationTokenExtension.java:86)
	at org.apache.hadoop.hdfs.DistributedFileSystem.addDelegationTokens(DistributedFileSystem.java:2234)
	at org.apache.spark.deploy.yarn.security.HadoopFSCredentialProvider$$anonfun$obtainCredentials$1.apply(HadoopFSCredentialProvider.scala:52)
	at org.apache.spark.deploy.yarn.security.HadoopFSCredentialProvider$$anonfun$obtainCredentials$1.apply(HadoopFSCredentialProvider.scala:49)
	at scala.collection.immutable.Set$Set1.foreach(Set.scala:94)
	at org.apache.spark.deploy.yarn.security.HadoopFSCredentialProvider.obtainCredentials(HadoopFSCredentialProvider.scala:49)
	at org.apache.spark.deploy.yarn.security.ConfigurableCredentialManager$$anonfun$obtainCredentials$2.apply(ConfigurableCredentialManager.scala:82)
	at org.apache.spark.deploy.yarn.security.ConfigurableCredentialManager$$anonfun$obtainCredentials$2.apply(ConfigurableCredentialManager.scala:80)
	at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
	at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
	at scala.collection.MapLike$DefaultValuesIterable.foreach(MapLike.scala:206)
	at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
	at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
	at org.apache.spark.deploy.yarn.security.ConfigurableCredentialManager.obtainCredentials(ConfigurableCredentialManager.scala:80)
	at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:389)
	at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:832)
	at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:170)
	at org.apache.spark.deploy.yarn.Client.run(Client.scala:1109)
	at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1168)
	at org.apache.spark.deploy.yarn.Client.main(Client.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:775)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.UndeclaredThrowableException
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1713)
	at org.apache.hadoop.crypto.key.kms.KMSClientProvider.addDelegationTokens(KMSClientProvider.java:870)
	... 31 more
Caused by: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 404, message: Not Found
	at org.apache.hadoop.security.authentication.client.AuthenticatedURL.extractToken(AuthenticatedURL.java:275)
	at org.apache.hadoop.security.authentication.client.PseudoAuthenticator.authenticate(PseudoAuthenticator.java:77)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.authenticate(DelegationTokenAuthenticator.java:131)
	at org.apache.hadoop.security.authentication.client.KerberosAuthenticator.authenticate(KerberosAuthenticator.java:214)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.authenticate(DelegationTokenAuthenticator.java:131)
	at org.apache.hadoop.security.authentication.client.AuthenticatedURL.openConnection(AuthenticatedURL.java:215)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.doDelegationTokenOperation(DelegationTokenAuthenticator.java:288)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.getDelegationToken(DelegationTokenAuthenticator.java:169)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticatedURL.getDelegationToken(DelegationTokenAuthenticatedURL.java:373)
	at org.apache.hadoop.crypto.key.kms.KMSClientProvider$2.run(KMSClientProvider.java:875)
	at org.apache.hadoop.crypto.key.kms.KMSClientProvider$2.run(KMSClientProvider.java:870)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
	... 32 more

The message is odd; it appeared after changing the KMS principal:

Authentication failed, status: 404, message: Not Found

Any hints on where to look would be appreciated... there isn't anything unusual in the KDC log:

Jan 31 14:49:54 ip-172-31-11-134.us-west-2.compute.internal krb5kdc[9279](info): TGS_REQ (2 etypes {18 17}) 172.31.10.196: ISSUE: authtime 1517428183, etypes {rep=18 tkt=18 ses=18}, spark_remote/ip-172-31-10-196.us-west-2.compute.internal@DATAPASSPORT.INTERNAL for yarn/ip-172-31-10-196.us-west-2.compute.internal@MYREALM.INTERNAL
Jan 31 14:49:54 ip-172-31-11-134.us-west-2.compute.internal krb5kdc[9279](info): closing down fd 11
Jan 31 14:49:54 ip-172-31-11-134.us-west-2.compute.internal krb5kdc[9279](info): TGS_REQ (2 etypes {18 17}) 172.31.10.196: ISSUE: authtime 1517428183, etypes {rep=18 tkt=18 ses=18}, spark_remote/ip-172-31-10-196.us-west-2.compute.internal@DATAPASSPORT.INTERNAL for hdfs/ip-172-31-10-196.us-west-2.compute.internal@MYREALM.INTERNAL
Jan 31 14:49:54 ip-172-31-11-134.us-west-2.compute.internal krb5kdc[9279](info): closing down fd 11


1 ACCEPTED SOLUTION


@Matt Andruff

You should check the URL in hadoop.security.key.provider.path (in core-site.xml) to see whether it is valid. It is apparently not pointing to the correct location.
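A minimal sketch of how to check it (the config path is an assumption; adjust for your distribution's layout):

```shell
# Pull the configured KMS provider URL out of core-site.xml.
# /etc/hadoop/conf is an assumption; override CONF if yours differs.
CONF=${CONF:-/etc/hadoop/conf/core-site.xml}
grep -A1 'hadoop.security.key.provider.path' "$CONF"

# Then probe the endpoint it names (requires a valid Kerberos ticket;
# /kms/v1/keys/names is part of the Hadoop KMS REST API):
# curl --negotiate -u : "http://<kms-host>:9700/kms/v1/keys/names"
```

If the curl probe returns a 404 rather than a key list, the URL in the config and the address the KMS is actually listening on don't match.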


10 REPLIES


Expert Contributor

@Robert Levas

<property>
    <name>hadoop.security.key.provider.path</name>
    <value>kms://http@ip-172-31-10-196.us-west-2.compute.internal:9700/kms</value>
  </property>

Looks valid... what logs can I check?


@vperiasamy would you be able to help out on this KMS issue?

Expert Contributor
@Robert Levas

I'm going to accept your answer: I found this article you wrote about rule syntax, and that's clearly my issue.


Thanks. Sorry, I didn't know which log to look in; Ranger and KMS are not my forte.

Thanks, @vperiasamy for contributing to the effort.

Expert Contributor

I followed this article. It tells you how to configure KMS, and it is what I followed immediately before getting the 404. Is it possible that by following that article I'm making KMS crash, and hence the 404? How would I look at the error log for KMS? It seems to be a web app, but I can't find a log for it.
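For anyone else hunting for the KMS log, a minimal search (assuming a package install that writes under /var/log; override the root if yours differs):

```shell
# Search for KMS log files under the usual log root.
# /var/log is an assumption; set LOGROOT for a non-standard install.
LOGROOT=${LOGROOT:-/var/log}
find "$LOGROOT" -name '*kms*' -type f 2>/dev/null
```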

Expert Contributor

@vperiasamy thanks for your response.

Proxy user is set to *

  <property>
    <name>hadoop.kms.proxyuser.hdfs.hosts</name>
    <value>*</value>
  </property>


  <property>
    <name>hadoop.kms.proxyuser.hdfs.groups</name>
    <value>*</value>
  </property>


  <property>
    <name>hadoop.kms.proxyuser.hdfs.users</name>
    <value>*</value>
  </property>


  <property>
    <name>hadoop.kms.proxyuser.hive.groups</name>
    <value>*</value>
  </property>


  <property>
    <name>hadoop.kms.proxyuser.HTTP.groups</name>
    <value>*</value>
  </property>


  <property>
    <name>hadoop.kms.proxyuser.HTTP.users</name>
    <value>*</value>
  </property>


  <property>
    <name>hadoop.kms.proxyuser.hive.users</name>
    <value>*</value>
  </property>


  <property>
    <name>hadoop.kms.proxyuser.hive.hosts</name>
    <value>*</value>
  </property>


  <property>
    <name>hadoop.kms.proxyuser.HTTP.hosts</name>
    <value>*</value>
  </property>



Looks like I followed an article that was wrong (@Sindhu).

Here's the log I found, calling out that the hadoop.kms.authentication.kerberos.name.rules are wrong:

/var/log/hadoop-kms/kms-localhost.2018-01-31.log

Caused by: java.lang.IllegalArgumentException: Invalid rule: hdfs/ip-172-31-10-196.us-west-2.compute.internal@MYREALM.INTERNAL,
        spark/ip-172-31-10-196.us-west-2.compute.internal@MYREALM.INTERNAL,
        yarn/ip-172-31-10-196.us-west-2.compute.internal@MYREALM.INTERNAL,
        HTTP/ip-172-31-10-196.us-west-2.compute.internal@MYREALM.INTERNAL
        at org.apache.hadoop.security.authentication.util.KerberosName.parseRules(KerberosName.java:331)
        at org.apache.hadoop.security.authentication.util.KerberosName.setRules(KerberosName.java:397)
        at org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler.init(KerberosAuthenticationHandler.java:210)
        ... 31 more



Expert Contributor

So, to answer part of my own question: I found the logs.

/var/log/hadoop-kms/kms-localhost.2018-01-31.log

Caused by: java.lang.IllegalArgumentException: Invalid rule: hdfs/ip-172-31-10-196.us-west-2.compute.internal@MYREALM.INTERNAL,
        spark/ip-172-31-10-196.us-west-2.compute.internal@MYREALM.INTERNAL,
        yarn/ip-172-31-10-196.us-west-2.compute.internal@MYREALM.INTERNAL,
        HTTP/ip-172-31-10-196.us-west-2.compute.internal@MYREALM.INTERNAL
        at org.apache.hadoop.security.authentication.util.KerberosName.parseRules(KerberosName.java:331)
        at org.apache.hadoop.security.authentication.util.KerberosName.setRules(KerberosName.java:397)
        at org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler.init(KerberosAuthenticationHandler.java:210)
        ... 31 more

Looks like my badly written rules caused the issue.
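For reference, the parser rejects the value above because hadoop.kms.authentication.kerberos.name.rules expects auth_to_local RULE/DEFAULT entries, not a comma-separated list of principals. A minimal sketch of valid syntax (the pattern here is illustrative only, not the exact mapping this cluster needs):

```xml
<property>
  <name>hadoop.kms.authentication.kerberos.name.rules</name>
  <!-- Illustrative: map any two-component principal in
       MYREALM.INTERNAL to its short name, then fall through
       to the default mapping -->
  <value>RULE:[2:$1@$0](.*@MYREALM\.INTERNAL)s/@.*//
DEFAULT</value>
</property>
```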