
DEBUG security.UserGroupInformation: PrivilegedActionException as:root (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]


Re: DEBUG security.UserGroupInformation: PrivilegedActionException as:root (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]

Guru
The folder should be /user/tom, without the REALM part, and likewise the ownership should be just tom:tom, without the REALM.
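
For example, the home directory would be pre-created along these lines (a minimal sketch; it assumes "hdfs" is your HDFS superuser account, adjust to your setup):

# Create the short-name home directory and hand it over to the user
sudo -u hdfs hdfs dfs -mkdir -p /user/tom
sudo -u hdfs hdfs dfs -chown tom:tom /user/tom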

Re: DEBUG security.UserGroupInformation: PrivilegedActionException as:root (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]

Explorer

Odd then.  This format worked:

 

drwxr-xr-x   - tom@MDS.XYZ tom@MDS.XYZ          0 2019-09-03 21:54 /user/tom@MDS.XYZ

 

Perhaps there is a setting I should be looking at?

Re: DEBUG security.UserGroupInformation: PrivilegedActionException as:root (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]

Explorer

I've adjusted the auth_to_local rules as follows:

 

RULE:[2:$1@$0](HTTP@\QMWS.MDS.XYZ\E$)s/@\QMWS.MDS.XYZ\E$//
RULE:[1:$1@$0](.*@\QMWS.MDS.XYZ\E$)s/@\QMWS.MDS.XYZ\E$///L
RULE:[2:$1@$0](.*@\QMWS.MDS.XYZ\E$)s/@\QMWS.MDS.XYZ\E$///L
RULE:[2:$1@$0](HTTP@\Qmws.mds.xyz\E$)s/@\Qmws.mds.xyz\E$//
RULE:[1:$1@$0](.*@\Qmws.mds.xyz\E$)s/@\Qmws.mds.xyz\E$///L
RULE:[2:$1@$0](.*@\Qmws.mds.xyz\E$)s/@\Qmws.mds.xyz\E$///L
RULE:[2:$1@$0](HTTP@\QMDS.XYZ\E$)s/@\QMDS.XYZ\E$//
RULE:[1:$1@$0](.*@\QMDS.XYZ\E$)s/@\QMDS.XYZ\E$///L
RULE:[2:$1@$0](.*@\QMDS.XYZ\E$)s/@\QMDS.XYZ\E$///L
RULE:[2:$1@$0](HTTP@\Qmds.xyz\E$)s/@\Qmds.xyz\E$//
RULE:[1:$1@$0](.*@\Qmds.xyz\E$)s/@\Qmds.xyz\E$///L
RULE:[2:$1@$0](.*@\Qmds.xyz\E$)s/@\Qmds.xyz\E$///L
DEFAULT
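
To sanity-check how a given principal maps under these rules, I believe the HadoopKerberosName utility can be used (the sample principal below is just an example):

hadoop org.apache.hadoop.security.HadoopKerberosName tom@MDS.XYZ
# Expected output along the lines of: Name: tom@MDS.XYZ to tom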

 

And now, when I create the folder this way, spark-shell starts using the folder below:

 

drwxr-xr-x - tom tom 0 2019-09-04 00:45 /user/tom

Since I have multiple users from multiple domains, collisions can occur if the same user exists in two different domains.  So I would be curious whether you know how to create the folders in this manner:

/user/<DOMAIN>/<USER>

 

to avoid potential conflicts between same-named users in different domains, as illustrated below.
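
To illustrate the layout I have in mind, the per-domain home directories would look something like this (hypothetical realm and user names, created as the HDFS superuser):

sudo -u hdfs hdfs dfs -mkdir -p /user/MDS.XYZ/tom /user/MWS.MDS.XYZ/tom
sudo -u hdfs hdfs dfs -chown tom:tom /user/MDS.XYZ/tom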

Cheers,
TK

 

Re: DEBUG security.UserGroupInformation: PrivilegedActionException as:root (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]

Explorer

Moving back to the original question:

 

"INFO yarn.SparkRackResolver: Got an error when resolving hostNames. Falling back to /default-rack for all"

 

My configuration is as follows:

 

[root@cm-r01en01 conf]# cat log4j.properties
log4j.rootLogger=${root.logger}
root.logger=INFO,console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{2}: %m%n
shell.log.level=INFO
log4j.logger.org.spark-project.jetty=WARN
log4j.logger.org.spark-project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
log4j.logger.org.apache.parquet=ERROR
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
log4j.logger.org.apache.spark.repl.Main=${shell.log.level}
log4j.logger.org.apache.spark.api.python.PythonGatewayServer=${shell.log.level}
[root@cm-r01en01 conf]#
[root@cm-r01en01 conf]#
[root@cm-r01en01 conf]# cat spark-defaults.conf
spark.authenticate=false
spark.driver.log.dfsDir=/user/spark/driverLogs
spark.driver.log.persistToDfs.enabled=true
spark.dynamicAllocation.enabled=true
spark.dynamicAllocation.executorIdleTimeout=60
spark.dynamicAllocation.minExecutors=0
spark.dynamicAllocation.schedulerBacklogTimeout=1
spark.eventLog.enabled=true
spark.io.encryption.enabled=false
spark.network.crypto.enabled=false
spark.serializer=org.apache.spark.serializer.KryoSerializer
spark.shuffle.service.enabled=true
spark.shuffle.service.port=7337
spark.ui.enabled=true
spark.ui.killEnabled=true
spark.lineage.log.dir=/var/log/spark/lineage
spark.lineage.enabled=true
spark.master=yarn
spark.submit.deployMode=client
spark.eventLog.dir=hdfs://cm-r01nn02.mws.mds.xyz:8020/user/spark/applicationHistory
spark.yarn.historyServer.address=http://cm-r01en01.mws.mds.xyz:18088
spark.yarn.jars=local:/opt/cloudera/parcels/CDH-6.3.0-1.cdh6.3.0.p0.1279813/lib/spark/jars/*,local:/opt/cloudera/parcels/CDH-6.3.0-1.cdh6.3.0.p0.1279813/lib/spark/hive/*
spark.driver.extraLibraryPath=/opt/cloudera/parcels/CDH-6.3.0-1.cdh6.3.0.p0.1279813/lib/hadoop/lib/native
spark.executor.extraLibraryPath=/opt/cloudera/parcels/CDH-6.3.0-1.cdh6.3.0.p0.1279813/lib/hadoop/lib/native
spark.yarn.am.extraLibraryPath=/opt/cloudera/parcels/CDH-6.3.0-1.cdh6.3.0.p0.1279813/lib/hadoop/lib/native
spark.yarn.config.gatewayPath=/opt/cloudera/parcels
spark.yarn.config.replacementPath={{HADOOP_COMMON_HOME}}/../../..
spark.yarn.historyServer.allowTracking=true
spark.yarn.appMasterEnv.MKL_NUM_THREADS=1
spark.executorEnv.MKL_NUM_THREADS=1
spark.yarn.appMasterEnv.OPENBLAS_NUM_THREADS=1
spark.executorEnv.OPENBLAS_NUM_THREADS=1
spark.extraListeners=com.cloudera.spark.lineage.NavigatorAppListener
spark.sql.queryExecutionListeners=com.cloudera.spark.lineage.NavigatorQueryListener
[root@cm-r01en01 conf]#

 

 

I can get into the shell; however, the console is so overwhelmed with the above messages that I can't do anything useful with it.

 

I'm aiming to run a few Spark commands to get started learning it.
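
In the meantime, a quick per-session workaround is to drop the log level from inside the shell once it is up; sc.setLogLevel is a standard SparkContext method, though whether it catches these particular messages here is an assumption on my part:

scala> sc.setLogLevel("ERROR")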

Cheers,
TK

Re: DEBUG security.UserGroupInformation: PrivilegedActionException as:root (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]

Explorer

Including the log file of the shell session as a link:

https://tinyurl.com/y2kfmke8

 

 

Cheers,
TK

Re: DEBUG security.UserGroupInformation: PrivilegedActionException as:root (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]

Explorer

Not the best approach to getting rid of these messages, but it gave me what I wanted.  I set the root logging level to ERROR instead, so everything below that level is not printed:

tom@mds.xyz@cm-r01en01:~]  $ cat /etc/spark/conf/log4j.properties
log4j.rootLogger=${root.logger}
root.logger=ERROR,console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{2}: %m%n
shell.log.level=ERROR
log4j.logger.org.spark-project.jetty=WARN
log4j.logger.org.spark-project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=ERROR
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=ERROR
log4j.logger.org.apache.parquet=ERROR
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
log4j.logger.org.apache.spark.repl.Main=${shell.log.level}
log4j.logger.org.apache.spark.api.python.PythonGatewayServer=${shell.log.level}
tom@mds.xyz@cm-r01en01:~]  $
tom@mds.xyz@cm-r01en01:~]  $
tom@mds.xyz@cm-r01en01:~]  $ diff /etc/spark/conf/log4j.properties /etc/spark/conf/log4j.properties-original
2c2
< root.logger=ERROR,console
---
> root.logger=DEBUG,console
10,11c10,11
< log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=ERROR
< log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=ERROR
---
> log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
> log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
tom@mds.xyz@cm-r01en01:~]  $

 

Now I get my spark-shell without the INFO, DEBUG, or WARN messages all over it.  Still interested in a proper fix if possible; I only see it fixed in Spark 3.0.
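
A less blunt alternative might be to leave root.logger at INFO and silence only the offending class. Assuming its full name is org.apache.spark.deploy.yarn.SparkRackResolver (the %c{2} pattern in the layout only shows "yarn.SparkRackResolver", so the package is my guess), the line would be:

# Untested here: suppress only the rack-resolver spam instead of everything
log4j.logger.org.apache.spark.deploy.yarn.SparkRackResolver=ERROR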

 

Cheers,
TK