<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: After Kerberos is enabled, HDFS authentication still cannot use any commands  like  hdfs dfs  -ls  / in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/After-Kerberos-is-enabled-HDFS-authentication-still-cannot/m-p/349594#M235697</link>
    <description>&lt;P&gt;&lt;SPAN&gt;This reply might be late, but KCM- and keyring-based Kerberos credential caches are not supported by Hadoop.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;# klist&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Ticket cache: KCM:0:86966&lt;/SPAN&gt;&lt;/P&gt;</description>
    <pubDate>Fri, 05 Aug 2022 05:21:47 GMT</pubDate>
    <dc:creator>npdell</dc:creator>
    <dc:date>2022-08-05T05:21:47Z</dc:date>
    <item>
      <title>After Kerberos is enabled, HDFS authentication still cannot use any commands  like  hdfs dfs  -ls  /</title>
      <link>https://community.cloudera.com/t5/Support-Questions/After-Kerberos-is-enabled-HDFS-authentication-still-cannot/m-p/336045#M232115</link>
      <description>&lt;P&gt;[root@cdp1~]# kinit hdfs&lt;BR /&gt;Password for hdfs@HADOOP.COM:********&lt;BR /&gt;[root@cdp1~]# klist&lt;BR /&gt;Ticket cache: KCM:0:86966&lt;BR /&gt;Default principal: hdfs@HADOOP.COM&lt;/P&gt;
&lt;P&gt;Valid starting Expires Service principal&lt;BR /&gt;2022-02-11T01:52:21 2022-02-12T01:52:21 krbtgt/HADOOP.COM@HADOOP.COM&lt;BR /&gt;renew until 2022-02-18T01:52:21&lt;/P&gt;
&lt;P&gt;[root@cdp1 ~]# hdfs dfs -ls /&lt;BR /&gt;22/02/11 01:53:31 DEBUG util.Shell: setsid exited with exit code 0&lt;BR /&gt;22/02/11 01:53:31 DEBUG conf.Configuration: parsing URL jar:file:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/hadoop-common-3.1.1.7.1.7.0-551.jar!/core-default.xml&lt;BR /&gt;22/02/11 01:53:31 DEBUG conf.Configuration: parsing input stream sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@4678c730&lt;BR /&gt;22/02/11 01:53:32 DEBUG conf.Configuration: parsing URL file:/etc/hadoop/conf.cloudera.yarn/core-site.xml&lt;BR /&gt;22/02/11 01:53:32 DEBUG conf.Configuration: parsing input stream java.io.BufferedInputStream@369f73a2&lt;BR /&gt;22/02/11 01:53:32 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])&lt;BR /&gt;22/02/11 01:53:32 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])&lt;BR /&gt;22/02/11 01:53:32 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[GetGroups])&lt;BR /&gt;22/02/11 01:53:32 DEBUG lib.MutableMetricsFactory: field private org.apache.hadoop.metrics2.lib.MutableGaugeLong org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailuresTotal with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Renewal failures since startup])&lt;BR /&gt;22/02/11 01:53:32 DEBUG lib.MutableMetricsFactory: field private org.apache.hadoop.metrics2.lib.MutableGaugeInt org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailures with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Renewal failures since last successful login])&lt;BR /&gt;22/02/11 01:53:32 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics&lt;BR /&gt;22/02/11 01:53:32 DEBUG security.SecurityUtil: Setting hadoop.security.token.service.use_ip to true&lt;BR /&gt;22/02/11 01:53:32 DEBUG security.Groups: Creating new Groups object&lt;BR /&gt;22/02/11 01:53:32 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping; cacheTimeout=300000; warningDeltaMs=5000&lt;BR /&gt;22/02/11 01:53:32 DEBUG security.UserGroupInformation: hadoop login&lt;BR /&gt;22/02/11 01:53:32 DEBUG security.UserGroupInformation: hadoop login commit&lt;BR /&gt;22/02/11 01:53:32 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: root&lt;BR /&gt;22/02/11 01:53:32 DEBUG security.UserGroupInformation: Using user: "UnixPrincipal: root" 
with name root&lt;BR /&gt;22/02/11 01:53:32 DEBUG security.UserGroupInformation: User entry: "root"&lt;BR /&gt;22/02/11 01:53:32 DEBUG security.UserGroupInformation: UGI loginUser:root (auth:SIMPLE)&lt;BR /&gt;22/02/11 01:53:32 DEBUG fs.FileSystem: Loading filesystems&lt;BR /&gt;22/02/11 01:53:32 DEBUG gcs.GoogleHadoopFileSystemBase: GHFS version: 2.1.2.7.1.7.0-551&lt;BR /&gt;22/02/11 01:53:32 DEBUG fs.FileSystem: gs:// = class com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem from /opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/jars/gcs-connector-2.1.2.7.1.7.0-551-shaded.jar&lt;BR /&gt;22/02/11 01:53:32 DEBUG fs.FileSystem: s3n:// = class org.apache.hadoop.fs.s3native.NativeS3FileSystem from /opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/hadoop-aws-3.1.1.7.1.7.0-551.jar&lt;BR /&gt;22/02/11 01:53:32 DEBUG fs.FileSystem: file:// = class org.apache.hadoop.fs.LocalFileSystem from /opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/hadoop-common-3.1.1.7.1.7.0-551.jar&lt;BR /&gt;22/02/11 01:53:32 DEBUG fs.FileSystem: viewfs:// = class org.apache.hadoop.fs.viewfs.ViewFileSystem from /opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/hadoop-common-3.1.1.7.1.7.0-551.jar&lt;BR /&gt;22/02/11 01:53:32 DEBUG fs.FileSystem: har:// = class org.apache.hadoop.fs.HarFileSystem from /opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/hadoop-common-3.1.1.7.1.7.0-551.jar&lt;BR /&gt;22/02/11 01:53:32 DEBUG fs.FileSystem: http:// = class org.apache.hadoop.fs.http.HttpFileSystem from /opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/hadoop-common-3.1.1.7.1.7.0-551.jar&lt;BR /&gt;22/02/11 01:53:32 DEBUG fs.FileSystem: https:// = class org.apache.hadoop.fs.http.HttpsFileSystem from /opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/hadoop-common-3.1.1.7.1.7.0-551.jar&lt;BR /&gt;22/02/11 01:53:32 DEBUG fs.FileSystem: o3fs:// = class org.apache.hadoop.fs.ozone.OzoneFileSystem from 
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/hadoop-ozone-filesystem-hadoop3-1.1.0.7.1.7.0-551.jar&lt;BR /&gt;22/02/11 01:53:32 DEBUG fs.FileSystem: ofs:// = class org.apache.hadoop.fs.ozone.RootedOzoneFileSystem from /opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/hadoop-ozone-filesystem-hadoop3-1.1.0.7.1.7.0-551.jar&lt;BR /&gt;22/02/11 01:53:32 DEBUG fs.FileSystem: hdfs:// = class org.apache.hadoop.hdfs.DistributedFileSystem from /opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/jars/hadoop-hdfs-client-3.1.1.7.1.7.0-551.jar&lt;BR /&gt;22/02/11 01:53:32 DEBUG fs.FileSystem: webhdfs:// = class org.apache.hadoop.hdfs.web.WebHdfsFileSystem from /opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/jars/hadoop-hdfs-client-3.1.1.7.1.7.0-551.jar&lt;BR /&gt;22/02/11 01:53:32 DEBUG fs.FileSystem: swebhdfs:// = class org.apache.hadoop.hdfs.web.SWebHdfsFileSystem from /opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/jars/hadoop-hdfs-client-3.1.1.7.1.7.0-551.jar&lt;BR /&gt;22/02/11 01:53:32 DEBUG fs.FileSystem: Looking for FS supporting hdfs&lt;BR /&gt;22/02/11 01:53:32 DEBUG fs.FileSystem: looking for configuration option fs.hdfs.impl&lt;BR /&gt;22/02/11 01:53:32 DEBUG fs.FileSystem: Looking in service filesystems for implementation class&lt;BR /&gt;22/02/11 01:53:32 DEBUG fs.FileSystem: FS for hdfs is class org.apache.hadoop.hdfs.DistributedFileSystem&lt;BR /&gt;22/02/11 01:53:32 DEBUG impl.DfsClientConf: dfs.client.use.legacy.blockreader.local = false&lt;BR /&gt;22/02/11 01:53:32 DEBUG impl.DfsClientConf: dfs.client.read.shortcircuit = true&lt;BR /&gt;22/02/11 01:53:32 DEBUG impl.DfsClientConf: dfs.client.domain.socket.data.traffic = false&lt;BR /&gt;22/02/11 01:53:32 DEBUG impl.DfsClientConf: dfs.domain.socket.path = /var/run/hdfs-sockets/dn&lt;BR /&gt;22/02/11 01:53:32 DEBUG hdfs.DFSClient: Sets dfs.client.block.write.replace-datanode-on-failure.min-replication to 0&lt;BR /&gt;22/02/11 01:53:32 DEBUG retry.RetryUtils: 
multipleLinearRandomRetry = null&lt;BR /&gt;22/02/11 01:53:32 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcProtobufRequest, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@63a12c68&lt;BR /&gt;22/02/11 01:53:32 DEBUG ipc.Client: getting client out of cache: Client-25ed7ec2740446f896ca1edf51121c24&lt;BR /&gt;22/02/11 01:53:32 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...&lt;BR /&gt;22/02/11 01:53:32 DEBUG util.NativeCodeLoader: Loaded the native-hadoop library&lt;BR /&gt;22/02/11 01:53:32 DEBUG unix.DomainSocketWatcher: org.apache.hadoop.net.unix.DomainSocketWatcher$2@22cadff5: starting with interruptCheckPeriodMs = 60000&lt;BR /&gt;22/02/11 01:53:32 DEBUG shortcircuit.DomainSocketFactory: The short-circuit local reads feature is enabled.&lt;BR /&gt;22/02/11 01:53:32 DEBUG sasl.DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection&lt;BR /&gt;22/02/11 01:53:32 DEBUG fs.Globber: Created Globber for path=/, symlinks=true&lt;BR /&gt;22/02/11 01:53:32 DEBUG fs.Globber: Starting: glob /&lt;BR /&gt;22/02/11 01:53:32 DEBUG fs.Globber: Filesystem glob /&lt;BR /&gt;22/02/11 01:53:32 DEBUG fs.Globber: Pattern: /&lt;BR /&gt;22/02/11 01:53:32 DEBUG ipc.Client: The ping interval is 60000 ms.&lt;BR /&gt;22/02/11 01:53:32 DEBUG ipc.Client: Connecting to cdp1.localdomain/192.168.159.20:8020&lt;BR /&gt;22/02/11 01:53:32 DEBUG ipc.Client: Setup connection to cdp1.localdomain/192.168.159.20:8020&lt;BR /&gt;22/02/11 01:53:32 DEBUG security.UserGroupInformation: PrivilegedAction as:root (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:818)&lt;BR /&gt;22/02/11 01:53:33 DEBUG security.SaslRpcClient: Sending sasl message state: NEGOTIATE&lt;/P&gt;
&lt;P&gt;22/02/11 01:53:33 DEBUG security.SaslRpcClient: Get token info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.token.TokenInfo(value=class org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenSelector)&lt;BR /&gt;22/02/11 01:53:33 DEBUG security.SaslRpcClient: tokens aren't supported for this protocol or user doesn't have one&lt;BR /&gt;22/02/11 01:53:33 DEBUG security.SaslRpcClient: client isn't using kerberos&lt;BR /&gt;22/02/11 01:53:33 DEBUG security.UserGroupInformation: PrivilegedActionException as:root (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]&lt;BR /&gt;22/02/11 01:53:33 DEBUG security.UserGroupInformation: PrivilegedAction as:root (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:741)&lt;BR /&gt;22/02/11 01:53:33 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]&lt;BR /&gt;22/02/11 01:53:33 DEBUG security.UserGroupInformation: PrivilegedActionException as:root (auth:SIMPLE) cause:java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]&lt;BR /&gt;22/02/11 01:53:33 DEBUG ipc.Client: closing ipc connection to cdp1.localdomain/192.168.159.20:8020: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]&lt;BR /&gt;java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]&lt;BR /&gt;at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:778)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.security.auth.Subject.doAs(Subject.java:422)&lt;BR /&gt;at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1898)&lt;BR /&gt;at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:741)&lt;BR /&gt;at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:835)&lt;BR /&gt;at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:413)&lt;BR /&gt;at org.apache.hadoop.ipc.Client.getConnection(Client.java:1636)&lt;BR /&gt;at org.apache.hadoop.ipc.Client.call(Client.java:1452)&lt;BR /&gt;at org.apache.hadoop.ipc.Client.call(Client.java:1405)&lt;BR /&gt;at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)&lt;BR /&gt;at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)&lt;BR /&gt;at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)&lt;BR /&gt;at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:957)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:498)&lt;BR /&gt;at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:431)&lt;BR /&gt;at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:166)&lt;BR /&gt;at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:158)&lt;BR /&gt;at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:96)&lt;BR /&gt;at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:362)&lt;BR /&gt;at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)&lt;BR /&gt;at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1693)&lt;BR /&gt;at 
org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1745)&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1742)&lt;BR /&gt;at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1757)&lt;BR /&gt;at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:115)&lt;BR /&gt;at org.apache.hadoop.fs.Globber.doGlob(Globber.java:362)&lt;BR /&gt;at org.apache.hadoop.fs.Globber.glob(Globber.java:202)&lt;BR /&gt;at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:2103)&lt;BR /&gt;at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:353)&lt;BR /&gt;at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:250)&lt;BR /&gt;at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:233)&lt;BR /&gt;at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:104)&lt;BR /&gt;at org.apache.hadoop.fs.shell.Command.run(Command.java:177)&lt;BR /&gt;at org.apache.hadoop.fs.FsShell.run(FsShell.java:328)&lt;BR /&gt;at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)&lt;BR /&gt;at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)&lt;BR /&gt;at org.apache.hadoop.fs.FsShell.main(FsShell.java:391)&lt;BR /&gt;Caused by: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]&lt;BR /&gt;at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:173)&lt;BR /&gt;at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:390)&lt;BR /&gt;at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:622)&lt;BR /&gt;at org.apache.hadoop.ipc.Client$Connection.access$2300(Client.java:413)&lt;BR /&gt;at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:822)&lt;BR /&gt;at 
org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:818)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.security.auth.Subject.doAs(Subject.java:422)&lt;BR /&gt;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1898)&lt;BR /&gt;at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:818)&lt;BR /&gt;... 36 more&lt;BR /&gt;22/02/11 01:53:33 DEBUG ipc.Client: IPC Client (169663597) connection to cdp1.localdomain/192.168.159.20:8020 from root: closed&lt;BR /&gt;22/02/11 01:53:33 DEBUG retry.RetryInvocationHandler: Exception while invoking call #0 ClientNamenodeProtocolTranslatorPB.getFileInfo over null. Not retrying because try once and fail.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I don't understand why it is still the root user here:&lt;/P&gt;
&lt;P&gt;22/02/11 01:53:32 DEBUG security.UserGroupInformation: hadoop login&lt;BR /&gt;22/02/11 01:53:32 DEBUG security.UserGroupInformation: hadoop login commit&lt;BR /&gt;22/02/11 01:53:32 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: root&lt;BR /&gt;22/02/11 01:53:32 DEBUG security.UserGroupInformation: Using user: "UnixPrincipal: root" with name root&lt;BR /&gt;22/02/11 01:53:32 DEBUG security.UserGroupInformation: User entry: "root"&lt;BR /&gt;22/02/11 01:53:32 DEBUG security.UserGroupInformation: UGI loginUser:root (auth:SIMPLE)&lt;/P&gt;</description>
      <pubDate>Tue, 21 Apr 2026 08:01:15 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/After-Kerberos-is-enabled-HDFS-authentication-still-cannot/m-p/336045#M232115</guid>
      <dc:creator>Mayarnzero</dc:creator>
      <dc:date>2026-04-21T08:01:15Z</dc:date>
    </item>
    <item>
      <title>Re: After Kerberos is enabled, HDFS authentication still cannot use any commands  like  hdfs dfs  -ls  /</title>
      <link>https://community.cloudera.com/t5/Support-Questions/After-Kerberos-is-enabled-HDFS-authentication-still-cannot/m-p/336090#M232131</link>
      <description>&lt;P&gt;This looks like your HDFS service is misconfigured.&lt;/P&gt;&lt;P&gt;Are you using CDP or open-source HDFS?&lt;/P&gt;&lt;P&gt;Could you please share your HDFS configuration, specifically the properties that you set to enable Kerberos?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;André&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 11 Feb 2022 12:32:18 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/After-Kerberos-is-enabled-HDFS-authentication-still-cannot/m-p/336090#M232131</guid>
      <dc:creator>araujo</dc:creator>
      <dc:date>2022-02-11T12:32:18Z</dc:date>
    </item>
    <item>
      <title>Re: After Kerberos is enabled, HDFS authentication still cannot use any commands  like  hdfs dfs  -ls  /</title>
      <link>https://community.cloudera.com/t5/Support-Questions/After-Kerberos-is-enabled-HDFS-authentication-still-cannot/m-p/336149#M232155</link>
      <description>&lt;P&gt;Thank you. This is CDP 7.1.7 with CM 7.4.4, and the OS is RedHat 8.2.&lt;/P&gt;&lt;P&gt;Enabling Kerberos went very smoothly, and I can authenticate, but the commands still do not work.&lt;/P&gt;&lt;P&gt;Moreover, it is not only HDFS; Hive and other components fail as well.&lt;/P&gt;&lt;P&gt;I believe the Kerberos configuration of the HDFS components is handled by CDP, and I can see in the CM web interface that the relevant configuration of these components has been changed.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The following is /etc/krb5.conf:&lt;/P&gt;&lt;P&gt;[logging]&lt;/P&gt;&lt;P&gt;default = FILE:/var/log/krb5libs.log&lt;/P&gt;&lt;P&gt;kdc = FILE:/var/log/krb5kdc.log&lt;/P&gt;&lt;P&gt;admin_server = FILE:/var/log/kadmind.log&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;[libdefaults]&lt;/P&gt;&lt;P&gt;default_realm = HADOOP.COM&lt;/P&gt;&lt;P&gt;ticket_lifetime = 24h&lt;/P&gt;&lt;P&gt;renew_lifetime = 7d&lt;/P&gt;&lt;P&gt;forwardable = true&lt;/P&gt;&lt;P&gt;renewable = true&lt;/P&gt;&lt;P&gt;rdns = false&lt;/P&gt;&lt;P&gt;udp_prefrence_limit=0&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;[realms]&lt;/P&gt;&lt;P&gt;HADOOP.COM = {&lt;/P&gt;&lt;P&gt;kdc = cdp1.localdomain&lt;/P&gt;&lt;P&gt;admin_server = cdp1.localdomain&lt;/P&gt;&lt;P&gt;}&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;[domain_realm]&lt;/P&gt;&lt;P&gt;.hadoop.com = HADOOP.COM&lt;/P&gt;&lt;P&gt;hadoop.com = HADOOP.COM&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The following is /etc/hadoop/conf/hdfs-site.xml:&lt;BR /&gt;&amp;lt;property&amp;gt;&lt;BR /&gt;&amp;lt;name&amp;gt;dfs.namenode.kerberos.principal&amp;lt;/name&amp;gt;&lt;BR /&gt;&amp;lt;value&amp;gt;hdfs/_HOST@HADOOP.COM&amp;lt;/value&amp;gt;&lt;BR /&gt;&amp;lt;/property&amp;gt;&lt;BR /&gt;&amp;lt;property&amp;gt;&lt;BR /&gt;&amp;lt;name&amp;gt;dfs.namenode.kerberos.internal.spnego.principal&amp;lt;/name&amp;gt;&lt;BR /&gt;&amp;lt;value&amp;gt;HTTP/_HOST@HADOOP.COM&amp;lt;/value&amp;gt;&lt;BR /&gt;&amp;lt;/property&amp;gt;&lt;BR /&gt;&amp;lt;property&amp;gt;&lt;BR /&gt;&amp;lt;name&amp;gt;dfs.datanode.kerberos.principal&amp;lt;/name&amp;gt;&lt;BR /&gt;&amp;lt;value&amp;gt;hdfs/_HOST@HADOOP.COM&amp;lt;/value&amp;gt;&lt;BR /&gt;&amp;lt;/property&amp;gt;&lt;/P&gt;</description>
      <pubDate>Sat, 12 Feb 2022 04:25:34 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/After-Kerberos-is-enabled-HDFS-authentication-still-cannot/m-p/336149#M232155</guid>
      <dc:creator>Mayarnzero</dc:creator>
      <dc:date>2022-02-12T04:25:34Z</dc:date>
    </item>
    <item>
      <title>Re: After Kerberos is enabled, HDFS authentication still cannot use any commands  like  hdfs dfs  -ls  /</title>
      <link>https://community.cloudera.com/t5/Support-Questions/After-Kerberos-is-enabled-HDFS-authentication-still-cannot/m-p/336155#M232156</link>
      <description>&lt;P&gt;Try changing&amp;nbsp;&lt;SPAN&gt;udp_preference_limit to 1 in the krb5.conf file on all the hosts and restart your cluster.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Also note that you have a typo in that parameter's name: the correct name is&amp;nbsp;udp_preference_limit, not&amp;nbsp;udp_prefrence_limit.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Sat, 12 Feb 2022 13:06:09 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/After-Kerberos-is-enabled-HDFS-authentication-still-cannot/m-p/336155#M232156</guid>
      <dc:creator>araujo</dc:creator>
      <dc:date>2022-02-12T13:06:09Z</dc:date>
    </item>
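    <!-- Editor's note: a minimal sketch of the krb5.conf change suggested in the reply above. The surrounding lines are this user's own [libdefaults] section; only the udp_preference_limit line is the suggested change, applied to /etc/krb5.conf on every host, followed by a cluster restart. -->

```ini
[libdefaults]
default_realm = HADOOP.COM
# corrected spelling (was "udp_prefrence_limit"); a value of 1 makes the
# Kerberos libraries prefer TCP over UDP for all traffic to the KDC
udp_preference_limit = 1
```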
    <item>
      <title>Re: After Kerberos is enabled, HDFS authentication still cannot use any commands  like  hdfs dfs  -ls  /</title>
      <link>https://community.cloudera.com/t5/Support-Questions/After-Kerberos-is-enabled-HDFS-authentication-still-cannot/m-p/336157#M232158</link>
      <description>&lt;P&gt;Thanks.&lt;/P&gt;&lt;P&gt;I made these changes, but that does not seem to be the cause.&lt;/P&gt;&lt;P&gt;I'm new to CDP, so maybe I made a more obvious mistake. The obvious error is that the user loaded here is wrong, but I don't understand why: it should be 'hdfs', not 'root'.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;22/02/11 01:53:32 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: root&lt;BR /&gt;22/02/11 01:53:32 DEBUG security.UserGroupInformation: Using user: "UnixPrincipal: root" with name root&lt;BR /&gt;22/02/11 01:53:32 DEBUG security.UserGroupInformation: User entry: "root"&lt;/P&gt;&lt;P&gt;22/02/11 01:53:32 DEBUG security.UserGroupInformation: UGI loginUser:root (auth:SIMPLE)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;If this does point to a component configuration problem: in fact, when Kerberos is enabled through CDP, the configuration changes to the components are transparent to me, so unless there is a special need it seems I should not have to touch them.&lt;/P&gt;</description>
      <pubDate>Sat, 12 Feb 2022 14:39:09 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/After-Kerberos-is-enabled-HDFS-authentication-still-cannot/m-p/336157#M232158</guid>
      <dc:creator>Mayarnzero</dc:creator>
      <dc:date>2022-02-12T14:39:09Z</dc:date>
    </item>
    <item>
      <title>Re: After Kerberos is enabled, HDFS authentication still cannot use any commands  like  hdfs dfs  -ls  /</title>
      <link>https://community.cloudera.com/t5/Support-Questions/After-Kerberos-is-enabled-HDFS-authentication-still-cannot/m-p/336159#M232160</link>
      <description>&lt;P&gt;Which steps did you take to enable Kerberos? Did you use the wizard in Cloudera Manager?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;How many nodes does your cluster have?&lt;/P&gt;&lt;P&gt;Which node are you running these commands from? Have you tried from other nodes (e.g., the NameNode host)?&lt;/P&gt;</description>
      <pubDate>Sat, 12 Feb 2022 19:03:38 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/After-Kerberos-is-enabled-HDFS-authentication-still-cannot/m-p/336159#M232160</guid>
      <dc:creator>araujo</dc:creator>
      <dc:date>2022-02-12T19:03:38Z</dc:date>
    </item>
    <item>
      <title>Re: After Kerberos is enabled, HDFS authentication still cannot use any commands  like  hdfs dfs  -ls  /</title>
      <link>https://community.cloudera.com/t5/Support-Questions/After-Kerberos-is-enabled-HDFS-authentication-still-cannot/m-p/336167#M232167</link>
      <description>&lt;P&gt;Yes, I used the Cloudera Manager wizard, and the command was run on the NameNode. I have also tried it on other nodes, and the same problem occurs.&lt;/P&gt;&lt;P&gt;My test cluster has only four nodes. I also have a CDP 7.1.5 cluster on CentOS 7.9, which did not hit this problem when Kerberos was enabled.&lt;/P&gt;&lt;P&gt;I will try reinstalling CDP 7.1.5 on RedHat 8 to see whether the problem occurs there. Thank you for your help.&lt;/P&gt;</description>
      <pubDate>Sun, 13 Feb 2022 05:19:58 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/After-Kerberos-is-enabled-HDFS-authentication-still-cannot/m-p/336167#M232167</guid>
      <dc:creator>Mayarnzero</dc:creator>
      <dc:date>2022-02-13T05:19:58Z</dc:date>
    </item>
    <item>
      <title>Re: After Kerberos is enabled, HDFS authentication still cannot use any commands  like  hdfs dfs  -ls  /</title>
      <link>https://community.cloudera.com/t5/Support-Questions/After-Kerberos-is-enabled-HDFS-authentication-still-cannot/m-p/349594#M235697</link>
      <description>&lt;P&gt;&lt;SPAN&gt;This reply might be late, but KCM- and keyring-based Kerberos credential caches are not supported by Hadoop.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;# klist&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Ticket cache: KCM:0:86966&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 05 Aug 2022 05:21:47 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/After-Kerberos-is-enabled-HDFS-authentication-still-cannot/m-p/349594#M235697</guid>
      <dc:creator>npdell</dc:creator>
      <dc:date>2022-08-05T05:21:47Z</dc:date>
    </item>
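    <!-- Editor's note: the reply above implies a workaround that the thread never spells out. A sketch, assuming the host (like RHEL 8 with sssd-kcm) defaults to a KCM ticket cache; the FILE path used here is the conventional per-uid one, not something mandated by Hadoop. -->

```shell
# Hadoop's Kerberos login code can only read FILE-based ticket caches, so a
# "Ticket cache: KCM:..." line from klist explains why the client falls back
# to SIMPLE auth as root. One session-scoped workaround (no edit to
# /etc/krb5.conf needed) is to point KRB5CCNAME at a FILE cache:
export KRB5CCNAME="FILE:/tmp/krb5cc_$(id -u)"
echo "$KRB5CCNAME"
# then re-authenticate so the new ticket lands in the FILE cache:
#   kdestroy; kinit hdfs; klist   # klist should now show "Ticket cache: FILE:..."
```

    <!-- A permanent alternative is setting default_ccache_name = FILE:/tmp/krb5cc_%{uid} in the [libdefaults] section of /etc/krb5.conf on all hosts (on RHEL 8 the KCM default typically comes from a drop-in under /etc/krb5.conf.d/). -->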
    <item>
      <title>Re: After Kerberos is enabled, HDFS authentication still cannot use any commands  like  hdfs dfs  -ls  /</title>
      <link>https://community.cloudera.com/t5/Support-Questions/After-Kerberos-is-enabled-HDFS-authentication-still-cannot/m-p/386671#M246110</link>
      <description>&lt;P&gt;Is it solved?&lt;/P&gt;</description>
      <pubDate>Wed, 17 Apr 2024 03:01:41 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/After-Kerberos-is-enabled-HDFS-authentication-still-cannot/m-p/386671#M246110</guid>
      <dc:creator>lslzz</dc:creator>
      <dc:date>2024-04-17T03:01:41Z</dc:date>
    </item>
  </channel>
</rss>

