Support Questions

Live datanodes not showing on NameNode UI after enabling Kerberos in Ambari

Datanodes are not able to communicate with the NameNode after enabling Kerberos through the Ambari UI.

I have checked that the clusterID of all datanodes and the NameNode is identical.
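For reference, the clusterID comparison reads the current/VERSION file under each node's storage directory. The paths in the example usage are the usual Ambari defaults and are assumptions; substitute the actual values of dfs.namenode.name.dir and dfs.datanode.data.dir from hdfs-site.xml.

```shell
# check_cluster_id: print the clusterID line from a storage
# directory's current/VERSION file.
check_cluster_id() {
  grep '^clusterID=' "$1/current/VERSION"
}

# Example usage (paths are assumed defaults):
#   check_cluster_id /hadoop/hdfs/namenode   # on the NameNode host
#   check_cluster_id /hadoop/hdfs/data      # on each datanode host
```

The printed CID values should match exactly across all nodes.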

Datanode logs:

2018-12-05 16:22:07,147 WARN datanode.DataNode (BPOfferService.java:getBlockPoolId(213)) - Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:213)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:224)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3095)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:71)
    at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:275)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:338)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:316)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:210)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:848)
    at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1772)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:644)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1604)
    at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
    at org.eclipse.jetty.server.Server.handle(Server.java:534)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:320)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:108)
    at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)
    at java.lang.Thread.run(Thread.java:748)

NameNode logs:

construction: 0
2018-12-05 15:33:57,348 INFO block.BlockTokenSecretManager (BlockTokenSecretManager.java:updateKeys(240)) - Updating block keys
2018-12-05 15:33:57,351 INFO hdfs.StateChange (BlockManagerSafeMode.java:reportStatus(602)) - STATE* Safe mode ON. The reported blocks 0 needs additional 3979 blocks to reach the threshold 1.0000 of total blocks 3979. The number of live datanodes 0 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.
2018-12-05 15:33:57,386 INFO ipc.Server (Server.java:run(1314)) - IPC Server Responder: starting
2018-12-05 15:33:57,386 INFO ipc.Server (Server.java:run(1153)) - IPC Server listener on 8020: starting
2018-12-05 15:33:57,436 INFO ipc.Server (Server.java:doRead(1256)) - Socket Reader #1 for port 8020: readAndProcess from client 10.13.10.23:34368 threw exception [org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]]
org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
    at org.apache.hadoop.ipc.Server$Connection.initializeAuthContext(Server.java:2136)
    at org.apache.hadoop.ipc.Server$Connection.readAndProcess(Server.java:2085)
    at org.apache.hadoop.ipc.Server$Listener.doRead(Server.java:1249)
    at org.apache.hadoop.ipc.Server$Listener$Reader.doRunLoop(Server.java:1105)
    at org.apache.hadoop.ipc.Server$Listener$Reader.run(Server.java:1076)
2018-12-05 15:33:57,466 INFO ipc.Server (Server.java:doRead(1256)) - Socket Reader #1 for port 8020: readAndProcess from client 10.13.10.23:34376 threw exception [org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]]
org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
    at org.apache.hadoop.ipc.Server$Connection.initializeAuthContext(Server.java:2136)
    at org.apache.hadoop.ipc.Server$Connection.readAndProcess(Server.java:2085)
    at org.apache.hadoop.ipc.Server$Listener.doRead(Server.java:1249)
    at org.apache.hadoop.ipc.Server$Listener$Reader.doRunLoop(Server.java:1105)
    at org.apache.hadoop.ipc.Server$Listener$Reader.run(Server.java:1076)
2018-12-05 15:33:57,481 INFO namenode.NameNode (NameNode.java:startCommonServices(812)) - NameNode RPC up at: ubuntu19.mcloud.com/10.13.10.19:8020
2018-12-05 15:33:57,483 INFO namenode.FSNamesystem (FSNamesystem.java:startActiveServices(1207)) - Starting services required for active state
2018-12-05 15:33:57,483 INFO namenode.FSDirectory (FSDirectory.java:updateCountForQuota(767)) - Initializing quota with 4 thread(s)
2018-12-05 15:33:57,693 INFO namenode.FSDirectory (FSDirectory.java:updateCountForQuota(776)) - Quota initialization completed in 208 milliseconds name space=36151 storage space=161822274807 storage types=RAM_DISK=0, SSD=0, DISK=3874952379, ARCHIVE=0, PROVIDED=0
2018-12-05 15:33:57,701 INFO blockmanagement.CacheReplicationMonitor (CacheReplicationMonitor.java:run(160)) - Starting CacheReplicationMonitor with interval 30000 milliseconds
2018-12-05 15:33:57,733 INFO ipc.Server (Server.java:authorizeConnection(2562)) - Connection from 10.13.10.22:36473 for protocol org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol is unauthorized for user dn/ubuntu22.mcloud.com@MCLOUD.COM (auth:KERBEROS)
2018-12-05 15:33:57,738 INFO ipc.Server (Server.java:authorizeConnection(2562)) - Connection from 10.13.10.24:46794 for protocol org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol is unauthorized for user dn/ubuntu24.mcloud.com@MCLOUD.COM (auth:KERBEROS)
2018-12-05 15:33:57,739 INFO ipc.Server (Server.java:authorizeConnection(2562)) - Connection from 10.13.10.21:33234 for protocol org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol is unauthorized for user dn/ubuntu21.mcloud.com@MCLOUD.COM (auth:KERBEROS)
2018-12-05 15:33:57,800 INFO ipc.Server (Server.java:authorizeConnection(2562)) - Connection from 10.13.10.20:33021 for protocol org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol is unauthorized for user dn/ubuntu20.mcloud.com@MCLOUD.COM (auth:KERBEROS)
2018-12-05 15:33:57,976 INFO ipc.Server (Server.java:logException(2726)) - IPC Server handler 0 on 8020, call Call#0 Retry#0 org.apache.hadoop.hdfs.protocol.ClientProtocol.setSafeMode from 10.13.10.19:51576
org.apache.hadoop.ipc.RetriableException: NameNode still not started
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.checkNNStartup(NameNodeRpcServer.java:2210)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.setSafeMode(NameNodeRpcServer.java:1223)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.setSafeMode(ClientNamenodeProtocolServerSideTranslatorPB.java:846)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:524)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1025)
    at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:876)
    at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:822)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2682)
2018-12-05 15:33:58,127 INFO ipc.Server (Server.java:doRead(1256)) - Socket Reader #1 for port 8020: readAndProcess from client 10.13.10.23:34380 threw exception [org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]]
org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
    at org.apache.hadoop.ipc.Server$Connection.initializeAuthContext(Server.java:2136)
    at org.apache.hadoop.ipc.Server$Connection.readAndProcess(Server.java:2085)
    at org.apache.hadoop.ipc.Server$Listener.doRead(Server.java:1249)
    at org.apache.hadoop.ipc.Server$Listener$Reader.doRunLoop(Server.java:1105)
    at org.apache.hadoop.ipc.Server$Listener$Reader.run(Server.java:1076)
2018-12-05 15:33:58,171 INFO ipc.Server (Server.java:doRead(1256)) - Socket Reader #1 for port 8020: readAndProcess from client 10.13.10.23:34382 threw exception [org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]]
org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
    at org.apache.hadoop.ipc.Server$Connection.initializeAuthContext(Server.java:2136)
    at org.apache.hadoop.ipc.Server$Connection.readAndProcess(Server.java:2085)
    at org.apache.hadoop.ipc.Server$Listener.doRead(Server.java:1249)
    at org.apache.hadoop.ipc.Server$Listener$Reader.doRunLoop(Server.java:1105)
    at org.apache.hadoop.ipc.Server$Listener$Reader.run(Server.java:1076)
2018-12-05 15:33:58,375 INFO fs.TrashPolicyDefault (TrashPolicyDefault.java:<init>(228)) - The configured checkpoint interval is 0 minutes. Using an interval of 360 minutes that is used for deletion instead
2018-12-05 15:33:58,375 INFO fs.TrashPolicyDefault (TrashPolicyDefault.java:<init>(235)) - Namenode trash configuration: Deletion interval = 360 minutes, Emptier interval = 0 minutes.
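The "is unauthorized for user dn/... (auth:KERBEROS)" lines indicate the Kerberos handshake itself succeeded and that it is the service-level authorization step rejecting the mapped local user. A minimal sketch of the hadoop.security.auth_to_local-style mapping Ambari normally generates for the dn service principal; the specific rule (RULE:[2:$1@$0](dn@MCLOUD.COM)s/.*/hdfs/) and the hdfs target user are assumptions based on typical Ambari output, not taken from this cluster's core-site.xml:

```shell
# Sketch of how an auth_to_local rule would map the datanode service
# principal to a local user (rule is a hypothetical Ambari default;
# verify the actual rules in core-site.xml on the NameNode host).
map_principal() {
  case "$1" in
    dn/*@MCLOUD.COM) echo hdfs ;;        # dn service principal -> hdfs
    *) echo "$1" | sed 's|[/@].*||' ;;   # default rule: strip host and realm
  esac
}

map_principal 'dn/ubuntu22.mcloud.com@MCLOUD.COM'   # prints: hdfs
```

If the principal does not map to a user permitted by security.datanode.protocol.acl in hadoop-policy.xml, the NameNode logs exactly this kind of "unauthorized" message, so comparing the auth_to_local rules actually deployed on the NameNode host against what Ambari generated may be one avenue to check.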
