
Can't start DataNode: Running in secure mode, but config doesn't have a keytab

New Contributor

I can't start the DataNode. The log shows the following fatal exception:

-----------------------------------------------------------------------------------------------------------------------------------

STARTUP_MSG: java = 1.8.0_312
************************************************************/
2022-05-03 08:42:09,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2022-05-03 08:42:09,809 FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.io.IOException: Running in secure mode, but config doesn't have a keytab
at org.apache.hadoop.security.SecurityUtil.login(SecurityUtil.java:239)
at org.apache.hadoop.security.SecurityUtil.login(SecurityUtil.java:210)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2259)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2308)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2485)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2509)
2022-05-03 08:42:09,812 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
2022-05-03 08:42:09,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at dn1.pfe-master/192.168.198.160

3 REPLIES

Super Collaborator

1. Check whether you can kinit with the HDFS keytab on the DataNode host (see the example commands after step 3).

2. Check that the permissions are correct on the keytab and the process directory.

3. If this is CDH, try a hard restart of the cloudera-scm-agent. (This requires all CM-managed processes on the host to be stopped.)

service cloudera-scm-agent hard_restart_confirmed
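
A minimal sketch of steps 1 and 2, assuming a keytab path and principal that follow the pattern in your logs; substitute the values from your hdfs-site.xml:

# 1. Verify the keytab can actually obtain a ticket on this host
klist -kt /opt/keytabs/dn.service.keytab
kinit -kt /opt/keytabs/dn.service.keytab dn/dn1.pfe-master@ABDOU.COM
klist

# 2. Verify ownership and permissions (the owner should be the user that runs the DataNode)
ls -l /opt/keytabs/dn.service.keytab
chown hdfs:hadoop /opt/keytabs/dn.service.keytab
chmod 400 /opt/keytabs/dn.service.keytab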

 

New Contributor

Hello @rki_,

I changed hdfs-site.xml by adding some Kerberos-related properties, and now I get a new error related to SSL.
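
For reference, the properties I added were the standard DataNode keytab and principal settings. A quick way to confirm what the running configuration actually resolves (the property names are the stock HDFS ones):

hdfs getconf -confKey dfs.datanode.keytab.file
hdfs getconf -confKey dfs.datanode.kerberos.principal

As the log below shows, the Kerberos login itself now succeeds, and startup fails later while loading the SSL truststore.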

 

STARTUP_MSG: build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r baa91f7c6bc9cb92be5982de4719c1c8af91ccff; compiled by 'root' on 2016-08-18T01:41Z
STARTUP_MSG: java = 1.8.0_312
************************************************************/
2022-05-04 16:17:37,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2022-05-04 16:17:38,361 INFO org.apache.hadoop.security.UserGroupInformation: Login successful for user nn/nn.pfe-master@ABDOU.COM using keytab file /opt/keytabs/nn.service.keytab
2022-05-04 16:17:38,510 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2022-05-04 16:17:38,607 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2022-05-04 16:17:38,607 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2022-05-04 16:17:38,611 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2022-05-04 16:17:38,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dn1.pfe-master
2022-05-04 16:17:38,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 0
2022-05-04 16:17:38,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /0.0.0.0:1055
2022-05-04 16:17:38,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwith is 1048576 bytes/s
2022-05-04 16:17:38,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 5
2022-05-04 16:17:38,696 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
2022-05-04 16:17:38,702 INFO org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
2022-05-04 16:17:38,705 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2022-05-04 16:17:38,710 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2022-05-04 16:17:38,712 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context datanode
2022-05-04 16:17:38,713 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
2022-05-04 16:17:38,714 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
2022-05-04 16:17:38,723 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 43917
2022-05-04 16:17:38,723 INFO org.mortbay.log: jetty-6.1.26
2022-05-04 16:17:38,844 INFO org.mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@localhost:43917
2022-05-04 16:17:38,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Shutdown complete.
2022-05-04 16:17:38,907 FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.io.FileNotFoundException: /opt/jks/truststore.jks (No such file or directory)
at java.io.FileInputStream.open0(Native Method)
at java.io.FileInputStream.open(FileInputStream.java:195)
at java.io.FileInputStream.<init>(FileInputStream.java:138)
at org.apache.hadoop.security.ssl.ReloadingX509TrustManager.loadTrustManager(ReloadingX509TrustManager.java:164)
at org.apache.hadoop.security.ssl.ReloadingX509TrustManager.<init>(ReloadingX509TrustManager.java:81)
at org.apache.hadoop.security.ssl.FileBasedKeyStoresFactory.init(FileBasedKeyStoresFactory.java:209)
at org.apache.hadoop.security.ssl.SSLFactory.init(SSLFactory.java:131)
at org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer.<init>(DatanodeHttpServer.java:149)
at org.apache.hadoop.hdfs.server.datanode.DataNode.startInfoServer(DataNode.java:760)
at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1112)
at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:429)
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2374)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2261)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2308)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2485)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2509)
2022-05-04 16:17:38,909 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
2022-05-04 16:17:38,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at dn1.pfe-master/192.168.198.160

 

Super Collaborator

Make sure the truststore.jks file is present at /opt/jks/truststore.jks and has the correct permissions. You can compare it with the one on any working DataNode; an example check is below.
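
For example (the store password is whatever was used when the truststore was created; "changeit" is only the common default, and the source host below is a placeholder):

ls -l /opt/jks/truststore.jks
keytool -list -keystore /opt/jks/truststore.jks -storepass changeit
# if the file is missing, copy it from a working DataNode
scp dn2.pfe-master:/opt/jks/truststore.jks /opt/jks/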