Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant. To ask a new question, please post a new topic on the appropriate active board.

Datanode fails to start from command line but starts fine from ambari


I'm starting the DataNode using

/usr/hdp/2.4.2.0-258/hadoop/sbin/hadoop-daemon.sh start datanode

I suspect an environment variable is involved, so I also tried sourcing hadoop-env.sh:

. /etc/hadoop/conf/hadoop-env.sh

Here is the complete error I'm getting:

2016-07-11 18:21:06,436 ERROR datanode.DataNode (DataNode.java:secureMain(2545)) - Exception in secureMain
java.lang.RuntimeException: Cannot start secure DataNode without configuring either privileged resources or SASL RPC data transfer protection and SSL for HTTP. Using privileged resources in combination with SASL RPC data transfer protection is not supported.
    at org.apache.hadoop.hdfs.server.datanode.DataNode.checkSecureConfig(DataNode.java:1217)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1103)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:432)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2423)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2310)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2357)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2538)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2562)
2016-07-11 18:21:06,438 INFO util.ExitUtil (ExitUtil.java:terminate(124)) - Exiting with status 1
2016-07-11 18:21:06,445 INFO datanode.DataNode (LogAdapter.java:info(47)) - SHUTDOWN_MSG:

1 ACCEPTED SOLUTION

Expert Contributor

@Felix Albani You will need to provide the configuration directory with the --config parameter, just as Ambari does.

E.g.

hadoop-daemon.sh --config /usr/hdp/current/hadoop-client/conf start datanode
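The likely cause is that, without --config, hadoop-daemon.sh falls back to a default configuration directory that lacks the security settings Ambari manages (SASL data-transfer protection or HTTPS), which triggers the "Cannot start secure DataNode" error. A minimal sketch, assuming an HDP install where the managed client config is symlinked at /usr/hdp/current/hadoop-client/conf (adjust for your cluster):

```shell
# Point the daemon script at the Ambari-managed config explicitly:
hadoop-daemon.sh --config /usr/hdp/current/hadoop-client/conf start datanode

# To confirm the managed config actually carries the security settings the
# DataNode requires, query the effective values (hdfs getconf reads whatever
# config dir is on the classpath, e.g. via HADOOP_CONF_DIR):
export HADOOP_CONF_DIR=/usr/hdp/current/hadoop-client/conf
hdfs getconf -confKey dfs.data.transfer.protection
hdfs getconf -confKey dfs.http.policy
```

If the getconf calls return empty or HTTP_ONLY on a Kerberized cluster, the directory you pointed at is not the one Ambari is managing.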


3 REPLIES


@Xiaoyu Yao

Correct, I was missing the config; thanks. I realize I could also have checked the details under Operations Running in Ambari by clicking through for more details.

Thanks!


One more thing I want to add: you need to run this as the Ambari user (root, in my case). When I used the hdfs user I got the same error. I'm sure this has to do with file permissions.
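That matches the usual setup on a Kerberized cluster: the DataNode is typically started as root so it can bind privileged ports, then drops to the secure DataNode user configured in hadoop-env.sh. A sketch of what to check, assuming typical HDP 2.x defaults (paths and users may differ on your cluster):

```shell
# See which unprivileged user the secure DataNode drops to after startup;
# if this is set, the daemon must be launched as root:
grep HADOOP_SECURE_DN_USER /etc/hadoop/conf/hadoop-env.sh

# Start as root (or via sudo), mirroring what Ambari does:
sudo /usr/hdp/2.4.2.0-258/hadoop/sbin/hadoop-daemon.sh \
    --config /usr/hdp/current/hadoop-client/conf start datanode

# If it still fails, check ownership of the pid/log dirs the daemon writes to
# (these are the typical HDP locations, adjust as needed):
ls -ld /var/run/hadoop/hdfs /var/log/hadoop/hdfs
```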