Datanode fails to start from command line but starts fine from ambari
Labels: Apache Hadoop
Created 07-11-2016 10:23 PM
I'm starting the DataNode with:
/usr/hdp/2.4.2.0-258/hadoop/sbin/hadoop-daemon.sh start datanode
I suspect a missing environment variable, so I also tried sourcing the environment file:
. /etc/hadoop/conf/hadoop-env.sh
Here is the complete error I'm getting:
2016-07-11 18:21:06,436 ERROR datanode.DataNode (DataNode.java:secureMain(2545)) - Exception in secureMain
java.lang.RuntimeException: Cannot start secure DataNode without configuring either privileged resources or SASL RPC data transfer protection and SSL for HTTP. Using privileged resources in combination with SASL RPC data transfer protection is not supported.
	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkSecureConfig(DataNode.java:1217)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1103)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:432)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2423)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2310)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2357)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2538)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2562)
2016-07-11 18:21:06,438 INFO util.ExitUtil (ExitUtil.java:terminate(124)) - Exiting with status 1
2016-07-11 18:21:06,445 INFO datanode.DataNode (LogAdapter.java:info(47)) - SHUTDOWN_MSG:
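This error means the DataNode believes Kerberos security is enabled but finds neither privileged ports nor the SASL/SSL alternative configured, which typically happens when the daemon reads the wrong or an incomplete configuration directory. A quick sketch to see which config dir the script would fall back to and which security keys are actually set (paths follow this thread's HDP layout; yours may differ):

```shell
# Config dir hadoop-daemon.sh falls back to when --config is omitted
# (/etc/hadoop/conf is the usual HDP symlink; adjust if yours differs).
CONF_DIR="${HADOOP_CONF_DIR:-/etc/hadoop/conf}"
echo "Using config dir: $CONF_DIR"

# Inspect the security-related keys the error message refers to.
grep -A1 -E "dfs.data.transfer.protection|dfs.http.policy|hadoop.security.authentication" \
    "$CONF_DIR/hdfs-site.xml" "$CONF_DIR/core-site.xml" 2>/dev/null || true
```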
Created 07-11-2016 10:46 PM
@Felix Albani You will need to provide the configuration file location with the --config parameter, as Ambari does. E.g.:
hadoop-daemon.sh --config /usr/hdp/current/hadoop-client/conf start datanode
Created 07-11-2016 11:02 PM
Correct, I was missing the config. I also realize I could have checked under Operations Running in Ambari and clicked through for more details.
Thanks!
Created 07-11-2016 11:06 PM
One more thing I want to add: you need to run this as the user Ambari uses, root in my case. When I ran it as the hdfs user I got the same error. I'm sure this has to do with file permissions.
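Putting the two answers together, a sketch of the full invocation (the version path is taken from this thread's HDP 2.4.2 install, so substitute your own, and run it as the same user Ambari uses, root here):

```shell
# Build the same start command Ambari issues; paths match this thread's
# HDP 2.4.2.0-258 install and will differ on other versions.
DAEMON=/usr/hdp/2.4.2.0-258/hadoop/sbin/hadoop-daemon.sh
CONF_DIR=/usr/hdp/current/hadoop-client/conf
CMD="$DAEMON --config $CONF_DIR start datanode"
# Run this as root (the Ambari run-as user in this thread), not as hdfs:
echo "$CMD"
```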
