
HDFS Service failed to start

I am getting the error below when trying to start the HDFS service.

Service did not start successfully; not all of the required roles started: Service has only 0 DataNode roles running instead of minimum required 1.
11:35:09.022 AM FATAL org.apache.hadoop.hdfs.server.datanode.DataNode
Exception in secureMain
java.io.IOException: All directories in dfs.datanode.data.dir are invalid: "/rms/dfs/dn/" "/rms/dfs/dn2/" 
	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2319)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2292)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2184)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2231)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2407)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2431)


Both directories exist on both DataNodes (a 2-DataNode cluster). Below are the directories and their permissions.


drwxrwxrwx 3 cloudera-scm cloudera-scm 4096 Mar 1 12:31 dn2
drwxrwxrwx 3 cloudera-scm cloudera-scm 4096 Mar 1 12:31 dn


How can I resolve this?


Re: HDFS Service failed to start

The DataNode process runs as the 'hdfs' user, so your data directories must be owned by 'hdfs', not by 'cloudera-scm'. That is why DataNode.checkStorageLocations rejects both directories as invalid even though they exist.
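A minimal sketch of the fix, using the paths from your error message. The 'hdfs:hadoop' owner/group and mode 700 are the usual CDH defaults (the required mode comes from dfs.datanode.data.dir.perm, which defaults to 700); verify them against your cluster's configuration before running.

```shell
# On each DataNode, as root (assumed paths from the question):
#   chown -R hdfs:hadoop /rms/dfs/dn /rms/dfs/dn2
#   chmod 700 /rms/dfs/dn /rms/dfs/dn2
#
# Demonstration of the permission change on a throwaway
# directory (no root required):
dir=$(mktemp -d)
chmod 700 "$dir"
stat -c '%a' "$dir"   # prints 700
```

After fixing ownership and permissions on both DataNodes, restart the HDFS service from Cloudera Manager.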