Posts: 10
Registered: 02-10-2015

HDFS Service failed to start

I get the error below when trying to start the HDFS service.


Service did not start successfully; not all of the required roles started: Service has only 0 DataNode roles running instead of minimum required 1.
11:35:09.022 AM  FATAL  org.apache.hadoop.hdfs.server.datanode.DataNode
Exception in secureMain: All directories in dfs.datanode.data.dir are invalid: "/rms/dfs/dn/" "/rms/dfs/dn2/"
	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(
	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(
	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(
	at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(
	at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(
	at org.apache.hadoop.hdfs.server.datanode.DataNode.main(


Both directories exist on both DataNodes (this is a 2-DataNode cluster). Here are the directories and their permissions:


drwxrwxrwx 3 cloudera-scm cloudera-scm 4096 Mar 1 12:31 dn2
drwxrwxrwx 3 cloudera-scm cloudera-scm 4096 Mar 1 12:31 dn


Please suggest how to resolve this.

Posts: 1,903
Kudos: 435
Solutions: 307
Registered: 07-31-2013

Re: HDFS Service failed to start

Your DataNode data directories must be owned by the user 'hdfs', not by 'cloudera-scm'. Change the ownership on both DataNodes and then restart the HDFS service.
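Fixing the ownership looks roughly like this on each DataNode (a sketch: it assumes the default 'hdfs' user and 'hadoop' group created by the CDH packages — adjust the names if your deployment differs):

```shell
# Run as root on each DataNode.
# Assumes the default 'hdfs' user and 'hadoop' group; adjust for your deployment.
chown -R hdfs:hadoop /rms/dfs/dn /rms/dfs/dn2

# The DataNode also expects restrictive permissions on its data directories
# (dfs.datanode.data.dir.perm defaults to 700), so drwxrwxrwx will be rejected too.
chmod 700 /rms/dfs/dn /rms/dfs/dn2

# Verify the result, then restart the HDFS service from Cloudera Manager.
ls -ld /rms/dfs/dn /rms/dfs/dn2
```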