Support Questions
Find answers, ask questions, and share your expertise

HDFS Service failed to start




I am getting the error below when trying to start the HDFS service.


Service did not start successfully; not all of the required roles started: Service has only 0 DataNode roles running instead of minimum required 1.
11:35:09.022 AM FATAL org.apache.hadoop.hdfs.server.datanode.DataNode
Exception in secureMain All directories in are invalid: "/rms/dfs/dn/" "/rms/dfs/dn2/" 
	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(
	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(
	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(
	at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(
	at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(
	at org.apache.hadoop.hdfs.server.datanode.DataNode.main(


Both directories exist on both DataNodes (a 2-DataNode cluster). Below are the directories and their permissions.


drwxrwxrwx 3 cloudera-scm cloudera-scm 4096 Mar 1 12:31 dn2
drwxrwxrwx 3 cloudera-scm cloudera-scm 4096 Mar 1 12:31 dn


Please suggest how to resolve this.


Re: HDFS Service failed to start

Master Guru
Your data directories must be owned by the user 'hdfs' and not by 'cloudera-scm'.
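For the directories shown in the question, the fix could be sketched as follows. The group `hadoop` and the `700` mode are assumptions: match whatever group and `dfs.datanode.data.dir.perm` value the rest of your cluster uses before running anything.

```shell
# Give the DataNode data directories to the 'hdfs' user
# (group 'hadoop' is an assumption; check your cluster's convention)
sudo chown -R hdfs:hadoop /rms/dfs/dn /rms/dfs/dn2

# The DataNode also rejects directories whose permissions are looser than
# dfs.datanode.data.dir.perm (700 by default), so 777 should be tightened
sudo chmod 700 /rms/dfs/dn /rms/dfs/dn2

# Verify owner, group, and mode before restarting the DataNode role
stat -c '%U:%G %a %n' /rms/dfs/dn /rms/dfs/dn2
```

After changing ownership, restart the DataNode roles from Cloudera Manager and re-check the role log for the same `Exception in secureMain` message.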