I'm getting the error below when trying to start the HDFS service.

Service did not start successfully; not all of the required roles started: Service has only 0 DataNode roles running instead of minimum required 1.
11:35:09.022 AM | FATAL | org.apache.hadoop.hdfs.server.datanode.DataNode | Exception in secureMain
java.io.IOException: All directories in dfs.datanode.data.dir are invalid: "/rms/dfs/dn/" "/rms/dfs/dn2/"
    at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2319)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2292)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2184)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2231)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2407)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2431)
Both directories exist on both DataNodes (it's a two-DataNode cluster). Below are the directories and their permissions.
drwxrwxrwx 3 cloudera-scm cloudera-scm 4096 Mar 1 12:31 dn2
drwxrwxrwx 3 cloudera-scm cloudera-scm 4096 Mar 1 12:31 dn
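For reference, this is roughly how I verified the directories on each DataNode. The sketch below runs against scratch directories created with mktemp so it is self-contained; on the cluster I ran the same ls and test checks against /rms/dfs/dn and /rms/dfs/dn2 (the entries in dfs.datanode.data.dir).

```shell
#!/bin/sh
# Sketch of the per-directory check (scratch dirs stand in for the real
# /rms/dfs/dn and /rms/dfs/dn2 paths on the DataNodes).
for d in "$(mktemp -d)" "$(mktemp -d)"; do
  ls -ld "$d"                      # mode bits, owner, group, as pasted above
  if [ -w "$d" ] && [ -x "$d" ]; then
    echo "$d: writable and traversable"
  fi
  rmdir "$d"                       # clean up the scratch dir
done
```

Note that these checks ran as my login user; I have not yet confirmed what the directories look like from the perspective of the user the DataNode process runs as.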
How can I resolve this?