Support Questions


Datanode fails to start | No enum constant org.apache.hadoop.security.SaslRpcServer.QualityOfProtection.FALSE

New Contributor

OS: Windows 11

Java: d:\Java\jdk1.8.0_202

Hadoop: d:\hadoop-3.1.3

Storage dirs: D:\hadoop-3.1.3\data\datanode and D:\hadoop-3.1.3\data\namenode

Ran "gsudo chown 777 -R data" on the data directory.

 

[two screenshots attached]

%PATH%

[screenshot of PATH entries attached]
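
For reference, given the paths above, the environment variables behind these screenshots would presumably be set along these lines (the PATH entries are the usual Hadoop-on-Windows additions, assumed here rather than read from the screenshots):

JAVA_HOME=d:\Java\jdk1.8.0_202
HADOOP_HOME=d:\hadoop-3.1.3
PATH=...;%JAVA_HOME%\bin;%HADOOP_HOME%\bin;%HADOOP_HOME%\sbin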

I have overwritten the bin files with the ones from apache-hadoop-3.1.3-winutils/bin at master · s911415/apache-hadoop-3.1.3-winutils (github.com)

 

hdfs-site.xml

 

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.disk.balancer.enabled</name>
    <value>false</value>
  </property>
  <property>
    <name>dfs.data.transfer.protection</name>
    <value>false</value>
  </property>
  <property>
    <name>dfs.encrypt.data.transfer</name>
    <value>true</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>/D:/hadoop-3.1.3/data/namenode</value>
    <final>true</final>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>/D:/hadoop-3.1.3/data/datanode</value>
    <final>true</final>
  </property>
</configuration>
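
A note on the two SASL-related properties above: dfs.data.transfer.protection accepts only authentication, integrity, or privacy (the SASL quality-of-protection levels), so false is not a parseable value, and dfs.encrypt.data.transfer is normally only meaningful on a secured (Kerberos) cluster. A minimal sketch of the same hdfs-site.xml for an unsecured single-node setup, with both properties dropped:

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.disk.balancer.enabled</name>
    <value>false</value>
  </property>
  <!-- dfs.data.transfer.protection and dfs.encrypt.data.transfer omitted:
       the former rejects "false", the latter expects a secured cluster -->
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>/D:/hadoop-3.1.3/data/namenode</value>
    <final>true</final>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>/D:/hadoop-3.1.3/data/datanode</value>
    <final>true</final>
  </property>
</configuration>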

 

core-site.xml

 

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
  <property>
    <name>hadoop.rpc.protection</name>
    <value>privacy</value>
  </property>
</configuration>

 

mapred-site.xml

 

<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>

 

yarn-site.xml

 

<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.auxservices.mapreduce.shuffle.class</name>  
    <value>org.apache.hadoop.mapred.ShuffleHandler</value>
  </property>
</configuration>
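
Separately (and unrelated to the DataNode failure), the second property name above looks like a common tutorial typo: the aux-service class key follows the pattern yarn.nodemanager.aux-services.<service-name>.class, so it would presumably read:

  <property>
    <name>yarn.nodemanager.aux-services.mapreduce_shuffle.class</name>
    <value>org.apache.hadoop.mapred.ShuffleHandler</value>
  </property>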

 

 

The NameNode starts smoothly and works perfectly.

 

ERROR I GET WHEN STARTING THE DATANODE:

 

************************************************************/
2022-01-15 04:02:39,829 INFO checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/D:/hadoop-3.1.3/data/datanode
2022-01-15 04:02:39,915 INFO impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2022-01-15 04:02:39,957 INFO impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2022-01-15 04:02:39,957 INFO impl.MetricsSystemImpl: DataNode metrics system started
2022-01-15 04:02:40,482 INFO common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2022-01-15 04:02:40,485 INFO datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2022-01-15 04:02:40,489 INFO datanode.DataNode: Configured hostname is SHUBHAM
2022-01-15 04:02:40,489 INFO common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2022-01-15 04:02:40,495 ERROR datanode.DataNode: Exception in secureMain
java.lang.IllegalArgumentException: No enum constant org.apache.hadoop.security.SaslRpcServer.QualityOfProtection.FALSE
        at java.lang.Enum.valueOf(Enum.java:238)
        at org.apache.hadoop.security.SaslRpcServer$QualityOfProtection.valueOf(SaslRpcServer.java:71)
        at org.apache.hadoop.security.SaslPropertiesResolver.setConf(SaslPropertiesResolver.java:68)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:77)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:137)
        at org.apache.hadoop.security.SaslPropertiesResolver.getInstance(SaslPropertiesResolver.java:57)
        at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil.getSaslPropertiesResolver(DataTransferSaslUtil.java:198)
        at org.apache.hadoop.hdfs.server.datanode.DNConf.<init>(DNConf.java:250)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1375)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:501)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2806)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2714)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2756)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2900)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2924)
2022-01-15 04:02:40,497 INFO util.ExitUtil: Exiting with status 1: java.lang.IllegalArgumentException: No enum constant org.apache.hadoop.security.SaslRpcServer.QualityOfProtection.FALSE
2022-01-15 04:02:40,501 INFO datanode.DataNode: SHUTDOWN_MSG:
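
Reading the trace bottom-up (assuming the Hadoop 3.1.3 sources match it): DNConf asks DataTransferSaslUtil for a SaslPropertiesResolver, and the resolver's setConf upper-cases the configured protection level before resolving it via SaslRpcServer.QualityOfProtection.valueOf. The value that upper-cases to FALSE matches this hdfs-site.xml entry:

  <property>
    <name>dfs.data.transfer.protection</name>
    <!-- "false" upper-cases to FALSE, which is not one of the
         QualityOfProtection constants (AUTHENTICATION, INTEGRITY, PRIVACY) -->
    <value>false</value>
  </property>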

 

 

Please help me resolve this issue. I have seen many suggestions and tried them all, but nothing works.

 

1 ACCEPTED SOLUTION

New Contributor

I removed everything.

 

Step by step Hadoop 2.8.0 installation on Window 10 (securityandtechinfo.blogspot.com)

 

I followed this guide, and it's working now.