Unable to start the DataNode in HDP 3.1, Ambari 2.7, Amazon Linux 2

New Contributor

Getting the below error while starting the DataNode.

 

2019-11-06 13:56:26,507 INFO datanode.DataNode (DataNode.java:<init>(499)) - Configured hostname is hadoop03.prod.phenom.local
2019-11-06 13:56:26,507 INFO common.Util (Util.java:isDiskStatsEnabled(395)) - dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2019-11-06 13:56:26,511 INFO datanode.DataNode (DataNode.java:startDataNode(1399)) - Starting DataNode with maxLockedMemory = 0
2019-11-06 13:56:26,529 INFO datanode.DataNode (DataNode.java:initDataXceiver(1147)) - Opened streaming server at /0.0.0.0:50010
2019-11-06 13:56:26,530 INFO datanode.DataNode (DataXceiverServer.java:<init>(78)) - Balancing bandwidth is 6250000 bytes/s
2019-11-06 13:56:26,531 INFO datanode.DataNode (DataXceiverServer.java:<init>(79)) - Number threads for balancing is 50
2019-11-06 13:56:26,531 ERROR datanode.DataNode (DataNode.java:secureMain(2883)) - Exception in secureMain
java.lang.RuntimeException: Although a UNIX domain socket path is configured as /var/lib/hadoop-hdfs/dn_socket, we cannot start a localDataXceiverServer because libhadoop cannot be loaded.
at org.apache.hadoop.hdfs.server.datanode.DataNode.getDomainPeerServer(DataNode.java:1192)
at org.apache.hadoop.hdfs.server.datanode.DataNode.initDataXceiver(DataNode.java:1161)
at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1416)
at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:500)
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2782)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2690)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2732)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2876)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2900)
2019-11-06 13:56:26,533 INFO util.ExitUtil (ExitUtil.java:terminate(210)) - Exiting with status 1: java.lang.RuntimeException: Although a UNIX domain socket path is configured as /var/lib/hadoop-hdfs/dn_socket, we cannot start a localDataXceiverServer because libhadoop cannot be loaded.
2019-11-06 13:56:26,535 INFO datanode.DataNode (LogAdapter.java:info(51)) - SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at hadoop03.prod.phenom.local/172.25.58.155
************************************************************/

I am able to start the NameNode on the same server.

4 REPLIES

Re: Unable to start the DataNode in HDP 3.1, Ambari 2.7, Amazon Linux 2

Expert Contributor
ERROR datanode.DataNode (DataNode.java:secureMain(2883)) - Exception in secureMain
java.lang.RuntimeException: Although a UNIX domain socket path is configured as /var/lib/hadoop-hdfs/dn_socket, we cannot start a localDataXceiverServer because libhadoop cannot be loaded.

 

The above error occurs when the libhadoop.so library is missing from /usr/hdp/current/hadoop/lib/native/.

To resolve this issue, copy the missing library from another healthy node.
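The copy-and-relink step could look like the sketch below. The remote hostname and HDP version in the comment are placeholders, and the sketch itself uses a temporary directory with an empty stand-in file so the relink step can be run safely anywhere:

```shell
# Sketch: restore libhadoop.so.1.0.0 and its symlink.
# On a real node, NATIVE_DIR would be /usr/hdp/<version>/hadoop/lib/native
# and the file would come from a healthy host, e.g.:
#   scp healthy-node:/usr/hdp/<version>/hadoop/lib/native/libhadoop.so.1.0.0 .
NATIVE_DIR="$(mktemp -d)"                      # stand-in for the native dir
: > "$NATIVE_DIR/libhadoop.so.1.0.0"          # stand-in for the real library
ln -sf libhadoop.so.1.0.0 "$NATIVE_DIR/libhadoop.so"
readlink "$NATIVE_DIR/libhadoop.so"           # should print libhadoop.so.1.0.0
```

After relinking on the real node, restart the DataNode from Ambari so it picks up the library.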

@Muralikrishna 

 

Re: Unable to start the DataNode in HDP 3.1, Ambari 2.7, Amazon Linux 2

New Contributor

@Scharan Thanks for your support.

 

The libhadoop.so library file is already present in the Hadoop native directory. Moreover, the DataNode started fine earlier; only after rebooting the instance am I getting this error.

 

[murali.kumpatla@hadoop03 native]$ ls -lrt
total 2548
-rwxr-xr-x 1 root root 1040144 Aug 23 05:31 libhadoop.so.1.0.0
-rw-r--r-- 1 root root 910460 Aug 23 05:31 libnativetask.a
-rw-r--r-- 1 root root 107072 Aug 23 05:31 libhdfs.a
-rw-r--r-- 1 root root 54302 Aug 23 05:31 libhadooputils.a
-rw-r--r-- 1 root root 188076 Aug 23 05:31 libhadooppipes.a
-rw-r--r-- 1 root root 270842 Aug 23 05:31 libhadoop.a
-rwxr-xr-x 1 root root 23736 Aug 23 05:31 libsnappy.so.1.1.4
lrwxrwxrwx 1 root root 18 Oct 16 09:01 libhadoop.so -> libhadoop.so.1.0.0
lrwxrwxrwx 1 root root 18 Oct 16 09:01 libsnappy.so.1 -> libsnappy.so.1.1.4
lrwxrwxrwx 1 root root 18 Oct 16 09:01 libsnappy.so -> libsnappy.so.1.1.4
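Since the files and symlinks are present, one way to rule out a broken dependency chain is to run ldd against the library; a single "not found" line among its dependencies is enough for the JVM to report "libhadoop cannot be loaded". The HDP path below is an assumption based on this thread, with a fallback to a common binary so the technique can be demonstrated on any Linux box:

```shell
# Sketch: check that every shared-object dependency of libhadoop.so resolves.
LIB=/usr/hdp/3.1.4.0-315/hadoop/lib/native/libhadoop.so
[ -e "$LIB" ] || LIB="$(command -v ls)"       # fallback for illustration only
missing="$(ldd "$LIB" | grep 'not found' | wc -l)"
echo "unresolved dependencies: $missing"      # anything above 0 is a problem
```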


Re: Unable to start the DataNode in HDP 3.1, Ambari 2.7, Amazon Linux 2

Expert Contributor

Can you share the output of the below command?

 

# hadoop checknative -a | grep hadoop

 

Re: Unable to start the DataNode in HDP 3.1, Ambari 2.7, Amazon Linux 2

New Contributor

@Scharan Please find the output:

 

[murali.kumpatla@hadoop03 ~]$ hadoop checknative -a | grep hadoop
19/11/06 16:47:13 INFO bzip2.Bzip2Factory: Successfully loaded & initialized native-bzip2 library system-native
19/11/06 16:47:13 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
19/11/06 16:47:13 WARN zstd.ZStandardCompressor: Error loading zstandard native libraries: java.lang.InternalError: Cannot load libzstd.so.1 (libzstd.so.1: cannot open shared object file: No such file or directory)!
19/11/06 16:47:13 WARN erasurecode.ErasureCodeNative: Loading ISA-L failed: Failed to load libisal.so.2 (libisal.so.2: cannot open shared object file: No such file or directory)
19/11/06 16:47:13 WARN erasurecode.ErasureCodeNative: ISA-L support is not available in your platform... using builtin-java codec where applicable
hadoop: true /usr/hdp/3.1.4.0-315/hadoop/lib/native/libhadoop.so.1.0.0
snappy: true /usr/hdp/3.1.4.0-315/hadoop/lib/native/libsnappy.so.1
19/11/06 16:47:13 INFO util.ExitUtil: Exiting with status 1: ExitException
[murali.kumpatla@hadoop03 ~]$
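The checknative output above shows `hadoop: true`, i.e. the library loads fine for the interactive user, so the failure may be specific to the environment the DataNode JVM itself receives (for example a different java.library.path). A possible next diagnostic step is sketched below; the reported library path is taken from the checknative output above, and the process-inspection command is shown only as a comment because it needs a live DataNode process:

```shell
# Sketch: compare the directory checknative found the library in with the
# java.library.path the DataNode JVM is actually launched with.
REPORTED=/usr/hdp/3.1.4.0-315/hadoop/lib/native/libhadoop.so.1.0.0
NATIVE_DIR="$(dirname "$REPORTED")"
echo "checknative found the library in: $NATIVE_DIR"
# On the node, one would then check the DataNode's launch arguments, e.g.:
#   ps -ef | grep '[D]ataNode' | tr ' ' '\n' | grep java.library.path
# and confirm that path includes the directory printed above.
```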
