Created on 09-05-2015 04:06 AM - edited 09-16-2022 02:40 AM
Hi,
I am using Cloudera CDH 5.4.4. I have deployed HDFS and HBase on a multi-node cluster. I am using a Red Hat Linux (kernel 2.6.32) x86_64 machine and JDK 1.8.
Every time I run any command, for example hbase shell or hadoop fs -ls, I get the exception below:
Java HotSpot(TM) Server VM warning: You have loaded library /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
15/09/05 06:41:10 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0: /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0: wrong ELF class: ELFCLASS64 (Possible cause: architecture word width mismatch)
15/09/05 06:41:10 DEBUG util.NativeCodeLoader: java.library.path=/usr/lib/hadoop/lib/native:/usr/lib/hadoop/lib/native
15/09/05 06:41:10 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
As recommended in a lot of posts and in the Hadoop wiki (http://archive.cloudera.com/cdh5/cdh/5/hbase-0.98.6-cdh5.2.5/book/hadoop.native.lib.html), I have tried setting the environment variables below in hbase-env.sh, but I still get the same exception.
export HBASE_LIBRARY_PATH=/usr/lib/hadoop/lib/native
export LD_LIBRARY_PATH=/usr/lib/hadoop/lib/native
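(To rule out a simple editing mistake, something like the following should confirm the two exports are present in the configuration directory the shell actually reads; /etc/hbase/conf is the same directory I pass with --config further down:)
grep -i LIBRARY_PATH /etc/hbase/conf/hbase-env.sh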
This issue is also not about using a 32-bit native library with a 64-bit JVM, as has been suggested in quite a number of posts related to this issue.
I ran: ldd /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0
Output:
linux-vdso.so.1 => (0x00007fffce44d000)
libdl.so.2 => /lib64/libdl.so.2 (0x00007f8e09e88000)
libjvm.so => not found
libc.so.6 => /lib64/libc.so.6 (0x00007f8e09af3000)
/lib64/ld-linux-x86-64.so.2 (0x00007f8e0a2b7000)
This suggests the native library is 64-bit.
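(In case it helps anyone double-checking the same thing: the ELF class can also be read directly, and the JVM bitness can be checked on the same box. Both commands below assume java on the PATH is the same JDK that launches HBase:)
file /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0    # a 64-bit build prints "ELF 64-bit LSB shared object ..."
java -version 2>&1 | tail -1                          # a 64-bit JDK reports "64-Bit Server VM"; a 32-bit one says just "Server VM"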
Even when I run the org.apache.hadoop.util.NativeLibraryChecker utility, I get the output below. The command I run is:
/usr/bin/hbase --config /etc/hbase/conf org.apache.hadoop.util.NativeLibraryChecker
Output:
HADOOP_CONF: /etc/hadoop/conf
HADOOP_HOME: /usr/lib/hadoop
ZOOKEEPER_HOME: /usr/lib/zookeeper
HBASE_CLASSPATH: /etc/hadoop/conf:/usr/lib/hadoop/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/lib/native/*:/usr/lib/zookeeper/*:/usr/lib/zookeeper/lib/*:
15/09/05 06:52:04 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
Java HotSpot(TM) Server VM warning: You have loaded library /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
15/09/05 06:52:04 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0: /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0: wrong ELF class: ELFCLASS64 (Possible cause: architecture word width mismatch)
15/09/05 06:52:04 DEBUG util.NativeCodeLoader: java.library.path=/usr/lib/hadoop/lib/native:/usr/lib/hadoop/lib/native
15/09/05 06:52:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Native library checking:
hadoop: false
zlib: false
snappy: false
lz4: false
bzip2: false
openssl: false
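(I believe the equivalent check can also be run through the plain hadoop launcher, which might help isolate whether the hbase wrapper script is a factor; in Hadoop 2.6 / CDH 5 the command is, as far as I know:)
hadoop checknative -a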
When I run the same command on my demo Cloudera QuickStart VM (CDH 5.3.0), on which I don't get this hadoop native exception, I get a different output:
HADOOP_CONF: /etc/hadoop/conf
HADOOP_HOME: /usr/lib/hadoop
ZOOKEEPER_HOME: /usr/lib/zookeeper
HBASE_CLASSPATH: /etc/hadoop/conf:/usr/lib/hadoop/*:/usr/lib/hadoop/lib/*:/usr/lib/zookeeper/*:/usr/lib/zookeeper/lib/*:
15/09/05 03:53:54 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
15/09/05 03:53:54 DEBUG util.NativeCodeLoader: Loaded the native-hadoop library
15/09/05 03:53:54 INFO bzip2.Bzip2Factory: Successfully loaded & initialized native-bzip2 library system-native
15/09/05 03:53:54 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop: true /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0
zlib: true /lib64/libz.so.1
snappy: true /usr/lib/hadoop/lib/native/libsnappy.so.1
lz4: true revision:99
bzip2: true /lib64/libbz2.so.1
openssl: false Cannot load libcrypto.so (libcrypto.so: cannot open shared object file: No such file or directory)!
Can anyone please suggest what I need to do to overcome this error? It is preventing me from creating HBase tables with Snappy compression.
I have already spent 2 days going over this issue without any luck.
Any help would be appreciated.
Regards,
Varun
Created 03-15-2018 06:53 AM
Exactly. Verify your JDK installation: download the correct version for the OS again, install it, and try again.
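For example (assuming java on the PATH is the JDK the cluster services actually use; adjust if a different installation is configured):
java -version                  # a 64-bit JDK reports "64-Bit Server VM"; the warning above shows only "Server VM", which is what a 32-bit JVM prints
readlink -f "$(which java)"    # shows which JDK installation is actually being picked up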