Support Questions


Failed to load native-hadoop with error: libhadoop.so.1.0.0: wrong ELF class: ELFCLASS64

Explorer

 

Hi,

 

I am using Cloudera CDH 5.4.4. I have deployed HDFS and HBase on a multi-node cluster. I am on a Red Hat Linux machine (kernel 2.6.32, x86_64) with JDK 1.8.

Every time I run any command, for example hbase shell or hadoop fs -ls, I get the exception below:

 

Java HotSpot(TM) Server VM warning: You have loaded library /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
15/09/05 06:41:10 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0: /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0: wrong ELF class: ELFCLASS64 (Possible cause: architecture word width mismatch)
15/09/05 06:41:10 DEBUG util.NativeCodeLoader: java.library.path=/usr/lib/hadoop/lib/native:/usr/lib/hadoop/lib/native
15/09/05 06:41:10 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

 

As recommended in a lot of posts and in the Hadoop wiki (http://archive.cloudera.com/cdh5/cdh/5/hbase-0.98.6-cdh5.2.5/book/hadoop.native.lib.html), I have tried setting the environment variables below in hbase-env.sh, but I still get the same exception.

export HBASE_LIBRARY_PATH=/usr/lib/hadoop/lib/native

export LD_LIBRARY_PATH=/usr/lib/hadoop/lib/native

This issue is also not a case of using a 32-bit native library with a 64-bit JVM, as has been suggested in quite a number of posts on this error.

I ran:   ldd /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0

Output:

        linux-vdso.so.1 =>  (0x00007fffce44d000)
        libdl.so.2 => /lib64/libdl.so.2 (0x00007f8e09e88000)
        libjvm.so => not found
        libc.so.6 => /lib64/libc.so.6 (0x00007f8e09af3000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f8e0a2b7000)

 

This suggests the native library is 64-bit.
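For reference, the ELF class can also be checked directly; the commands below are only illustrative and the exact output will vary by system:

# Shows the ELF class of the shared object; a 64-bit build reports "ELF 64-bit"
file /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0

# Or read the ELF header itself; "Class: ELF64" means a 64-bit library
readelf -h /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0 | grep Class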

 

Even when I run the org.apache.hadoop.util.NativeLibraryChecker utility, I get the output below.

The command I run is:

/usr/bin/hbase --config /etc/hbase/conf org.apache.hadoop.util.NativeLibraryChecker

 

Output:

HADOOP_CONF: /etc/hadoop/conf
HADOOP_HOME: /usr/lib/hadoop
ZOOKEEPER_HOME: /usr/lib/zookeeper
HBASE_CLASSPATH: /etc/hadoop/conf:/usr/lib/hadoop/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/lib/native/*:/usr/lib/zookeeper/*:/usr/lib/zookeeper/lib/*:
15/09/05 06:52:04 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
Java HotSpot(TM) Server VM warning: You have loaded library /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
15/09/05 06:52:04 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0: /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0: wrong ELF class: ELFCLASS64 (Possible cause: architecture word width mismatch)
15/09/05 06:52:04 DEBUG util.NativeCodeLoader: java.library.path=/usr/lib/hadoop/lib/native:/usr/lib/hadoop/lib/native
15/09/05 06:52:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Native library checking:
hadoop:  false
zlib:    false
snappy:  false
lz4:     false
bzip2:   false
openssl: false
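
If it helps anyone comparing, the same native library check can also be run through the Hadoop CLI directly (this is just the standard checknative utility, not anything HBase-specific):

# Runs the same set of native library checks; -a makes the command fail if any library is missing
hadoop checknative -a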

 

 

When I run the same command on my demo Cloudera QuickStart VM (CDH 5.3.0), on which I don't get this native-hadoop exception, I get a different output:

 

HADOOP_CONF: /etc/hadoop/conf
HADOOP_HOME: /usr/lib/hadoop
ZOOKEEPER_HOME: /usr/lib/zookeeper
HBASE_CLASSPATH: /etc/hadoop/conf:/usr/lib/hadoop/*:/usr/lib/hadoop/lib/*:/usr/lib/zookeeper/*:/usr/lib/zookeeper/lib/*:
15/09/05 03:53:54 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
15/09/05 03:53:54 DEBUG util.NativeCodeLoader: Loaded the native-hadoop library
15/09/05 03:53:54 INFO bzip2.Bzip2Factory: Successfully loaded & initialized native-bzip2 library system-native
15/09/05 03:53:54 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop:  true /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0
zlib:    true /lib64/libz.so.1
snappy:  true /usr/lib/hadoop/lib/native/libsnappy.so.1
lz4:     true revision:99
bzip2:   true /lib64/libbz2.so.1
openssl: false Cannot load libcrypto.so (libcrypto.so: cannot open shared object file: No such file or directory)!

 

Can anyone please suggest what I need to do to overcome this error? It is causing problems for me when trying to create HBase tables with Snappy compression.
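
For reference, this is the kind of HBase shell statement involved; the table and column family names here are just placeholders:

# Run inside "hbase shell": create a table whose column family uses Snappy compression
create 'example_table', {NAME => 'cf1', COMPRESSION => 'SNAPPY'}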

I have already spent 2 days going over this issue without any luck.

 

Any help would be appreciated.

 

Regards,

Varun

 

1 ACCEPTED SOLUTION

Mentor
You have installed the wrong Java JDK8 package. Please make sure you download the 64-bit JDK8 and remove your current 32-bit JDK8.

For example, a 64-bit JDK8 will print the following, if you'd like to check and compare against your $JAVA_HOME/bin/java executable:

~ java -version
java version "1.8.0_45"
Java(TM) SE Runtime Environment (build 1.8.0_45-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.45-b02, mixed mode)
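
If java -version alone is ambiguous, the file command on the java binary also reports the word size directly (the command below assumes java is on your PATH; adjust for your $JAVA_HOME):

# Reports "ELF 64-bit LSB executable" for a 64-bit JDK and "ELF 32-bit" for a 32-bit one
file $(readlink -f $(which java))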


4 REPLIES


Explorer

Exactly, verify your JDK installation. Download the correct version for the OS again, install it, and try again.

New Contributor
I have a 64-bit version of JDK 1.8 on my Windows 7 machine. But after installing Hadoop 2.7.1, when I run hadoop checknative -a at the command prompt, it reports that zlib, snappy, bzip2 and openssl are false, while the other native libraries are true. Please help. FYI, I have OpenSSL 1.0.2 installed on my machine.

Regards
Shaswata

Expert Contributor
Cloudera does not ship Apache Hadoop. That said, the Apache Hadoop convenience binary does not ship with Windows native libraries, only Linux ones.