Member since
08-17-2015
09-05-2015 04:06 AM
Hi,

I am using Cloudera CDH 5.4.4, with HDFS and HBase deployed on a multi-node cluster. The machines run Red Hat Linux (kernel 2.6.32, x86_64) with JDK 1.8.

Every time I run any command, for example "hbase shell" or "hadoop fs -ls", I get the exception below:

Java HotSpot(TM) Server VM warning: You have loaded library /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now. It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
15/09/05 06:41:10 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0: /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0: wrong ELF class: ELFCLASS64 (Possible cause: architecture word width mismatch)
15/09/05 06:41:10 DEBUG util.NativeCodeLoader: java.library.path=/usr/lib/hadoop/lib/native:/usr/lib/hadoop/lib/native
15/09/05 06:41:10 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

As recommended in many posts and in the Hadoop wiki (http://archive.cloudera.com/cdh5/cdh/5/hbase-0.98.6-cdh5.2.5/book/hadoop.native.lib.html), I have tried setting the environment variables below in hbase-env.sh, but I still get the same exception:

export HBASE_LIBRARY_PATH=/usr/lib/hadoop/lib/native
export LD_LIBRARY_PATH=/usr/lib/hadoop/lib/native

This issue is also not about using a 32-bit native library with a 64-bit JVM, as has been suggested in quite a number of posts related to this issue. I ran:

ldd /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0

Output:

linux-vdso.so.1 => (0x00007fffce44d000)
libdl.so.2 => /lib64/libdl.so.2 (0x00007f8e09e88000)
libjvm.so => not found
libc.so.6 => /lib64/libc.so.6 (0x00007f8e09af3000)
/lib64/ld-linux-x86-64.so.2 (0x00007f8e0a2b7000)

suggesting the native library is 64-bit.
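For reference, the ELF class of the library can also be checked directly rather than inferred from ldd. This is a minimal sketch; the library path is taken from the error message above and may differ on other installations:

```shell
# Byte 5 of an ELF file encodes its class: 01 = ELFCLASS32 (32-bit),
# 02 = ELFCLASS64 (64-bit).
elf_class() { od -An -tx1 -j4 -N1 "$1" | tr -d ' '; }

# Machine word width of the host (x86_64 here)
uname -m

# Path taken from the error message above
LIB=/usr/lib/hadoop/lib/native/libhadoop.so.1.0.0
if [ -e "$LIB" ]; then
  elf_class "$LIB"   # 02 would confirm a 64-bit library
  file "$LIB"        # 'file' reports the same information in readable form
fi
```

An ELFCLASS64 library can only be loaded by a 64-bit process, which matches the "wrong ELF class: ELFCLASS64" wording of the UnsatisfiedLinkError.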
Even when I run the org.apache.hadoop.util.NativeLibraryChecker utility with:

/usr/bin/hbase --config /etc/hbase/conf org.apache.hadoop.util.NativeLibraryChecker

I get the output below:

HADOOP_CONF: /etc/hadoop/conf
HADOOP_HOME: /usr/lib/hadoop
ZOOKEEPER_HOME: /usr/lib/zookeeper
HBASE_CLASSPATH: /etc/hadoop/conf:/usr/lib/hadoop/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/lib/native/*:/usr/lib/zookeeper/*:/usr/lib/zookeeper/lib/*:
15/09/05 06:52:04 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
Java HotSpot(TM) Server VM warning: You have loaded library /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now. It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
15/09/05 06:52:04 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0: /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0: wrong ELF class: ELFCLASS64 (Possible cause: architecture word width mismatch)
15/09/05 06:52:04 DEBUG util.NativeCodeLoader: java.library.path=/usr/lib/hadoop/lib/native:/usr/lib/hadoop/lib/native
15/09/05 06:52:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Native library checking:
hadoop: false
zlib: false
snappy: false
lz4: false
bzip2: false
openssl: false

When I run the same command on my demo Cloudera QuickStart VM (CDH 5.3.0), where I don't get this native-hadoop exception, I get a different output:

HADOOP_CONF: /etc/hadoop/conf
HADOOP_HOME: /usr/lib/hadoop
ZOOKEEPER_HOME: /usr/lib/zookeeper
HBASE_CLASSPATH: /etc/hadoop/conf:/usr/lib/hadoop/*:/usr/lib/hadoop/lib/*:/usr/lib/zookeeper/*:/usr/lib/zookeeper/lib/*:
15/09/05 03:53:54 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
15/09/05 03:53:54 DEBUG util.NativeCodeLoader: Loaded the native-hadoop library
15/09/05 03:53:54 INFO bzip2.Bzip2Factory: Successfully loaded & initialized native-bzip2 library system-native
15/09/05 03:53:54 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop: true /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0
zlib: true /lib64/libz.so.1
snappy: true /usr/lib/hadoop/lib/native/libsnappy.so.1
lz4: true revision:99
bzip2: true /lib64/libbz2.so.1
openssl: false Cannot load libcrypto.so (libcrypto.so: cannot open shared object file: No such file or directory)!

Can anyone suggest what I need to do to overcome this error? It is blocking me from creating HBase tables with Snappy compression. I have already spent two days on this issue without any luck; any help would be appreciated.

Regards,
Varun
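For completeness, the word width of the JVM itself can be double-checked too (assuming the java on the PATH is the one HBase launches, which may not hold on every node). A 64-bit HotSpot prints "64-Bit Server VM" in its version banner, whereas a 32-bit build prints just "Server VM", as in the warning in the logs:

```shell
# Print the JVM's version banner; on a 64-bit HotSpot the VM line
# contains the marker "64-Bit".
if command -v java >/dev/null 2>&1; then
  java -version 2>&1 | grep 'VM'
fi

# The same check applied to the warning text captured in the logs above:
# the absence of the "64-Bit" marker indicates a 32-bit VM emitted it.
banner='Java HotSpot(TM) Server VM warning: You have loaded library /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard.'
case "$banner" in
  *64-Bit*) echo "64-bit JVM" ;;
  *)        echo "32-bit JVM" ;;
esac
```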
08-22-2015 02:56 PM
Hi, thanks for your response. I am based in Glasgow, UK.
08-17-2015 03:43 PM
Sorry if my question appears naïve. We are planning to use CDH 5.3.0 or 5.4.0 and want to implement a multi-node cluster. The example multi-node installations I have seen on various blogs and resources put the master and the slaves on different hosts. However, we are constrained by the number of hosts: we have only two powerful machines (32 cores, 400+ GB RAM each), so if we put the master on one and a slave on the other, we will end up with only one slave.

My questions are:

1. Is it possible to have the master and a slave on the same host?
2. Can I have more than one slave node on a single host?
3. Does one need to pay to use Cloudera Manager, or is it open source like the rest of the components?

If you can point me to resources that would help me understand the above scenarios, that would be helpful.

Thanks for your help.

Regards,
Varun