
Hadoop Compression: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z


Hi all,

While running Apache Kylin on Hadoop, I hit the following error from Hadoop MapReduce:

2019-03-20 08:06:00,193 ERROR [main] org.apache.kylin.engine.mr.KylinMapper: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:136)
at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:150)
at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:168)
at org.apache.hadoop.io.SequenceFile$Writer.init(SequenceFile.java:1304)
at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:1192)
at org.apache.hadoop.io.SequenceFile$BlockCompressWriter.<init>(SequenceFile.java:1552)
at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:289)
at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:542)
at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getSequenceWriter(SequenceFileOutputFormat.java:64)
at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getRecordWriter(SequenceFileOutputFormat.java:75)
at org.apache.hadoop.mapreduce.lib.output.LazyOutputFormat$LazyRecordWriter.write(LazyOutputFormat.java:113)
at org.apache.hadoop.mapreduce.lib.output.MultipleOutputs.write(MultipleOutputs.java:468)
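As a first sanity check, hadoop checknative can report whether the native libraries (including Snappy) are loadable on a node; on a healthy node it prints something like "snappy: true /usr/hdp/<version>/hadoop/lib/native/libsnappy.so.1":

# Run on the affected worker node; lists the native codecs Hadoop can load
# and the library file each one resolves to.
hadoop checknative -a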

I suspect the cause is that Hadoop cannot find the libsnappy.so* native library. I searched for solutions online and have already added the following properties to the corresponding XML files and restarted the services:

# For HDFS core-site.xml
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
</property>

# For MapReduce mapred-site.xml
<property>
  <name>mapreduce.admin.user.env</name>
  <value>LD_LIBRARY_PATH=/usr/hdp/${hdp.version}/hadoop/lib/native</value>
</property>
<property>
  <name>mapreduce.map.output.compress</name>
  <value>true</value>
</property>
<property>
  <name>mapreduce.map.output.compress.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
<property>
  <name>mapreduce.output.fileoutputformat.compress</name>
  <value>true</value>
</property>
<property>
  <name>mapreduce.output.fileoutputformat.compress.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
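To rule out a configuration-distribution problem, the effective value of the core-site key can be checked from the command line (hdfs getconf only reads core-site/hdfs-site, so this covers the codec list but not the mapred-site keys):

# Print the io.compression.codecs value the client-side configuration resolves to.
hdfs getconf -confKey io.compression.codecs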

However, it didn't work, so I dug into the YARN logs. In the launch_container.sh part I found the following commands:

export PWD="/hadoop/yarn/local/usercache/root/appcache/application_1553049994285_0013/container_e04_1553049994285_0013_01_000005"
# ...omit other commands
export LD_LIBRARY_PATH="$PWD"
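(For reference, the generated launch script can be found on the worker node under the NodeManager local dirs; the path below is from my cluster, so adjust it to your yarn.nodemanager.local-dirs setting.)

# Locate the container launch script under the NodeManager local dirs.
find /hadoop/yarn/local/usercache/root/appcache -name launch_container.sh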

I think this export is wrong: instead of $PWD, LD_LIBRARY_PATH should point to the actual location of libsnappy.so*, which is:

/usr/hdp/${hdp.version}/hadoop/lib/native
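For comparison, this is a sketch of the export I expected launch_container.sh to contain (my expectation, not a verified fix; <hdp.version> stands for the concrete HDP version substituted at submit time):

# Sketch only: keep the container dir but also include the native library dir,
# instead of LD_LIBRARY_PATH being reduced to just $PWD.
export LD_LIBRARY_PATH="$PWD:/usr/hdp/<hdp.version>/hadoop/lib/native"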

I have already set mapreduce.admin.user.env so that LD_LIBRARY_PATH points to this path. Why does YARN still use $PWD?

In addition, I added log messages to the mapper (one is shown below) that print the value of LD_LIBRARY_PATH it actually sees. They confirm that the variable really is set to the container's working directory rather than the native library path. How can I solve this problem?

2019-03-20 08:06:00,044 INFO [main] org.apache.kylin.engine.mr.KylinMapper: linyanwen[from map]: /hadoop/yarn/local/usercache/root/appcache/application_1553049994285_0039/container_e04_1553049994285_0039_01_000005
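(In case someone wants to reproduce this: assuming YARN log aggregation is enabled, the mapper output can be pulled with yarn logs.)

# Fetch the aggregated container logs and filter for the added log message.
yarn logs -applicationId application_1553049994285_0039 | grep "from map"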


Thanks!