
Hive : "native snappy library not available: this version of libhadoop was built without snappy support"

New Contributor

Hive, with either Tez or MR as the execution engine, intermittently throws the error "native snappy library not available: this version of libhadoop was built without snappy support". In other words, sometimes the queries go through and sometimes they fail. Please advise on how to investigate and fix this.

3 REPLIES

Master Mentor

@Amit Ashish

Which version of HDP are you using? Are you sure that all the hosts have the following packages installed? (Since the failures are intermittent, the failing queries may be running on hosts that do not have these packages installed.)

Can you please install the following packages as described in the documentation: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.3/bk_command-line-installation/content/instal...

Install Snappy on all the nodes in your cluster. At each node:

  • For RHEL/CentOS/Oracle Linux:

    yum install snappy snappy-devel

  • For SLES:

    zypper install snappy snappy-devel

  • For Ubuntu/Debian:

    apt-get install libsnappy1 libsnappy-dev
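If you have many nodes, a small helper can generate the per-host install commands instead of logging in to each node by hand. This is only a sketch: the `build_install_cmds` function, the example hostnames, and the ssh-as-root convention are all illustrative assumptions, not part of HDP itself.

```shell
# build_install_cmds: read one hostname per line on stdin and print the
# ssh command that would install the given packages on that host.
# (Hypothetical helper -- adapt the package-manager command per OS.)
build_install_cmds() {
  pkg_cmd="$1"
  while read -r host; do
    # Skip blank lines in the host list.
    [ -n "$host" ] && printf 'ssh root@%s "%s"\n' "$host" "$pkg_cmd"
  done
}

# Example: generate the RHEL/CentOS install commands for two placeholder hosts.
printf 'node1.example.com\nnode2.example.com\n' \
  | build_install_cmds "yum install -y snappy snappy-devel"
```

Review the printed commands, then pipe them to `sh` (or your parallel-ssh tool of choice) to actually run the installs; for SLES or Ubuntu/Debian, substitute the corresponding `zypper` or `apt-get` command.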

Master Mentor

@Amit Ashish

You can also run the "hadoop checknative" command on all the cluster hosts to find out whether snappy and the other native libraries are installed properly on them.

Example:

# hadoop checknative
18/04/24 00:43:37 INFO bzip2.Bzip2Factory: Successfully loaded & initialized native-bzip2 library system-native
18/04/24 00:43:37 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop:  true /usr/hdp/2.6.4.0-91/hadoop/lib/native/libhadoop.so.1.0.0
zlib:    true /lib64/libz.so.1
snappy:  true /usr/hdp/2.6.4.0-91/hadoop/lib/native/libsnappy.so.1
lz4:     true revision:99
bzip2:   true /lib64/libbz2.so.1
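To check every host quickly, you can filter the `hadoop checknative` output down to just the libraries reported as unavailable. The `missing_native_libs` function below is a sketch of one way to parse that output (the function name is made up; the output format it parses is the "name: true/false" layout shown above).

```shell
# missing_native_libs: read `hadoop checknative` output on stdin and print the
# names of native libraries reported as "false" (i.e. not available).
missing_native_libs() {
  # Match only the "libname:  true|false ..." result lines, ignoring the
  # INFO log lines and the "Native library checking:" header.
  awk '$1 ~ /^[a-z0-9]+:$/ && $2 == "false" { sub(/:$/, "", $1); print $1 }'
}

# Example usage on a cluster host:
#   hadoop checknative 2>&1 | missing_native_libs
# An affected host would print "snappy" (and nothing on a healthy host).
```

A host on which this prints `snappy` is a likely culprit for the intermittent failures, since jobs scheduled there cannot load the native snappy codec.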


New Contributor

@Jay Kumar SenSharma Thanks for your replies. We had an incorrect configuration left over from a recent upgrade from HDP 2.4 to HDP 2.6: a config file was still pointing to the old Hadoop and Hive home directories. The issue appears to be fixed now that it points to the current Hadoop and Hive directories.
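For anyone hitting the same post-upgrade symptom, a quick way to hunt for leftover references to the old version is to grep your config locations for the old version string. This is only a sketch: the `find_stale_paths` helper is hypothetical, and the config directories and version string in the example are common HDP defaults that you should adjust for your own environment.

```shell
# find_stale_paths: list files under the given paths that still mention the
# old version string (hypothetical helper for post-upgrade cleanup).
#   $1 = old version string to search for, remaining args = paths to search
find_stale_paths() {
  old_version="$1"; shift
  # -r: recurse into directories, -l: print only matching file names.
  grep -rl "$old_version" "$@" 2>/dev/null
}

# Example (paths are the usual HDP config locations -- verify for your setup):
#   find_stale_paths "2.4" /etc/hadoop/conf /etc/hive/conf
```

Any file this prints is worth inspecting for hard-coded Hadoop or Hive home directories that should point at the current stack version instead.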