
this version of libhadoop was built without snappy support.


Re: this version of libhadoop was built without snappy support.

Contributor

@Neeraj Sabharwal

Thanks for the reply. In my case that's not the solution, because when I run

hadoop checknative -a

I see that snappy is reported as true, located at /usr/hdp/2.3.4.0-3485/hadoop/lib/native/libsnappy.so.1.

Re: this version of libhadoop was built without snappy support.

New Contributor

We have the same problem.

> hadoop checknative -a

snappy: true /usr/hdp/2.3.4.0-3485/hadoop/lib/native/libsnappy.so.1

> rpm -qa snappy

snappy-1.1.0-3.el7.x86_64

What else can I check?
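One further check worth trying (a sketch, using the HDP 2.3.4 paths reported earlier in this thread — adjust for your install): `hadoop checknative` confirms that the Snappy shared library exists, but the error message comes from libhadoop.so itself, so it is worth verifying that the libhadoop.so actually being loaded was linked against libsnappy:

```shell
# Inspect the dynamic dependencies of the native Hadoop library.
# A line like "libsnappy.so.1 => /usr/.../libsnappy.so.1" means libhadoop
# was built with snappy support; no match means it was not.
ldd /usr/hdp/2.3.4.0-3485/hadoop/lib/native/libhadoop.so | grep snappy
```

If no snappy line appears, the runtime is picking up a libhadoop.so built without `-Drequire.snappy`, regardless of where libsnappy.so.1 itself lives.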

Re: this version of libhadoop was built without snappy support.

Mentor

Please confirm that you have the following property set correctly in hadoop-env.sh.

Re: this version of libhadoop was built without snappy support.

Contributor

@Artem Ervits which property?

Re: this version of libhadoop was built without snappy support.

Mentor

Re: this version of libhadoop was built without snappy support.

New Contributor

I have compiled Hadoop again with Snappy support:

svn checkout http://svn.apache.org/repos/asf/hadoop/common/tags/release-2.5.0

mvn package -Drequire.snappy -Pdist,native,src -DskipTests -Dtar

but got the same exception again...

I have also checked hadoop-env.sh:

export JAVA_LIBRARY_PATH=${JAVA_LIBRARY_PATH}
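Note that as written, `export JAVA_LIBRARY_PATH=${JAVA_LIBRARY_PATH}` only re-exports whatever value is already in the environment, which is often empty — it does not point the JVM at the native libraries. A sketch of what the hadoop-env.sh entry could look like instead, assuming the HDP native directory quoted earlier in the thread:

```shell
# hadoop-env.sh (sketch): prepend the directory that actually contains
# libhadoop.so and libsnappy.so.1 so the JVM can find them at load time.
export JAVA_LIBRARY_PATH=/usr/hdp/2.3.4.0-3485/hadoop/lib/native:${JAVA_LIBRARY_PATH}
```

The exact directory is an assumption based on the `hadoop checknative -a` output above; use whatever path that command reports on your cluster.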

Re: this version of libhadoop was built without snappy support.

Contributor

The problem is solved by making the following change in the Spark config:

(attached screenshot: 2840-cparkconfig.jpg)

Thanks for the help guys!


Re: this version of libhadoop was built without snappy support.

Expert Contributor

Just want to add that spark.driver.extraClassPath seems not to be necessary, at least in my case, when I write Snappy-compressed files in Spark using:

rdd.saveAsTextFile(path, SnappyCodec.class)

Re: this version of libhadoop was built without snappy support.

Explorer

For me, adding the line below to spark-defaults.conf helped, based on the packages installed on my test cluster.

spark.executor.extraLibraryPath /usr/hdp/current/hadoop-client/lib/native/:/usr/hdp/current/share/lzo/0.6.0/lib/native/Linux-amd64-64/
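The executor setting above covers tasks running on the workers; if the driver also reads or writes Snappy-compressed files (for example when running in local mode), the driver-side counterpart may be needed as well. A sketch, reusing the same paths as the line above:

```shell
# spark-defaults.conf (sketch): give the driver the same native-library
# path as the executors so both sides can load libhadoop/libsnappy.
spark.driver.extraLibraryPath /usr/hdp/current/hadoop-client/lib/native/:/usr/hdp/current/share/lzo/0.6.0/lib/native/Linux-amd64-64/
```

Whether the driver setting is required depends on where the Snappy I/O happens; as noted in the reply above, the driver-side classpath/library settings were not needed in every case.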
