Thanks for the reply. In my case that's not the solution, because when I run
hadoop checknative -a
it reports snappy as true, located at /usr/hdp/184.108.40.206-3485/hadoop/lib/native/libsnappy.so.1.
We have the same problem.
> hadoop checknative -a
snappy: true /usr/hdp/220.127.116.11-3485/hadoop/lib/native/libsnappy.so.1
> rpm -qa snappy
What else can I check?
I have recompiled Hadoop with snappy support:
mvn package -Drequire.snappy -Pdist,native,src -DskipTests -Dtar
but I still get the same exception...
I have also checked hadoop-env.sh:
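(The actual hadoop-env.sh contents weren't included in the post above. For anyone checking the same thing, these are the kinds of lines worth looking for — the paths here are assumptions for a typical HDP layout, not taken from the original poster's cluster:)

```shell
# Hypothetical hadoop-env.sh sketch -- adjust the native dir to your install.
# The goal is that the JVM's java.library.path includes the directory
# containing libsnappy.so.1 and libhadoop.so.
export JAVA_LIBRARY_PATH=/usr/hdp/current/hadoop-client/lib/native:$JAVA_LIBRARY_PATH
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/hdp/current/hadoop-client/lib/native"
```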
The problem was solved by making the following change in the Spark config:
Thanks for the help guys!
Just want to add that spark.driver.extraClassPath does not seem to be necessary, at least in my case, when writing snappy-compressed files from Spark using:
For me, adding the line below to spark-defaults.conf helped, based on the packages installed on my test cluster.
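(The specific line wasn't preserved in this post. A commonly cited change of this kind — a sketch assuming an HDP-style layout, so verify the native directory on your own cluster — is to point both the driver and the executors at Hadoop's native library directory in spark-defaults.conf:)

```
# spark-defaults.conf -- hypothetical sketch; the path is an assumption
spark.driver.extraLibraryPath    /usr/hdp/current/hadoop-client/lib/native
spark.executor.extraLibraryPath  /usr/hdp/current/hadoop-client/lib/native
```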