
Compression not working due to tmp folder privileges


New Contributor

Hi Everyone,

I have a problem whenever I try to store my data in a compressed format with Pig, Sqoop, or Spark. I know the problem is that our /tmp folder is mounted noexec, and this causes Snappy, for instance, to give me this error:

java.lang.IllegalArgumentException: java.lang.UnsatisfiedLinkError: /tmp/snappy-1.1.2-fe4e30d0-e4a5-4b1a-ae31-fd1861117288-libsnappyjava.so: /tmp/snappy-1.1.2-fe4e30d0-e4a5-4b1a-ae31-fd1861117288-libsnappyjava.so: failed to map segment from shared object: Operation not permitted
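You can confirm the mount options yourself (a quick sketch, assuming a Linux host with `findmnt` from util-linux available):

```shell
# Print the mount options of the filesystem that holds /tmp.
# If "noexec" appears in the output, the JVM cannot map native
# libraries (like libsnappyjava.so) extracted there, which
# produces exactly the UnsatisfiedLinkError above.
findmnt -no OPTIONS --target /tmp
```

`--target` resolves to the mount containing /tmp even when /tmp is not its own mount point, so the command works whether or not /tmp is a separate filesystem.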

The solutions I found on the internet are either to remount /tmp as exec, which is not an option for me because the sysadmin won't allow it due to security concerns, or to change the Java temp directory in the JVM options to some path other than /tmp.

I have tried the following approach, but it didn't solve the problem: I added these lines to hadoop-env.sh and sqoop-env:

export HADOOP_OPTS="$HADOOP_OPTS -Dorg.xerial.snappy.tempdir=/newpath"
export HADOOP_OPTS="$HADOOP_OPTS -Djava.io.tmpdir=/newpath"
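My suspicion (an assumption on my part, in case it helps) is that this didn't work because HADOOP_OPTS is only picked up by the client-side JVM, while map and reduce tasks launched in YARN containers take their JVM flags from `mapreduce.map.java.opts` and `mapreduce.reduce.java.opts` instead. A sketch of what that could look like in mapred-site.xml, with /newpath standing in for a directory that exists and is exec-mounted on every worker node:

```xml
<!-- mapred-site.xml: sketch only. /newpath must exist, be writable,
     and be on an exec-mounted filesystem on every NodeManager host. -->
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Djava.io.tmpdir=/newpath -Dorg.xerial.snappy.tempdir=/newpath</value>
</property>
<property>
  <name>mapreduce.reduce.java.opts</name>
  <value>-Djava.io.tmpdir=/newpath -Dorg.xerial.snappy.tempdir=/newpath</value>
</property>
```

But I have not been able to verify this on our cluster yet.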

I would appreciate any other solutions that could resolve the issue.

Thanks

1 REPLY

Re: Compression not working due to tmp folder privileges

New Contributor

Hello Saad,

Did you find a solution?