
Compression not working due to /tmp folder privileges

New Contributor

Hi Everyone,

I have a problem whenever I try to store my data in a compressed format with Pig, Sqoop, or Spark. I know the problem is that our /tmp folder is mounted noexec, which causes Snappy, for instance, to give me this error:

java.lang.IllegalArgumentException: java.lang.UnsatisfiedLinkError: /tmp/ /tmp/ failed to map segment from shared object: Operation not permitted

The solutions I found on the internet are either to remount /tmp as exec, which is not an option for me since the sysadmin won't allow it due to security concerns, or to change the Java temp directory via the Java opts so it points somewhere other than /tmp.

I have tried the second approach, but it didn't solve the problem. I added this line to sqoop-env:

export HADOOP_OPTS="$HADOOP_OPTS -Dorg.xerial.snappy.tempdir=/newpath"
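To give the full picture, here is a sketch of the variant of this approach I mean, as shell. The path `$HOME/jvmtmp` is just an example; any writable directory on an exec-mounted filesystem would do, and I am assuming both the snappy-java property and the standard `java.io.tmpdir` are worth redirecting:

```shell
# Example path on an exec-mounted filesystem (assumption: writable by the job user).
TMP_EXEC_DIR="$HOME/jvmtmp"
mkdir -p "$TMP_EXEC_DIR"

# org.xerial.snappy.tempdir tells snappy-java where to unpack its native .so;
# java.io.tmpdir covers other native loaders that fall back to the JVM temp dir.
export HADOOP_OPTS="$HADOOP_OPTS -Djava.io.tmpdir=$TMP_EXEC_DIR -Dorg.xerial.snappy.tempdir=$TMP_EXEC_DIR"
export HADOOP_CLIENT_OPTS="$HADOOP_CLIENT_OPTS -Djava.io.tmpdir=$TMP_EXEC_DIR"
```

For Spark the analogous settings would presumably be `-Djava.io.tmpdir=...` passed via `spark.driver.extraJavaOptions` and `spark.executor.extraJavaOptions`, since the executors spawn their own JVMs.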

I would appreciate it if any of you have other solutions that could fix this issue.



New Contributor

Hello Saad,

Did you find a solution?