11-28-2017 06:02 PM
I have been trying to set an environment variable for Spark, but I am running into problems.
I am using HDFS/YARN from CDH 5.12 together with a standalone Spark (v2.2.0) and Crail (https://github.com/zrlio/crail). However, the YARN logs show an error saying that Crail's native library is not on java.library.path:
17/11/27 10:57:50 INFO ibm.crail: crail.storage.rdma.type passive
17/11/27 10:57:50 INFO ibm.disni: creating RdmaProvider of type 'nat'
Exception in thread "dag-scheduler-event-loop" java.lang.UnsatisfiedLinkError: no disni in java.library.path
I found a post in Crail's user group saying that this can be fixed by setting the following environment variable:
Here is the post:
Can you please guide me on where I should set this environment variable inside CDH?
I tried setting the environment variable in ~/.bashrc and in spark-env.sh, but it didn't work; it seems CDH resets all environment variables when starting services.
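For reference, this is roughly what I put in spark-env.sh (I am assuming LD_LIBRARY_PATH is the variable the Crail post refers to, and /opt/crail/lib is just a placeholder for wherever libdisni.so is actually installed on my nodes):

# placeholder path; prepend the Crail/DiSNI native library directory
export LD_LIBRARY_PATH=/opt/crail/lib:$LD_LIBRARY_PATH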
I also tried setting the environment variable in every place I could find inside CDH, including the Environment configuration for the Cloudera Management Service, YARN, and HDFS, but the problem is still not solved.
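I am also wondering whether Spark's own library path properties would be a better place for this than environment variables. Something like the following in spark-defaults.conf (or passed with --conf to spark-submit) is what I had in mind; again, /opt/crail/lib is only a placeholder for the actual location of libdisni.so:

# append the native library directory to java.library.path for driver and executors
spark.driver.extraLibraryPath    /opt/crail/lib
spark.executor.extraLibraryPath  /opt/crail/lib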