Support Questions

Problem running pyspark spark-submit in a virtualenv?



I appear to be having issues running a pyspark script in a virtualenv. I have a fairly simple script:

import findspark
import random
from pyspark import SparkConf
from pyspark import SparkContext
from os import environ

environ["HADOOP_CONF_DIR"] = "/etc/hadoop/"
findspark.init("/usr/hdp/current/spark2-client")
conf = SparkConf()
conf.setMaster('yarn-client')
conf.set("spark.hadoop.yarn.resourcemanager.address", "")
conf.setAppName('spark-yarn-demo001')
sc = SparkContext(conf=conf)

def inside(p):
    x, y = random.random(), random.random()
    return x*x + y*y < 1

NUM_SAMPLES = 1000
count = sc.parallelize(xrange(0, NUM_SAMPLES)) \
          .filter(inside).count()

print "Pi is roughly %f" % (4.0 * count / NUM_SAMPLES)
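For what it's worth, the Monte Carlo sampling logic itself checks out locally without Spark. Here is the plain-Python sanity check I ran (written for Python 3, unlike the Python 2 script above; the larger sample count and fixed seed are just to get a stable estimate):

```python
import random

def inside(_):
    # Draw a point uniformly in the unit square; True if it lands
    # inside the quarter circle of radius 1.
    x, y = random.random(), random.random()
    return x * x + y * y < 1

NUM_SAMPLES = 100000
random.seed(0)  # fixed seed so the estimate is reproducible
count = sum(1 for i in range(NUM_SAMPLES) if inside(i))
pi_estimate = 4.0 * count / NUM_SAMPLES
print("Pi is roughly %f" % pi_estimate)
```

So the failure seems specific to the cluster/virtualenv setup, not the script's logic.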

(I know the yarn.resourcemanager.address port is usually 8032, but on my HDP 3.1 cluster the default configs put it at 8050.) I'm trying to run this via spark-submit:

(spark_demo_venv) [hdfs@HW04 tmp]$ which python
~/tmp/spark_demo_venv/bin/python
(spark_demo_venv) [hdfs@HW04 tmp]$ /usr/hdp/current/spark2-client/bin/spark-submit /home/hdfs/tmp/

Seeing errors like...

19/08/27 14:31:57 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on HW03.ucera.local:35043 (size: 4.0 KB, free: 366.3 MB)
19/08/27 14:31:58 WARN TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1, HW03.ucera.local, executor 1): Cannot run program "/home/hdfs/tmp/spark_demo_venv/bin/python": error=13, Permission denied
    at java.lang.ProcessBuilder.start(
    at org.apache.spark.api.python.PythonWorkerFactory.startDaemon(PythonWorkerFactory.scala:186)

The fact that the virtualenv's python binary is what Spark appears to have trouble running makes me think something is wrong with trying to run pyspark from a virtualenv (I can't test without the virtualenv, since I don't have permission to pip install on this cluster node). Any other debugging suggestions or fixes for figuring this out?
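In case it helps anyone reproduce or rule things out: one variation I have been meaning to try is pointing Spark at the venv interpreter explicitly via spark.pyspark.python (a Spark 2.1+ configuration property; the script filename below is a stand-in, since my command above got cut off in the paste):

```shell
/usr/hdp/current/spark2-client/bin/spark-submit \
  --master yarn \
  --deploy-mode client \
  --conf spark.pyspark.python=/home/hdfs/tmp/spark_demo_venv/bin/python \
  /home/hdfs/tmp/my_script.py   # stand-in filename
```

Note that in yarn-client mode the executors on the other nodes (HW03 here) would still need to be able to read and execute that interpreter path, so this may just reproduce the same error=13.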