
Re: Running PySpark with Conda Env issue

Cloudera Employee

Hello @PabloMO ,


As the error in the spoiler you posted shows, the Python versions do not match.

You can check which interpreter is active by running "which python".
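
If you want to confirm exactly which interpreter and version a session is using, a short check like this can help (a minimal sketch; run it in the same shell where you launch pyspark, or inside the pyspark shell itself to see the driver's Python):

```python
import sys

# Print the interpreter path and version of the current process.
# Inside a pyspark shell this is the driver's Python; the executors
# must run a matching version, which is what PYSPARK_PYTHON controls.
print(sys.executable)
print(".".join(str(v) for v in sys.version_info[:3]))
```

If the version printed here differs from the one your executors use, you will hit the version-mismatch error.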


You can override the two configs below in /opt/cloudera/parcels/CDH-<version>/lib/spark/conf/

and then restart pyspark.

export PYSPARK_PYTHON=<same version of python>
export PYSPARK_DRIVER_PYTHON=<same version of python>
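
For example, with a conda environment the two exports might look like this (the env name and path below are assumptions for illustration; substitute the path reported by "which python" from your activated environment):

```shell
# Hypothetical conda env path -- replace with the output of `which python`
# from the activated environment you want Spark to use.
export PYSPARK_PYTHON=/opt/anaconda3/envs/myenv/bin/python
export PYSPARK_DRIVER_PYTHON=/opt/anaconda3/envs/myenv/bin/python
```

Both variables should point to the same interpreter so the driver and the executors run identical Python versions.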


Hope it helps.


Thanks & Regards,

