I just recently set up my cluster, and when I attempt to run a Spark job using Python I get errors saying that PYTHONHASHSEED needs to be set. Using the Spark config UI in CM I added the following two lines to my spark-env.sh, and I also added them to my ~/.bashrc on each node.
Unfortunately, I am still getting errors. Any advice?
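For reference, this is the general shape of what I tried (a sketch, not my exact lines; 0 is just an arbitrary fixed seed):

```shell
# spark-env.sh — sketch of the kind of entries described above
export PYTHONHASHSEED=0   # any fixed integer works, but it must match on the driver and every executor

# The executors also need the variable at submit time, e.g. via the
# documented spark.executorEnv.* mechanism:
#   spark-submit --conf spark.executorEnv.PYTHONHASHSEED=0 my_job.py
```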