We're running Cloudera Manager (CM) 5.14.2 with CDH 5.13.3.
Both Spark 1.6 and Spark 2.3.3 are installed (some apps still use Spark 1.6, so we can't remove it yet).
When I start pyspark with the Spark 2 config file:

pyspark --properties-file /etc/spark2/conf/spark-defaults.conf

the shell still comes up as Spark 1.6 — right after the ASCII Spark logo it prints: version 1.6.0
In verbose mode, however, the reported paths all point to Spark 2.
Why is pyspark still launching Spark 1.6?
How can I force it to use Spark 2.3.3?
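For context, here is what I suspect (please correct me if I'm wrong): the plain pyspark command on the PATH belongs to the Spark 1.6 installation, and --properties-file only changes configuration properties, not which binaries are launched. So I would expect something like the following to be needed — the parcel path and the pyspark2 launcher name are assumptions based on the standard Cloudera SPARK2 parcel layout; adjust if your setup differs:

```shell
# Point the environment at the Spark 2 parcel explicitly
# (path assumes the default Cloudera parcel location -- adjust if yours differs):
export SPARK_HOME=/opt/cloudera/parcels/SPARK2/lib/spark2
export PATH="$SPARK_HOME/bin:$PATH"

# The SPARK2 parcel ships a separate launcher, pyspark2, so both versions
# can coexist; presumably this starts the 2.3.3 shell:
# pyspark2 --properties-file /etc/spark2/conf/spark-defaults.conf

# Sanity check: confirm which Spark home the shell will use
echo "$SPARK_HOME"
```

Is that the right approach, or is there a cleaner way to make plain pyspark resolve to Spark 2?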