
PySpark Jupyter integration

Expert Contributor

I have successfully integrated PySpark with Jupyter: when I type pyspark in the terminal, it redirects me to the Jupyter web UI. But when I submit Python files using spark-submit on the CLI, it shows the error below:


[admin@host ~/Desktop]$ spark-submit
WARNING: User-defined SPARK_HOME (/opt/cloudera/parcels/CDH-5.7.0-1.cdh5.7.0.p0.45/lib/spark) overrides detected (/opt/cloudera/parcels/CDH/lib/spark/).
WARNING: Running spark-class from user-defined location.
jupyter: '/home/admin/Desktop/' is not a Jupyter command
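The "is not a Jupyter command" error suggests that spark-submit is inheriting the same driver-Python settings that make pyspark open Jupyter. A typical integration looks like the sketch below (an assumption reconstructed from the error, not confirmed settings; my_script.py is a placeholder name):

```shell
# Assumed Jupyter/PySpark integration in ~/.bashrc (a common setup, not verified):
# these make the `pyspark` shell launch a Jupyter notebook server instead of a REPL.
export PYSPARK_DRIVER_PYTHON=jupyter
export PYSPARK_DRIVER_PYTHON_OPTS='notebook'

# spark-submit reads the same variables, so it ends up invoking
# `jupyter <path-to-script>`, which produces the error above.
# A common workaround is to override the driver Python for that one command:
PYSPARK_DRIVER_PYTHON=python spark-submit my_script.py
```

If this matches the actual profile configuration, unsetting the two variables (or overriding them per command as shown) should let spark-submit run normally while keeping the Jupyter behavior for interactive pyspark sessions.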

Please help me fix this issue.