Expert Contributor
Posts: 82
Registered: ‎02-24-2016

PySpark jupyter integration

I have successfully integrated PySpark with Jupyter: when I type pyspark in the terminal, it redirects me to the Jupyter web UI. But when I submit Python files using spark-submit on the CLI, it shows the error below:


[admin@host ~/Desktop]$ spark-submit simpleApp.py
WARNING: User-defined SPARK_HOME (/opt/cloudera/parcels/CDH-5.7.0-1.cdh5.7.0.p0.45/lib/spark) overrides detected (/opt/cloudera/parcels/CDH/lib/spark/).
WARNING: Running spark-class from user-defined location.
jupyter: '/home/admin/Desktop/simpleApp.py' is not a Jupyter command

Please help me fix this issue.
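For context, this is roughly how the Jupyter integration is set up on my machine (a sketch of the usual environment-variable approach; the exact values in my profile may differ):

```shell
# Typical PySpark/Jupyter integration: make the pyspark launcher start
# Jupyter as the driver Python instead of the plain interpreter.
# NOTE: when these are set globally, spark-submit inherits them too,
# which is why it passes simpleApp.py to jupyter instead of python.
export PYSPARK_DRIVER_PYTHON=jupyter
export PYSPARK_DRIVER_PYTHON_OPTS="notebook"
```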