Support Questions

Find answers, ask questions, and share your expertise

Import error on pyspark and Zeppelin for local module

Explorer

I am moving to pyspark and Zeppelin. I created two notebooks, my_settings.py and main.py.

But when I do the following in main.py:

%pyspark 
import my_settings 

ImportError: No module named my_settings

I get an import error saying the module was not found.

This works fine on my local server.

I wonder if there is an environment setting needed for this to work.

1 ACCEPTED SOLUTION

avatar

@bharat sharma Notebooks are not Python modules. If you are trying to import a notebook as if it were a Python module, AFAIK that won't work.

If you are trying to import modules into a pyspark application, there are different ways to do this. One way is to copy the Python file to HDFS and use the following:

%pyspark
sc.addPyFile("/user/zeppelin/my_settings.py")
import my_settings 
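As a local illustration of why this works: sc.addPyFile() distributes the file to the driver and executors and makes its location importable. The same mechanism can be sketched in plain Python without a cluster — the file name and the setting value here are made up for the example:

```python
import importlib
import os
import sys
import tempfile

# Create a stand-in my_settings.py in a temporary directory
# (the APP_NAME value is invented for this illustration).
tmp_dir = tempfile.mkdtemp()
with open(os.path.join(tmp_dir, "my_settings.py"), "w") as f:
    f.write('APP_NAME = "demo"\n')

# sc.addPyFile() effectively does this on each node: the file is
# shipped over and its directory is put on the Python import path.
sys.path.insert(0, tmp_dir)

my_settings = importlib.import_module("my_settings")
print(my_settings.APP_NAME)  # -> demo
```

In Zeppelin the difference is only that the path passed to sc.addPyFile() is an HDFS path, so the file must be uploaded to HDFS first.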

HTH



2 REPLIES



@bharat sharma If the above answer addressed your question, please take a moment to log in and click the "accept" link on the answer.