Created 05-25-2018 11:00 AM
I am moving to pyspark & Zeppelin. I created two notebooks, my_settings.py and main.py. But when I run the following in main.py:

%pyspark
import my_settings

I get an import error:

ImportError: No module named my_settings

This works fine on my local machine. I wonder if there is an environment setting needed for this to work.
Created 05-25-2018 03:21 PM
@bharat sharma Notebooks are not Python modules. If you are trying to import a notebook as if it were a Python module, AFAIK that won't work.

If you are trying to import modules into a pyspark application, there are several ways to do it. One way is to copy the Python file to HDFS and use the following:

%pyspark
sc.addPyFile("/user/zeppelin/my_settings.py")
import my_settings
HTH
*** If you found this answer addressed your question, please take a moment to login and click the "accept" link on the answer.
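To see what sc.addPyFile does conceptually (it ships the file to the driver and executors and puts it on the Python module search path), here is a plain-Python sketch with no Spark required; the my_settings.py contents shown are hypothetical:

```python
import importlib
import sys
import tempfile
from pathlib import Path

# Hypothetical my_settings.py contents -- in Zeppelin this file would
# live on HDFS and be distributed via sc.addPyFile().
SETTINGS_SRC = 'DB_HOST = "localhost"\nDB_PORT = 5432\n'

# Write the module to a temp directory and put that directory on
# sys.path -- roughly what addPyFile arranges on each executor.
tmp_dir = tempfile.mkdtemp()
Path(tmp_dir, "my_settings.py").write_text(SETTINGS_SRC)
sys.path.insert(0, tmp_dir)

# Now the import succeeds because the file is on the search path.
my_settings = importlib.import_module("my_settings")
print(my_settings.DB_HOST, my_settings.DB_PORT)
```

The original ImportError happens because a Zeppelin notebook is not a file on the interpreter's Python path; addPyFile (or spark.submit.pyFiles) is what puts a real .py file there.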