I have a project structure like
and there are other .py files.
When I do
How can I import math.py, or any class from it?
I think you will have to add the complete path, not only the file name:
Add a .py or .zip dependency for all tasks to be executed on this SparkContext in the future. The path passed can be either a local file, a file in HDFS (or other Hadoop-supported filesystems), or an HTTP, HTTPS or FTP URI.
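As an illustration of why a .zip works: Python can import directly from a zip archive once that archive is on sys.path, which is essentially what addPyFile arranges on the executors. A minimal sketch in plain Python (the module name mymath and the archive name deps.zip are made up for the example):

```python
import os
import sys
import tempfile
import zipfile

# Build a throwaway zip containing a single top-level module
# (a stand-in for a file like math.py in your project).
tmpdir = tempfile.mkdtemp()
zip_path = os.path.join(tmpdir, "deps.zip")
with zipfile.ZipFile(zip_path, "w") as zf:
    zf.writestr("mymath.py", "def add(a, b):\n    return a + b\n")

# Putting the archive itself on sys.path makes its top-level modules
# importable, much as SparkContext.addPyFile does on every executor.
sys.path.insert(0, zip_path)

import mymath
print(mymath.add(2, 3))  # 5
```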
Yes, I can add the zip file along with the path. But there are modules inside the compressed folder. How do I import those?
In plain Python I can do . I have an __init__.py in every folder starting from the second 'foo'.
In pyspark, addPyFile should work the same way as a direct import in plain Python, so could you provide more details on your issue? Is addPyFile working but the import failing? Do you get an error message?
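One common cause of the import failing for a nested package is the archive layout: the package directories (each containing its __init__.py) must sit at the root of the zip, so that 'foo' is a top-level name inside the archive. A runnable sketch of the working layout in plain Python (the names foo, util, and pkg.zip are made up for the example):

```python
import os
import sys
import tempfile
import zipfile

# Build a zip whose root contains a nested package tree: foo/foo/util.py,
# with an __init__.py in every package directory.
tmpdir = tempfile.mkdtemp()
zip_path = os.path.join(tmpdir, "pkg.zip")
with zipfile.ZipFile(zip_path, "w") as zf:
    zf.writestr("foo/__init__.py", "")
    zf.writestr("foo/foo/__init__.py", "")
    zf.writestr("foo/foo/util.py", "def greet():\n    return 'hello'\n")

# With the zip root on sys.path, the nested package imports normally,
# mirroring what happens on executors after the zip is shipped via addPyFile.
sys.path.insert(0, zip_path)

from foo.foo import util
print(util.greet())  # hello
```

If instead the zip contains an extra wrapper directory above the first package (e.g. project/foo/...), the import fails because 'foo' is no longer at the archive root.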