Created 08-06-2018 04:28 PM
I have a project structure like
foo/foo/apps/lib/math.py
and there are other .py files as well.
When I do
sc.addPyFile("foo.zip")
how can I import math.py or any class from it?
Created 08-07-2018 07:06 AM
I think you will have to add the complete path, not only the file name. From the PySpark docs:

addPyFile(path)
Add a .py or .zip dependency for all tasks to be executed on this SparkContext in the future. The path passed can be either a local file, a file in HDFS (or other Hadoop-supported filesystems), or an HTTP, HTTPS or FTP URI.
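For example, a minimal sketch (the paths below are just assumptions for illustration):

from pyspark import SparkContext

sc = SparkContext(appName="addPyFile-example")

# Local file: give the full path, not only the file name
sc.addPyFile("/home/user/foo.zip")

# A file in HDFS or an HTTP/HTTPS/FTP URI works the same way, e.g.:
# sc.addPyFile("hdfs:///user/me/foo.zip")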
Created 08-07-2018 07:26 AM
Yes, I can add the zip file along with the path. But there are modules inside the compressed folder; how do I import them?
In plain Python I can do the following (I have an __init__.py in every folder starting from the second 'foo'):
import foo.apps.lib.math
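For reference, a sketch of the layout and of what I am trying to do in PySpark (the zip command is only an example of how I assume the archive would be built, with the inner 'foo' package at the zip root):

# Project layout:
#   foo/                      <- project root
#   └── foo/
#       ├── __init__.py
#       └── apps/
#           ├── __init__.py
#           └── lib/
#               ├── __init__.py
#               └── math.py
#
# Example zip build so the inner 'foo' sits at the archive root:
#   cd /path/to/project/foo && zip -r ../foo.zip foo

sc.addPyFile("foo.zip")
import foo.apps.lib.math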
Created 08-09-2018 05:10 AM
In PySpark, addPyFile should work the same way as in plain Python. So maybe you can provide more details on your issue? Is addPyFile working but the import fails? Do you get an error message?
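To narrow it down, you could try a small check like the one below (the zip path and package name are taken from your posts above, so adjust them to your setup). It forces the import inside a task, so you can see whether the dependency reaches the executors:

from pyspark import SparkContext

sc = SparkContext(appName="import-check")
sc.addPyFile("foo.zip")  # use the full or HDFS path if the file is not in the working directory

def check(_):
    # importing inside the task makes sure it runs on the executor's Python
    import foo.apps.lib.math as m
    return m.__name__

# Should print ['foo.apps.lib.math', 'foo.apps.lib.math'] if the zip is shipped correctly
print(sc.parallelize(range(2), 2).map(check).collect())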