add zip folder and import specific module / class
Created 08-06-2018 04:28 PM
I have a project structure like
foo/foo/apps/lib/math.py
and there are other .py files as well. When I do
sc.addPyFile("foo.zip")
how can I import math.py, or any class from it?
Created 08-07-2018 07:06 AM
I think you will have to add the complete path, not only the file name:

addPyFile(path)
Add a .py or .zip dependency for all tasks to be executed on this SparkContext in the future. The path passed can be either a local file, a file in HDFS (or other Hadoop-supported filesystems), or an HTTP, HTTPS or FTP URI.
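
As a minimal sketch (the HDFS path here is a placeholder, not from the original post), assuming the zip was built so that the top-level package sits at its root:

from pyspark import SparkContext

sc = SparkContext(appName="addPyFile-example")

# Ship the zip to every executor; per the docs above, the path can be
# local, on HDFS, or an HTTP/HTTPS/FTP URI.
sc.addPyFile("hdfs:///user/me/deps/foo.zip")

# addPyFile also puts the zip on the driver's sys.path, so the
# packages inside it can now be imported directly.
import foo.apps.lib.math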
Created 08-07-2018 07:26 AM
Yes, I can add the zip file along with its path, but there are modules inside the compressed folder. How do I import those? In plain Python I can do the import below; I have an __init__.py in every folder starting from the second 'foo':

import foo.apps.lib.math
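
The usual catch with this setup is the layout of the zip: the import only works if the package directory (the second 'foo') sits at the root of the archive, not nested under the outer project folder. A sketch of building such a zip, assuming the outer 'foo' is the project checkout you run this from:

import shutil

# Hypothetical layout: the project checkout is ./foo and the importable
# package is the inner ./foo/foo. The archive must end up looking like:
#   foo.zip
#     foo/__init__.py
#     foo/apps/__init__.py
#     foo/apps/lib/__init__.py
#     foo/apps/lib/math.py
#
# root_dir is the parent of the package, so the inner foo/ lands at the
# top of the archive.
shutil.make_archive("foo", "zip", root_dir="foo")

Note that naming the module math.py is safe here only because it is always imported as foo.apps.lib.math; a bare import math would still resolve to the standard library.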
Created 08-09-2018 05:10 AM
In pyspark, addPyFile should work the same way as imports do in plain Python. Can you provide more details on your issue? Is addPyFile itself working but the import failing? Do you get an error message?
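
If it helps to narrow things down, here is a minimal check (the zip path is a placeholder): if the driver-side import fails, the zip layout is the usual suspect; if only the executor-side import fails, the file is probably not reaching the workers.

from pyspark import SparkContext

sc = SparkContext(appName="zip-import-check")
sc.addPyFile("foo.zip")  # or the full local/HDFS path

# Check the import on the driver first.
import foo.apps.lib.math

# Then check it on the executors: this function is serialized and run
# on the workers, where the shipped zip should already be on sys.path.
def uses_lib(x):
    import foo.apps.lib.math  # raises ImportError on the worker if not shipped
    return x

print(sc.parallelize([1, 2, 3]).map(uses_lib).collect())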