Support Questions


GraphFrames with PySpark


We are trying to use the graphframes package with pyspark. For some reason it doesn't work in our production environment. It works in our dev environment, where we can use the --packages option and it downloads the libraries from an external repository. We cannot use the --packages option in production because it is not connected to the internet. It does work with Scala in production.

The default Python version is 2.6.6 and the HDP version is 2.4.2.

pyspark --packages graphframes:graphframes:0.2.0-spark1.6-s_2.10

I copied all the jars downloaded with the --packages option in dev and passed them to --jars in the pyspark command in production, but it doesn't work. The same command works in dev and in Spark on my Mac.

pyspark \
  --py-files /tmp/thirdpartyjars/graphframes_graphframes-0.2.0-spark1.6-s_2.10.jar,/tmp/thirdpartyjars/com.typesafe.scala-logging_scala-logging-api_2.10-2.1.2.jar,/tmp/thirdpartyjars/com.typesafe.scala-logging_scala-logging-slf4j_2.10-2.1.2.jar,/tmp/thirdpartyjars/org.scala-lang_scala-reflect-2.10.4.jar,/tmp/thirdpartyjars/org.slf4j_slf4j-api-1.7.7.jar \
  --jars /tmp/thirdpartyjars/graphframes_graphframes-0.2.0-spark1.6-s_2.10.jar,/tmp/thirdpartyjars/com.typesafe.scala-logging_scala-logging-api_2.10-2.1.2.jar,/tmp/thirdpartyjars/com.typesafe.scala-logging_scala-logging-slf4j_2.10-2.1.2.jar,/tmp/thirdpartyjars/org.scala-lang_scala-reflect-2.10.4.jar,/tmp/thirdpartyjars/org.slf4j_slf4j-api-1.7.7.jar

Console log:

Using Python version 2.6.6 (r266:84292, May 22 2015 08:34:51)
SparkContext available as sc, HiveContext available as sqlContext.
>>> from graphframes import *
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
zipimport.ZipImportError: can't find module 'graphframes'
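A ZipImportError like the one above usually means the archive passed via --py-files does not have the Python package at its root, so zipimport cannot resolve it. A minimal sketch of how Python imports from an archive on sys.path (the package name "demo_pkg" is a stand-in, not part of graphframes):

```python
import os
import sys
import tempfile
import zipfile

# Build a zip whose ROOT contains a package -- the layout --py-files expects.
tmp = tempfile.mkdtemp()
zip_path = os.path.join(tmp, "pyfiles.zip")
with zipfile.ZipFile(zip_path, "w") as zf:
    # "demo_pkg" stands in for the graphframes package directory.
    zf.writestr("demo_pkg/__init__.py", "VALUE = 42\n")

sys.path.insert(0, zip_path)  # roughly what --py-files does on each executor
import demo_pkg               # resolved by zipimport from inside the archive
print(demo_pkg.VALUE)         # -> 42
```

If the package is nested (e.g. python/demo_pkg/...) instead of sitting at the zip root, the same import raises the "can't find module" error shown in the log.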

1 ACCEPTED SOLUTION


@Deepak Subhramanian

I'd recommend upgrading your Python version to 2.7 or higher (preferably Anaconda).

I was able to recreate your error, and it was resolved when I upgraded from Python 2.6 to Anaconda Python 2.7. Let me know if this does the trick for you!


4 REPLIES



Thanks Dan. It works in our dev environment, which is also on Python 2.6.6. When I expanded the graphframes jar and ran pyspark from the graphframes directory, I got the "Bad magic number" error, which points to a version mismatch. But since it worked in our dev environment on 2.6, I think it is possible to get it working with 2.6. I wonder whether the --packages option does something extra to the Python packages after downloading them to make them work with Python 2.6.

We are looking at getting Anaconda on our cluster, but the upgrade will take time, as production has a process to make sure a Python upgrade doesn't affect Ambari and the cluster.

from graphframes import *
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: Bad magic number in graphframes/__init__.pyc
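The "Bad magic number" error means the .pyc files in the jar were byte-compiled by a different interpreter version than the one importing them. A hedged sketch of how to check the magic header (`importlib.util.MAGIC_NUMBER` is the Python 3 API; on Python 2.6 the equivalent is `imp.get_magic()`):

```python
import importlib.util
import os
import py_compile
import tempfile

# Compile a throwaway module, then compare the 4-byte magic header of the
# resulting .pyc against the running interpreter's expected magic number.
src = os.path.join(tempfile.mkdtemp(), "probe.py")
with open(src, "w") as f:
    f.write("x = 1\n")
pyc = py_compile.compile(src)  # returns the path to the compiled .pyc

with open(pyc, "rb") as f:
    magic = f.read(4)

# A mismatch between these two values is exactly what raises
# "Bad magic number" at import time.
print(magic == importlib.util.MAGIC_NUMBER)  # -> True for a local compile
```

Checking the first four bytes of graphframes/__init__.pyc this way would reveal which interpreter version produced it.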

https://community.hortonworks.com/questions/9368/ambari-server-install-requires-python26.html


@Deepak Subhramanian

I got this to work in Python 2.6 by following these steps:

1.) Download the graphframes-0.2.0-spark1.6-s_2.10.zip file from here

2.) Download the graphframes-0.2.0-spark1.6-s_2.10.jar file from here

3.) Unzip graphframes-0.2.0-spark1.6-s_2.10.zip

4.) Navigate to the python directory:

cd ./graphframes-0.2.0-spark1.6-s_2.10/python

5.) Zip up the contents contained within this directory:

zip mypyfiles.zip * -r

6.) Launch pyspark:

./bin/pyspark --py-files mypyfiles.zip --jars graphframes-0.2.0-spark1.6-s_2.10.jar
 

Give that a shot - let me know how it goes.


@Dan Zaratsian

That worked. Thanks a lot.