PySpark 1.5

New Contributor

I am using PySpark 1.5 on CDH 5.5.2 and accessing Spark through a Python Jupyter notebook, which I have installed only on my driver node. I am now getting an error when I try to execute jobs that I did not get before the move from CDH 5.5 to CDH 5.5.2:

 

"Error from python worker:

/usr/local/bin/python3: No module named 'zlib' "