
PySpark dies after a few hours


I ran PySpark code and the first time it was fine, but the second time it died and showed the error below in every single cell of my Zeppelin notebook, and also in other notebooks I am running with PySpark. I have to restart the interpreter to fix it.


Traceback (most recent call last):
  File "/var/folders/zh/dvdnf74d1t9cq78hjjm3xft80000gn/T/", line 343
    sc.setJobGroup(jobGroup, "Zeppelin")
  File "/Users/titusfong/spark/python/pyspark/", line 902, in setJobGroup
    self._jsc.setJobGroup(groupId, description, interruptOnCancel)
AttributeError: 'NoneType' object has no attribute 'setJobGroup'
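For context on what this traceback means: the Python-side SparkContext keeps a handle to its Java counterpart in `sc._jsc`, and stopping the context (for example after a crash or an interpreter idle timeout) clears that handle. Any later call that goes through the JVM then fails with exactly this `NoneType` error. Here is a toy reproduction; `FakeSparkContext` is an illustrative stand-in, not the real PySpark class:

```python
class FakeSparkContext:
    """Toy stand-in for pyspark.SparkContext, illustrating the failure mode only."""

    def __init__(self):
        self._jsc = object()  # stands in for the Py4J handle to the Java context

    def stop(self):
        # PySpark's SparkContext.stop() likewise drops its Java handle,
        # so every later JVM-backed call dereferences None.
        self._jsc = None

    def setJobGroup(self, group_id, description):
        return self._jsc.setJobGroup(group_id, description)


sc = FakeSparkContext()
sc.stop()
try:
    sc.setJobGroup("zeppelin-job", "Zeppelin")
except AttributeError as e:
    print(e)  # 'NoneType' object has no attribute 'setJobGroup'
```

So the error itself is a symptom: something stopped the underlying SparkContext earlier, and Zeppelin's `setJobGroup` call is just the first thing to touch it afterwards. Restarting the interpreter works because it creates a fresh context.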


Rising Star


Can you please upload the full error message that appears above the Traceback?
