Support Questions


PySpark dies after a few hours


I ran PySpark code and the first time it was fine. The second time it died and showed the error below in every single cell of my Zeppelin notebook, and also in other notebooks I am running with PySpark. I have to restart the interpreter to fix this.


Traceback (most recent call last):
  File "/var/folders/zh/dvdnf74d1t9cq78hjjm3xft80000gn/T/", line 343, in <module>
    sc.setJobGroup(jobGroup, "Zeppelin")
  File "/Users/titusfong/spark/python/pyspark/", line 902, in setJobGroup
    self._jsc.setJobGroup(groupId, description, interruptOnCancel)
AttributeError: 'NoneType' object has no attribute 'setJobGroup'
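For context, the `AttributeError` means `sc._jsc` (the handle to the JVM-side SparkContext) is `None`, so any call forwarded through it fails; this typically happens when the underlying context has been stopped while the notebook still holds the old `sc`. A minimal sketch of that failure mode and a defensive guard, using a stand-in class rather than a real SparkContext (all names here are illustrative, not Zeppelin's actual code):

```python
class FakeContext:
    """Stand-in for a SparkContext whose JVM handle can go away."""

    def __init__(self):
        # None models a stopped (or never-started) JVM-side context,
        # which is the state the traceback above reports.
        self._jsc = None

    def setJobGroup(self, group_id, description):
        # Without this guard, calling a method on None reproduces:
        # AttributeError: 'NoneType' object has no attribute 'setJobGroup'
        if self._jsc is None:
            raise RuntimeError("SparkContext was stopped; restart the interpreter")
        self._jsc.setJobGroup(group_id, description)


sc = FakeContext()
try:
    sc.setJobGroup("zeppelin-job", "Zeppelin")
except RuntimeError as err:
    print(err)  # SparkContext was stopped; restart the interpreter
```

This is only a diagnostic sketch: it suggests the fix is not in the notebook code itself but in whatever stopped the context (interpreter timeout, a crashed driver, or an explicit `sc.stop()` elsewhere).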




Can you please post the full error message that appears above the Traceback?