11-22-2017
09:02 AM
Thanks! Setting it via the spark-submit script fixed the problem!
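For anyone landing here later, the fix referred to above is passing the overhead on the spark-submit command line. A minimal sketch (the job name, value of 1024 MB, and paths are placeholders, not from this thread):

```shell
# Raise the per-executor off-heap allowance that YARN enforces.
# 1024 MB is only an example value; tune it to your workload.
spark-submit \
  --master yarn \
  --conf spark.yarn.executor.memoryOverhead=1024 \
  my_job.py
```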
11-22-2017
05:51 AM
Okay, so how can I increase the memory overhead in a Jupyter Notebook? I am not using spark-submit for this job. And how can I find out what the current overhead settings are? Thanks!
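Not an official answer, but a sketch of the two usual options for a notebook (no spark-submit), assuming Spark 1.6 on YARN. The 1024 MB value is just an example; the helper below only mirrors Spark 1.6's documented default (max of 384 MB and 10% of executor memory), which is what you get when the setting is absent:

```python
import os

# Spark 1.6 default when spark.yarn.executor.memoryOverhead is unset:
# max(384 MB, 10% of spark.executor.memory). Handy for checking what
# your executors are currently getting (values in MB).
def default_overhead_mb(executor_memory_mb):
    return max(384, int(executor_memory_mb * 0.10))

print(default_overhead_mb(1024))  # 1 GB executor -> the 384 MB floor

# Option 1: set submit args before the notebook kernel creates the
# SparkContext (must happen before any pyspark import/context start).
os.environ["PYSPARK_SUBMIT_ARGS"] = (
    "--conf spark.yarn.executor.memoryOverhead=1024 pyspark-shell"
)

# Option 2: set it on the SparkConf used to build the context:
# from pyspark import SparkConf, SparkContext
# conf = SparkConf().set("spark.yarn.executor.memoryOverhead", "1024")
# sc = SparkContext(conf=conf)
```

Either way, the setting only takes effect for a freshly started SparkContext, so restart the notebook kernel after changing it.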
11-22-2017
05:01 AM
Hey, I am having the same issue on Spark 1.6 with Cloudera Express 5.7.1:

ExecutorLostFailure (executor 60 exited caused by one of the running tasks)
Reason: Container killed by YARN for exceeding memory limits. 1.5 GB of 1.5 GB physical memory used.
Consider boosting spark.yarn.executor.memoryOverhead.

I see your solution but cannot find that setting in CM. Can you please point me to where that option is in the Cloudera Manager UI? Thanks, Marcin