
spark job failure with no space left on device


I am running a Spark job that keeps failing with a "no space left on device" error, even though the device appears to have plenty of free space. I have checked with both df -h and df -i and see no issue with disk space or inodes.

Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 656 in stage 11.0 failed 4 times, most recent failure: Lost task 656.3 in stage 11.0 (TID 680, I<workernode>): java.io.IOException: No space left on device



Hi @Anurag Mishra

Spark writes its intermediate and shuffle files to /tmp by default (controlled by spark.local.dir), and that filesystem is most likely the one that ran out of space, not the one you checked. You can point spark.local.dir at a directory with more space, either per job at submission time or permanently. To test it, re-run the same job with this added to your spark-submit command: --conf "spark.local.dir=/directory/with/space"
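
For example, a full spark-submit invocation with the override might look like the following; the class name, master, jar, and directory path are placeholders for your own values:

spark-submit \
  --class com.example.MySparkJob \
  --master yarn \
  --conf "spark.local.dir=/data/spark-tmp" \
  my-spark-job.jar

The directory you choose must already exist and be writable by the user running the Spark executors on every worker node.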
If that works, you can make the change permanent by adding the property to the custom spark-defaults section in Ambari: spark.local.dir=/directory/with/space
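
In Ambari this property ends up in spark-defaults.conf, where each entry is a key and a value separated by whitespace; assuming the same placeholder path as above, the line would look like:

spark.local.dir /data/spark-tmp

Note that on YARN the executors' local directories can be overridden by the LOCAL_DIRS environment variable set by the cluster manager (see the configuration docs linked below), so check that setting as well if the override does not seem to take effect.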

See also: https://spark.apache.org/docs/latest/configuration.html#application-properties