Spark job failure with "no space left on device"
Labels: Apache Spark, Apache YARN
Created 10-03-2018 06:09 AM
I am running a Spark job that fails with the error "no space left on device", even though there appears to be plenty of space on the device. I have checked with df -h and df -i and see no space or inode issues.
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 656 in stage 11.0 failed 4 times, most recent failure: Lost task 656.3 in stage 11.0 (TID 680, I<workernode>): java.io.IOException: No space left on device
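Worth noting: df on the root filesystem can look healthy while the specific mount Spark writes its shuffle and spill files to is full. A minimal sketch of a more targeted check, assuming the default /tmp scratch location (the exact mount point on the failing worker is an assumption):

```bash
# Check free space and free inodes on the mount backing Spark's
# scratch directory (spark.local.dir defaults to /tmp) on the
# failing worker node, not just the root filesystem.
df -h /tmp
df -i /tmp

# Largest directories under /tmp, e.g. leftover shuffle files.
du -sh /tmp/* 2>/dev/null | sort -rh | head
```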
Created 10-03-2018 06:46 AM
Spark keeps intermediate files in /tmp by default, which is likely where it ran out of space. You can point spark.local.dir at a different directory with more space, either at submission time or permanently. Try the same job while adding this to spark-submit: --conf "spark.local.dir=/directory/with/space"
If that works, you can make the change permanent by adding this property to the custom spark-defaults in Ambari: spark.local.dir=/directory/with/space
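Put together, the one-off submission could look like the sketch below. The master, main class, and JAR are placeholders, and /directory/with/space stands for whatever mount has enough free space; only the --conf flag is the relevant part:

```bash
# One-off fix: redirect Spark's scratch/shuffle directory for this
# submission only. com.example.MyJob and my-job.jar are placeholders.
spark-submit \
  --master yarn \
  --conf "spark.local.dir=/directory/with/space" \
  --class com.example.MyJob \
  my-job.jar
```

For the permanent route, the same spark.local.dir=/directory/with/space line goes into the custom spark-defaults via Ambari, so every subsequent job picks it up without the extra flag.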
See also: https://spark.apache.org/docs/latest/configuration.html#application-properties
