Member since 10-22-2015
Posted 11-09-2015 07:05 PM · 5 Kudos
A cached RDD exists only as long as the Spark driver lives. If one or more of the Spark worker containers die, the lost RDD partitions are recomputed from lineage and cached again. At the RDD level, persist and cache are effectively the same operation; persist just has more options. cache() is equivalent to persist() with the default StorageLevel.MEMORY_ONLY, while persist() lets you choose from several other storage levels. See http://spark.apache.org/docs/latest/programming-guide.html#rdd-persistence
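A minimal sketch of the cache/persist distinction, assuming a spark-shell session with an existing SparkContext `sc` and a hypothetical input path:

```scala
import org.apache.spark.storage.StorageLevel

// Hypothetical input path; replace with your own data.
val rdd = sc.textFile("hdfs:///data/input.txt")

// cache() is shorthand for persist(StorageLevel.MEMORY_ONLY).
rdd.cache()

// persist() accepts other storage levels, e.g. spill partitions
// that don't fit in memory to local disk instead of dropping them:
val counts = rdd.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
counts.persist(StorageLevel.MEMORY_AND_DISK)

// First action materializes and caches the partitions; if a worker
// later dies, its lost partitions are recomputed from lineage on
// the next use.
counts.count()

// Release the cached data when no longer needed.
counts.unpersist()
```

Note that persisted data still disappears when the driver exits; for data that must outlive the application, write it out (e.g. saveAsTextFile) rather than relying on the cache.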