
Storage requirement for YARN-only nodes for Spark jobs


If a user plans to run YARN-only nodes for Spark jobs alongside combined Data/YARN nodes, how much storage will the YARN-only nodes need? For instance, when Spark memory overflows, it spills to disk as memory-mapped files; does that write go to the OS swap directory or to the YARN local dir? Any recommendation on handling Spark memory spill would also be appreciated.
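
For context: when Spark runs on YARN, spill and shuffle files are written to the NodeManager's local directories (Spark's `spark.local.dir` is overridden by YARN's setting in that mode), not to OS swap. A minimal sketch of the settings involved, where the directory paths are placeholders rather than recommendations:

```
# yarn-site.xml (shown here in properties form) -- scratch space for
# containers on each node, including Spark shuffle/spill files.
# These paths are examples only.
yarn.nodemanager.local-dirs=/data1/yarn/local,/data2/yarn/local

# spark-defaults.conf -- fraction of heap shared by execution and
# storage before Spark starts spilling to the local dirs above
# (defaults shown).
spark.memory.fraction=0.6
spark.memory.storageFraction=0.5
```

So the storage sizing question for YARN-only nodes comes down to how much spill/shuffle data lands in `yarn.nodemanager.local-dirs` during your jobs.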
