
Spark Application on YARN

Rising Star

Hi All,

When executing a Spark application on a YARN cluster, can I access the local file system (the underlying OS file system), even though YARN's default file system points to HDFS?

Thanks,

Param.

1 ACCEPTED SOLUTION

@Param NC

Yes, you can access the local file system. Here is a sample:

spark-shell --master yarn-client

scala> sc.textFile("file:///etc/passwd").count()

res0: Long = 40
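As a quick sanity check: in yarn-client mode the driver runs on the local host, so the count above can be reproduced with plain shell. Note that `file://` paths are resolved on each node's local disk, so the file must exist at the same path on every worker node.

```shell
# Count the lines of the same local file the Spark example reads.
# When run on the driver host, this should match
# sc.textFile("file:///etc/passwd").count() -- the exact number
# varies from machine to machine.
wc -l < /etc/passwd
```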


3 REPLIES


Rising Star

@Sandeep Nemuri

Thanks, it worked. I had tried this already but forgot to create the file on each node; it's fine now.

And I have one more question: when I run a Spark application in YARN mode, I can set the memory parameter through SparkConf using the spark.yarn.driver.memoryOverhead property.

Is something similar available for standalone and local mode?
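A hedged sketch for the question above: spark.yarn.driver.memoryOverhead applies only to YARN, while the generic driver/executor memory flags work across cluster managers and are the closest equivalents I am aware of for standalone and local mode. The jar name below is a hypothetical placeholder.

```shell
# YARN mode: the overhead property mentioned in the question.
spark-submit --master yarn-client \
  --conf spark.yarn.driver.memoryOverhead=512 \
  your-app.jar   # hypothetical application jar

# Standalone or local mode: no YARN-specific overhead property,
# but the generic memory settings are available.
spark-submit --master local[4] \
  --driver-memory 2g \
  --executor-memory 2g \
  your-app.jar   # hypothetical application jar
```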

Thanks in advance,

Param.


@Param NC, please close this thread by accepting the answer, and consider asking your new question in a separate thread.