
Spark Application on YARN

Solved


Contributor

Hi All,

When executing a Spark application on a YARN cluster, can I access the local file system (the underlying OS file system), even though YARN points to HDFS?

Thanks,

Param.

1 ACCEPTED SOLUTION


Re: Spark Application on YARN

@Param NC

Yes, you can access the local file system. Here is a sample:

spark-shell --master yarn-client

scala> sc.textFile("file:///etc/passwd").count()

res0: Long = 40
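
The same works from a compiled application submitted to YARN; a minimal sketch along those lines (the object name and app name are just placeholders) might look like this. Note that with a file:// path the file has to exist at that path on every node where tasks run:

import org.apache.spark.{SparkConf, SparkContext}

object LocalFileCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("LocalFileCount")
    val sc = new SparkContext(conf)

    // file:// points Spark at the local file system instead of HDFS.
    // The file must be present at this path on every node that runs a task.
    val count = sc.textFile("file:///etc/passwd").count()
    println(s"Line count: $count")

    sc.stop()
  }
}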
3 REPLIES


Re: Spark Application on YARN

Contributor

@Sandeep Nemuri

Thanks, it worked. I had tried this already but forgot to create the file on each node; now it's fine.

And I have one more question:

When running a Spark application in YARN mode, I can set the memory overhead through the Spark configuration using the spark.yarn.driver.memoryOverhead property, roughly as in the sketch below.

Is something similar available for standalone and local mode?
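
For reference, a minimal sketch of setting that property programmatically (the app name and the 512 MB value are only illustrative placeholders; for yarn-cluster deployments such driver settings are usually passed at submit time, e.g. with --conf, rather than in code):

import org.apache.spark.{SparkConf, SparkContext}

object OverheadExample {
  def main(args: Array[String]): Unit = {
    // Illustrative only: attach the YARN driver memory overhead (in MB) to the SparkConf.
    val conf = new SparkConf()
      .setAppName("OverheadExample")
      .set("spark.yarn.driver.memoryOverhead", "512")
    val sc = new SparkContext(conf)
    // ... application logic ...
    sc.stop()
  }
}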

Thanks in advance,

Param.

Re: Spark Application on YARN

@Param NC, please close this thread by accepting the answer, and consider asking a new question.