Support Questions

Find answers, ask questions, and share your expertise

How do I write Spark job output to NFS Share instead of writing it to HDFS

Explorer

Hi All,

I need to write Spark job output files to an NFS mount point from the spark2 shell. Can you please let me know whether there is a way to do this by specifying an absolute path in the spark2 shell?

 

Thanks,
CS


Have you tried a local-filesystem path, like this -> file:///path/to/mounted/nfs
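A minimal sketch of that approach, with a hypothetical mount point /mnt/nfs/out (substitute your actual NFS mount path). The helper just builds the file:// URI Spark interprets as a local-filesystem path; the Spark write call itself is shown in comments since it needs a running session:

```python
# Sketch: writing Spark output to an NFS mount via a local-filesystem URI.
# /mnt/nfs/out is a hypothetical example path, not from the original thread.

def to_local_uri(path):
    """Turn an absolute local path into a file:// URI that Spark treats
    as the local filesystem instead of HDFS."""
    return "file://" + path

print(to_local_uri("/mnt/nfs/out"))  # file:///mnt/nfs/out

# In pyspark / spark2 shell (assuming `df` is your DataFrame):
#   df.write.mode("overwrite").parquet(to_local_uri("/mnt/nfs/out"))
#
# Caveat: when the job runs on a cluster (not local mode), each executor
# writes to its own local filesystem, so the NFS share must be mounted at
# the same path on every worker node for this to work.
```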

Master Mentor

@Chittu 

Can you share your code example? There should be an option to specify mode='overwrite' when saving a DataFrame:

 

myDataFrame.write.save(path='/output/folder/path', format='parquet', mode='overwrite')

Please revert