Support Questions
Find answers, ask questions, and share your expertise

How do I write Spark job output to NFS Share instead of writing it to HDFS

Explorer

Hi All,

I need to write a Spark job's output file to an NFS mount point from the spark2 shell. Can you please let me know if there is a way to do this by specifying an absolute path in the spark2 shell?

 

Thanks,
CS


Have you tried a path like this -> file:///path/to/mounted/nfs
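To expand on that: a minimal spark2-shell (Scala) sketch of writing to an NFS mount through the `file://` scheme. The mount point `/mnt/nfs_share` and the output folder name are placeholders for your actual path, and this assumes the NFS share is mounted at the same path on the driver and every executor node, otherwise the write will fail on the nodes where the path does not exist.

```scala
// In spark2-shell, spark.implicits._ is already imported.
// Build a small demo DataFrame (placeholder data).
val df = Seq((1, "alpha"), (2, "beta")).toDF("id", "value")

// The file:/// scheme tells Spark to use the local filesystem
// instead of the default filesystem (HDFS).
// /mnt/nfs_share is an assumed mount point - substitute your own.
df.write
  .mode("overwrite")
  .parquet("file:///mnt/nfs_share/spark_output")
```

Note that Spark still writes a directory of part files, not a single file, so `spark_output` will be a folder on the share.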

Mentor

@Chittu 

Can you share your code example? There should be an option to specify mode='overwrite' when saving a DataFrame:

 

myDataFrame.save(path='/output/folder/path', source='parquet', mode='overwrite')

Please revert
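Since the question is about the spark2 shell, it may help to note that the old `DataFrame.save(...)` method was removed in Spark 2; the replacement is the `DataFrameWriter` API reached through `df.write`. A hedged Scala equivalent of the call above, with the path and DataFrame name as placeholders, and `file://` added so the output lands on the NFS mount rather than HDFS:

```scala
// Spark 2 DataFrameWriter equivalent of the deprecated save(...) call.
// myDataFrame and the output path are placeholders.
myDataFrame.write
  .mode("overwrite")                         // same as mode='overwrite'
  .parquet("file:///output/folder/path")     // file:// => local/NFS, not HDFS
```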
