Support Questions

How do I write Spark job output to NFS Share instead of writing it to HDFS

Explorer

Hi All,

I need to write a Spark job's output file to an NFS mount point from the spark2 shell. Can you please let me know if there is a way to do this by specifying an absolute path in the spark2 shell?

 

Thanks,
CS

2 REPLIES

Have you tried a path like this -> file:///path/to/mounted/nfs
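
For reference, a minimal sketch of that from the spark2-shell (Scala); the input path and the mount point /mnt/nfs_share are placeholders, and the share is assumed to be mounted at the same path on every node that runs executors:

// Read some input from HDFS, then write it out through a file:// URI,
// which resolves against the local filesystem (here, the NFS mount) on each executor host.
// Paths below are placeholders for illustration only.
val df = spark.read.parquet("hdfs:///data/input")
df.write.mode("overwrite").parquet("file:///mnt/nfs_share/spark_output")

Note that on a YARN cluster each task writes its partition to the file:// path on its own host, so the NFS share must be mounted identically across the cluster; with a local master everything is written on the driver machine.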

Mentor

@Chittu 

Can you share your code example? There should be an option to specify mode='overwrite' when saving a DataFrame:

 

myDataFrame.save(path='/output/folder/path', source='parquet', mode='overwrite')
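
Since you are on the Spark 2 shell, the same save is usually expressed through the DataFrameWriter API; a hedged Scala sketch, reusing the file:///path/to/mounted/nfs placeholder from the earlier reply:

// Spark 2 DataFrameWriter equivalent: parquet format, overwrite mode, NFS target via a file:// URI (placeholder path)
myDataFrame.write.format("parquet").mode("overwrite").save("file:///path/to/mounted/nfs/output")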

Please revert with the results.
