Unable to convert a pyspark dataframe to CSV
New Contributor
Created on 03-01-2023 01:46 AM - edited 03-01-2023 01:48 AM
I am using a CML Jupyter notebook. While trying to convert a Spark DataFrame to CSV, I am getting an error like "MKDIR cannot create a file".
Below is the code I have used:
spark_df_tx.write('tx.csv')
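For reference, a typical DataFrameWriter call looks roughly like the sketch below (spark_df_tx and the tx.csv path are taken from the question; header=True is an optional extra):

# write is a property that returns a DataFrameWriter, so it is not called directly;
# csv() on the writer takes the output path, which Spark creates as a directory of part files
spark_df_tx.write.csv("tx.csv", header=True)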
2 REPLIES
New Contributor
Created 03-01-2023 01:49 AM
I have also checked in the terminal that I am able to create a directory.
Master Collaborator
Created 04-20-2023 10:47 PM
It's working as expected. Please find the code snippet below:
>>> columns = ["language","users_count"]
>>> data = [("Java", "20000"), ("Python", "100000"), ("Scala", "3000")]
>>> df = spark.createDataFrame(data).toDF(*columns)
>>> df.write.csv("/tmp/test")
>>> df2=spark.read.csv("/tmp/test/*.csv")
>>> df2.show()
+------+------+
| _c0| _c1|
+------+------+
|Python|100000|
| Scala| 3000|
| Java| 20000|
+------+------+
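If a single CSV file with a header row is preferred over a directory of part files, a coalesce-based variation also works. This is a sketch: /tmp/test_single is a placeholder path, and coalesce(1) is only advisable for small datasets since it funnels all data through one partition.
>>> # coalesce(1) produces a single part file; only suitable for small data
>>> df.coalesce(1).write.mode("overwrite").option("header", True).csv("/tmp/test_single")
>>> # header=True preserves the column names when reading the data back
>>> df2 = spark.read.option("header", True).csv("/tmp/test_single")
>>> df2.show()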
