Member since: 03-23-2020
Posts: 2
Kudos Received: 0
Solutions: 0
03-23-2020
02:51 PM
I want the latest file from an HDFS directory. I am using the program below, but it reads all the files into the DataFrame. I want to filter for the latest file and read only that one. I am using Scala: val fname = spark.read.csv("hdfs://ndwns001.ndw.leidos.com/automation_test/oozie/output/Bigdata_Counts/")
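A minimal sketch of one way to do this in Scala, assuming the directory contains plain files (no subdirectories you care about) and that the Hadoop FileSystem API is reachable from the Spark driver; the names fs and latestFile are just illustrative, not part of the original post:

import org.apache.hadoop.fs.{FileSystem, Path}

val dirPath = "hdfs://ndwns001.ndw.leidos.com/automation_test/oozie/output/Bigdata_Counts/"

// Resolve the FileSystem for this path using the Spark session's Hadoop config
val fs = new Path(dirPath).getFileSystem(spark.sparkContext.hadoopConfiguration)

// List the directory, keep only files, and pick the most recently modified one
val latestFile = fs.listStatus(new Path(dirPath))
  .filter(_.isFile)
  .maxBy(_.getModificationTime)

// Read only that single file into a DataFrame
val fname = spark.read.csv(latestFile.getPath.toString)

Selecting by getModificationTime picks the newest file by HDFS timestamp; if your files carry a date in their names, sorting on the file name instead would be an alternative.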
03-23-2020
02:14 PM
I have a directory /automation_test/oozie/output/Bigdata_Counts/ with multiple files in it. I want to read only the latest file from this directory.
Labels:
Apache Spark