Support Questions

Find answers, ask questions, and share your expertise

Recursively list all HDFS directories with WEBHDFS?

Contributor

I know about https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-hdfs/WebHDFS.html#Iteratively_List..., but it doesn't quite give me what I want. The problem is that during downtimes, hadoop fs -ls -R takes far too long. I was hoping WebHDFS could produce the same listing faster.

1 REPLY

Expert Contributor

LISTSTATUS_BATCH is not a recursive listing but an "iterative" one: WebHDFS returns the contents of the given directory in parts (up to dfs.ls.limit entries per batch). It does not descend into subdirectories.
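To illustrate the iterative behavior, here is a minimal Python sketch of how a client pages through one directory with LISTSTATUS_BATCH, passing the last entry's pathSuffix as startAfter on each subsequent call. The fetch callable is an assumption made so the paging logic can run without a cluster; against a real NameNode it would issue an HTTP GET such as http://<namenode>:9870/webhdfs/v1<path>?op=LISTSTATUS_BATCH&startAfter=<child>.

```python
def list_batched(path, fetch):
    """Collect all direct children of `path` by paging LISTSTATUS_BATCH.

    `fetch(path, start_after)` must return the parsed JSON response,
    i.e. a dict with a "DirectoryListing" key (hypothetical injection
    point; in practice this would wrap an HTTP call to WebHDFS).
    """
    entries, start_after = [], None
    while True:
        listing = fetch(path, start_after)["DirectoryListing"]
        batch = listing["partialListing"]["FileStatuses"]["FileStatus"]
        entries.extend(batch)
        # remainingEntries tells us whether another batch exists.
        if listing["remainingEntries"] == 0 or not batch:
            break
        # Continue after the last entry we received.
        start_after = batch[-1]["pathSuffix"]
    return entries
```

Note that even after paging through every batch, the result only covers the one directory you asked about, which is exactly the limitation described above.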

Unfortunately, hdfs dfs -ls -R is most likely your fastest option if you really want to list the whole filesystem.
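If you do want a recursive listing over WebHDFS anyway, the recursion has to happen client-side: list a directory, then list each subdirectory in turn. A rough Python sketch under that assumption (the fetch callable stands in for an HTTP GET of ...?op=LISTSTATUS and is injected here so the walk can be exercised without a cluster):

```python
def list_recursive(path, fetch):
    """Yield the full path of every entry under `path`, depth-first.

    `fetch(path)` must return the parsed LISTSTATUS JSON for `path`,
    i.e. a dict containing "FileStatuses" -> "FileStatus" (hypothetical
    injection point; in practice a wrapper around the WebHDFS REST call).
    """
    for status in fetch(path)["FileStatuses"]["FileStatus"]:
        child = path.rstrip("/") + "/" + status["pathSuffix"]
        yield child
        if status["type"] == "DIRECTORY":
            # WebHDFS itself will not descend, so we recurse ourselves.
            yield from list_recursive(child, fetch)
```

Be aware this issues one HTTP request per directory, so on a large tree it is unlikely to beat hdfs dfs -ls -R; it mainly helps when you cannot run the Hadoop CLI at all.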