Support Questions


Recursively list all HDFS directories with WEBHDFS?

Explorer

I know about https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-hdfs/WebHDFS.html#Iteratively_List..., but it's not quite what I want. The problem is that during downtimes, hadoop fs -ls -R takes way too long. I was hoping WebHDFS could produce the same listing faster.

1 REPLY

Expert Contributor

LISTSTATUS_BATCH is not for recursive listing but for "iterative" listing: WebHDFS returns the contents of the given directory in parts (at most dfs.ls.limit entries per call), and you page through them with the startAfter parameter. It does not descend into subdirectories.
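As a sketch of what that paging looks like against an unsecured cluster (the base URL and path are placeholders, and the helper names are my own, not part of any Hadoop client library):

```python
import json
import urllib.parse
import urllib.request

def parse_batch(payload):
    """Extract the file entries and the continuation token from one
    parsed LISTSTATUS_BATCH JSON response."""
    listing = payload["DirectoryListing"]
    entries = listing["partialListing"]["FileStatuses"]["FileStatus"]
    remaining = listing["remainingEntries"]
    # The next request must pass the last returned name as startAfter.
    start_after = entries[-1]["pathSuffix"] if remaining > 0 and entries else None
    return entries, start_after

def list_directory(base_url, path):
    """Page through ONE directory with LISTSTATUS_BATCH.
    Note: this never recurses into subdirectories."""
    entries, start_after = [], None
    while True:
        query = {"op": "LISTSTATUS_BATCH"}
        if start_after:
            query["startAfter"] = start_after
        url = f"{base_url}/webhdfs/v1{path}?{urllib.parse.urlencode(query)}"
        with urllib.request.urlopen(url) as resp:
            payload = json.load(resp)
        batch, start_after = parse_batch(payload)
        entries.extend(batch)
        if start_after is None:
            return entries

# e.g. list_directory("http://namenode.example.com:9870", "/tmp")
```

Even with the whole directory fetched, every entry still comes from the single directory you asked about, which is the limitation described above.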

Unfortunately, hdfs dfs -ls -R is most likely your fastest option if you really want to list the whole filesystem.
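If you still want to do it over WebHDFS, the recursion has to happen client-side: roughly one LISTSTATUS round trip per directory, which is exactly why it tends to be slower than hdfs dfs -ls -R. A minimal sketch (the lister is injected as a callable so the traversal is easy to show without a live cluster; in practice it would wrap a WebHDFS ?op=LISTSTATUS call):

```python
def walk(path, list_status):
    """Recursively yield full paths under `path`.

    `list_status` is any callable that takes a directory path and
    returns that directory's FileStatus dicts (pathSuffix, type, ...),
    e.g. one WebHDFS LISTSTATUS request per directory. Each directory
    costs a round trip, so deep trees get slow quickly.
    """
    for status in list_status(path):
        child = path.rstrip("/") + "/" + status["pathSuffix"]
        yield child
        if status["type"] == "DIRECTORY":
            yield from walk(child, list_status)
```

With an in-memory stand-in for the NameNode, `walk("/", fake_list_status)` yields every path depth-first, directories before their children.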
