Member since: 07-05-2016
Posts: 20
Kudos Received: 4
Solutions: 1

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1186 | 06-07-2017 06:28 PM
09-29-2017 08:32 AM · 2 Kudos
@vsubramanian
Running hdfs dfs -ls /user/hdfs lists hidden files directly; no extra flag is needed. For example:

drwx------ - hdfs hdfs 0 2017-07-13 02:00 /user/hdfs/.Trash
drwxr-xr-x - hdfs hdfs 0 2017-04-06 14:21 /user/hdfs/.hiveJars
drwxr-xr-x - hdfs hdfs 0 2017-06-29 09:12 /user/hdfs/.sparkStaging
drwxr-xr-x - hdfs hdfs 0 2017-04-24 15:54 /user/hdfs/SSP00805

The entries whose names start with a dot are the hidden files and directories.
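To contrast with the local shell, where plain ls hides dot-files by default, here is a small sketch (the /tmp/demo_hidden directory and its contents are hypothetical, chosen only to mirror the HDFS listing above):

```shell
# Local-filesystem analogy: plain `ls` hides dot-files, while
# `hdfs dfs -ls` always shows them, as in the listing above.
mkdir -p /tmp/demo_hidden
touch /tmp/demo_hidden/.Trash /tmp/demo_hidden/SSP00805

ls /tmp/demo_hidden      # shows only SSP00805
ls -a /tmp/demo_hidden   # also shows .Trash
```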
06-07-2017 10:37 PM · 1 Kudo
@vsubramanian, I'm afraid this is currently not possible. The only WebHDFS REST API that implements the "recursive" query parameter is delete, so a recursive delete would work, but the ListStatus API implements no recursive logic. For your use case, you'll need to call curl with op=LISTSTATUS once per directory, parsing each response to find subdirectories and descending into them, much as the "ls -R" shell command does. Hope this helps.
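The per-directory loop described above can be sketched in Python. This is a sketch, not a tested client: the NAMENODE address is a placeholder, and the walk function takes the fetch callable as a parameter so the HTTP call can be swapped out.

```python
import json
import urllib.request

NAMENODE = "http://namenode:50070"  # placeholder address; adjust for your cluster

def list_status(path):
    """One op=LISTSTATUS call; returns the list of FileStatus dicts."""
    url = f"{NAMENODE}/webhdfs/v1{path}?op=LISTSTATUS"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["FileStatuses"]["FileStatus"]

def walk(path, fetch=list_status):
    """Yield full paths under `path` recursively, like `hdfs dfs -ls -R`."""
    for entry in fetch(path):
        full = path.rstrip("/") + "/" + entry["pathSuffix"]
        yield full
        if entry["type"] == "DIRECTORY":
            # ListStatus is not recursive, so descend one level at a time.
            yield from walk(full, fetch)
```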
03-28-2017 04:04 AM · 1 Kudo
@vsubramanian
The root cause of the failure is an OOM on the NameNode: java.lang.OutOfMemoryError: Java heap space at java.lang.String. . So you will need to increase the -Xmx (Java heap size) of your NameNode. NameNode heap size recommendations are given in the following link: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.5.3/bk_command-line-installation/content/ref-80953924-1cbf-4655-9953-1e744290a6c3.1.html
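For a manual (non-Ambari) setup, the heap setting lives in hadoop-env.sh. A sketch of the change, assuming HADOOP_NAMENODE_OPTS is the variable your Hadoop version reads; the 4g value is only a placeholder to be sized per the recommendations linked above:

```shell
# In hadoop-env.sh; restart the NameNode after changing.
# 4g is a placeholder heap size, not a recommendation.
export HADOOP_NAMENODE_OPTS="-Xms4g -Xmx4g ${HADOOP_NAMENODE_OPTS}"
```

On Ambari-managed clusters, set "NameNode Java heap size" under HDFS > Configs instead of editing the file by hand.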