
Hide "Connecting to namenode via..." message from stdout when using hdfs fsck in a for loop

Expert Contributor

Hello,

I usually use for loops to gather information about certain folders.

For example, I had to find which folder inside /user/bigdata was consuming a high number of blocks due to small files.

So I used this:

for i in $(hadoop fs -ls /user/bigdata/ | grep drwx | awk '{print $8}'); do echo "$i $(hdfs fsck $i -blocks -files -locations | grep BP- | wc -l)" ; done

This prints a lot of "Connecting to namenode via http://<hostname>:50070/fsck?ugi=hdfs&blocks=1&files=1&locations=1&path=%2Fuser%2Fbigdata%2F.<directory>" messages.

Is there any way to hide this message? Currently I have to redirect the output to a file and then cat it to read the information cleanly.

Thank you in advance.

1 ACCEPTED SOLUTION

Expert Contributor

Well, I finally solved this.

The "Connecting to namenode via http://<hostname>:50070/fsck?ugi=hdfs&blocks=1&files=1&locations=1&path=%2Fuser%2Fbigdata%2F.<directory>" message is written to stderr, not stdout, so redirecting stderr to /dev/null does the trick :).

for i in $(hadoop fs -ls /user/bigdata/ | grep drwx | awk '{print $8}'); do echo "$i $(hdfs fsck $i -blocks -files -locations 2> /dev/null | grep BP- | wc -l)" ; done
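The reason this works is that hdfs fsck writes its real report to stdout and the "Connecting to namenode..." progress message to stderr, and a pipe only carries stdout. Here is a minimal self-contained sketch of that behavior; emit is a hypothetical stand-in for hdfs fsck, not a real Hadoop command:

```shell
# emit mimics a command like `hdfs fsck`: a progress message on
# stderr and the actual block report on stdout.
emit() {
  echo "Connecting to namenode via http://..." >&2  # stderr: noise
  echo "BP-1234:blk_0001"                           # stdout: real output
}

# The pipe only sees stdout; 2>/dev/null discards the stderr noise,
# so only the BP- line reaches grep and the count is clean.
count=$(emit 2>/dev/null | grep -c 'BP-')
echo "$count"   # prints 1
```

The same redirection works inside the $(...) command substitution in the loop above, which is why the echoed summary line stays clean without a temporary file.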

