Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.

HDFS underreplicated blocks

Contributor

Hello,

I have installed a new cluster with CM 5.11 and Hadoop 2.6.0-cdh5.11.0. I have one NameNode with all the other services running on it, and two DataNodes. The HDFS replication factor is set to 2.

I get the following HDFS error: Bad: 666 under-replicated blocks in the cluster. 675 total blocks in the cluster. Percentage under-replicated blocks: 98.67%. Critical threshold: 40.00%.

I also tried setting the replication factor to 1 to test it out and restarted the cluster and the agents, but the issue remains unchanged. There are already a few threads about this issue, but I haven't found a solution in them. Do you have any ideas? Thanks in advance!
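For anyone hitting the same alert: before changing anything, it can help to confirm which files are actually under-replicated. A sketch using standard HDFS tooling (nothing specific to this cluster; run as a user with read access to the namespace):

```shell
# Walk the namespace and list blocks whose actual replication is below
# the target; fsck prints an "Under replicated" line per affected block.
hdfs fsck / -files -blocks | grep -i "Under replicated"

# The end of the plain fsck output also has a summary section with an
# "Under-replicated blocks:" count that should match the CM alert.
hdfs fsck / | tail -n 20
```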

1 ACCEPTED SOLUTION

Contributor

I fixed the issue.

The problem was that files created before I changed the replication factor in Cloudera Manager still kept the factor they were created with (in my case 3); the new setting only applies to files created afterwards. I had to set the factor to 2 recursively on the command line:

hdfs dfs -setrep -R 2 /*

Maybe it will help someone.
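The key detail above is that dfs.replication is only a default applied at file-creation time, so changing it in Cloudera Manager does not touch existing files. One way to spot-check a single file before and after the recursive setrep (the path below is a placeholder, substitute one of your own files):

```shell
# Print the replication factor stored in the NameNode for one file
# (placeholder path; use any existing file in your cluster).
hdfs dfs -stat %r /user/example/part-00000

# Apply the new factor recursively, then re-check the same file;
# the reported factor should now be 2.
hdfs dfs -setrep -R 2 /
hdfs dfs -stat %r /user/example/part-00000
```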


2 REPLIES


Visitor

Very helpful, solved my problem! Thank you so much.