Created 01-22-2024 10:08 PM
All OS disks on the cluster servers failed, so I need to install a new OS and set up HDFS again.
However, all data disks are still intact.
The NameNode disk has the "edits" and "fsimage" files,
and the DataNode disks contain the "blk" and "blk meta" files.
Is it possible to restore the HDFS data from the block data in this situation?
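Before attempting a full recovery, it can help to confirm that the fsimage on the surviving NameNode disk is readable. Hadoop ships an Offline Image Viewer (`hdfs oiv`) for this. A minimal sketch, assuming a Hadoop client is on the PATH; the fsimage path below is a placeholder for wherever the old NameNode metadata directory was mounted:

```shell
# Placeholder path - adjust to the old NameNode metadata location.
FSIMAGE=/old-disk/dfs/name/current/fsimage_0000000000000012345

# Dump the fsimage to XML with the Offline Image Viewer to verify it is
# intact and to see which files the namespace still references.
hdfs oiv -p XML -i "$FSIMAGE" -o /tmp/fsimage.xml

# Quick sanity check on the recovered namespace:
grep -c '<inode>' /tmp/fsimage.xml   # count of files/directories recorded
```

If the fsimage dumps cleanly, the namespace metadata is usable and a restore from the surviving block files is plausible.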
Created 02-06-2024 06:01 PM
Answering my own question.
The solution that worked for me is as follows.
Note: I don't know whether this is the correct approach.
1. Reinstall HDFS
2. Configure a data directory different from the old one
3. Verify that the cluster runs normally
4. Change the data directory back to the old path
5. Restart HDFS
6. HDFS enters safe mode, but the data that previously existed in HDFS can be read
7. Download the data to local storage using the 'copyToLocal' command
8. Afterwards, restore it by uploading it into the newly installed data directory
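The steps above can be sketched as the following commands. This is an illustrative outline, not a verified procedure; the HDFS paths and the local backup directory are placeholders:

```shell
# Steps 4-5: in hdfs-site.xml, point dfs.datanode.data.dir (and
# dfs.namenode.name.dir) back at the old disks, then restart HDFS.

# Step 6: the NameNode stays in safe mode while blocks are reported,
# but reads are still allowed, so the old data can be listed.
hdfs dfsadmin -safemode get
hdfs dfs -ls /

# Step 7: copy the old data out of HDFS to local storage
# (placeholder paths).
hdfs dfs -copyToLocal /user/mydata /backup/mydata

# Step 8: after switching back to the freshly initialized data
# directory and restarting HDFS, re-upload the rescued data.
hdfs dfs -copyFromLocal /backup/mydata /user/mydata
```

Note that this round-trips every byte through local storage, which is why it only works when the local disks can hold a full copy of the data.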
Created 01-23-2024 02:20 AM
@Remme, Welcome to our community! To help you get the best possible answer, I have tagged in our HDFS experts @rki_ @willx @Asok @PrathapKumar who may be able to assist you further.
Please feel free to provide any additional information or details about your query, and we hope that you will find a satisfactory solution to your question.
Regards,
Vidya Sargur
Created 12-12-2024 10:02 AM
@Remme Though the procedure you followed may have worked for you, it is not a viable option on a larger cluster with TBs of data. In that case, I would advise working with Cloudera Support.