Created 06-04-2017 09:28 AM
While studying HDFS I ran into a question: I have changed a configuration file and want Hadoop to read the configuration again while the service stays up. I don't want to reboot the Hadoop service!
Thank you very much!
Created 06-06-2017 09:57 AM
Hi @Xiong Duan - for most cases, in order to refresh the configurations in the running service, you have to restart the service and there isn't really a way around it.
However, if you have High Availability enabled (e.g. NameNode HA), there is a way you can refresh configs without downtime, but you must be very careful.
This may not be safe for all configuration properties - which properties do you want to refresh?
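To illustrate the "refresh without a full restart" idea: recent Hadoop versions support runtime reconfiguration of a small whitelist of properties through `hdfs dfsadmin -reconfig`. This is a hedged sketch, not a general solution — only properties the daemon explicitly marks as reconfigurable can be changed this way, and the host/port below are placeholders for your own cluster:

```shell
# List which properties the DataNode at dn1.example.com (placeholder host)
# allows to be reconfigured at runtime:
hdfs dfsadmin -reconfig datanode dn1.example.com:9867 properties

# After editing hdfs-site.xml on that host, ask the daemon to re-read it
# and apply any reconfigurable changes:
hdfs dfsadmin -reconfig datanode dn1.example.com:9867 start

# Poll until the reconfiguration task reports success or failure:
hdfs dfsadmin -reconfig datanode dn1.example.com:9867 status
```

Anything not on the reconfigurable list still requires a service restart (or, with NameNode HA, a careful rolling restart of the standby and active in turn).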
Created 06-05-2017 10:48 PM
When you say "I want to make hadoop read the configuration file again..", do you mean Namenode service or any other service? Could you please clarify which service you want to refresh. Thanks.
Created 06-06-2017 12:53 PM
Thank you very much! I just want to change the replication factor of files that have already been uploaded to HDFS. Yesterday I found a command that does exactly that, so thank you very much!
Created 06-06-2017 02:10 PM
You're very welcome! Yes, for this the setrep command can change the replication factor of existing files, so there is no need to change the value globally. For others' reference, the command is:
hdfs dfs -setrep [-R] [-w] <numReplicas> <path>
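A concrete usage sketch, assuming a hypothetical path /data/logs already exists in HDFS:

```shell
# Set the replication factor of every file under /data/logs to 2.
# -R recurses into subdirectories; -w waits until the cluster has
# actually finished re-replicating (this can take a while).
hdfs dfs -setrep -R -w 2 /data/logs

# Verify: %r prints the replication factor of a file.
hdfs dfs -stat %r /data/logs/part-00000
```

Note that setrep only affects the files it is applied to; new files will still be created with the cluster-wide default from dfs.replication unless a different value is requested at write time.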