
Can we do maintenance on nodes (Linux patches) in a Hadoop cluster without impacting jobs running in the cluster?

Explorer

For example, there are 500 nodes in the Hadoop cluster, and the Linux team wants to apply OS patches/upgrades in batches. How can we (the Hadoop admins) ensure data availability and no impact on running jobs without decommissioning? Decommissioning large, data-heavy nodes (say 90 TB each) takes forever, so is there any way to do this without decommissioning the nodes?

Say we have rack awareness set up, and each rack has 6 nodes.

5 REPLIES


Mentor

@Deepak Vivaramneni

This should address your concerns:

https://community.hortonworks.com/questions/4940/hdp-os-upgradepatching-best-practices.html#
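
For reference, newer HDFS releases (Apache Hadoop 2.9+/3.0+, HDFS-7877) support a DataNode "maintenance state" that takes nodes offline for a short window without triggering the full re-replication that decommissioning does. Below is a minimal sketch, assuming the NameNode is configured with the combined JSON hosts file (dfs.namenode.hosts.provider.classname = org.apache.hadoop.hdfs.server.blockmanagement.CombinedHostFileManager); the host names, file path, and window length are hypothetical:

```python
import json
import subprocess
import time

# Hypothetical names and paths -- adjust to your own cluster.
ALL_HOSTS = ["dn%03d.example.com" % i for i in range(1, 501)]
BATCH = ALL_HOSTS[:6]                          # e.g. one rack (6 nodes) per batch
HOSTS_FILE = "/etc/hadoop/conf/dfs.hosts.json" # file that dfs.hosts points to
MAINT_WINDOW_HOURS = 4                         # keep the patching window short

def write_hosts_file(all_hosts, maintenance_batch, hours):
    """Write the combined (JSON) dfs.hosts file. Every DataNode must be
    listed; only the current batch is marked IN_MAINTENANCE."""
    expire_ms = int((time.time() + hours * 3600) * 1000)
    entries = []
    for h in all_hosts:
        entry = {"hostName": h}
        if h in maintenance_batch:
            entry["adminState"] = "IN_MAINTENANCE"
            entry["maintenanceExpireTimeInMS"] = expire_ms
        entries.append(entry)
    with open(HOSTS_FILE, "w") as f:
        json.dump(entries, f, indent=2)

write_hosts_file(ALL_HOSTS, BATCH, MAINT_WINDOW_HOURS)
# Ask the NameNode to re-read the hosts file; the batch enters maintenance
# without the cluster re-replicating its ~90 TB of blocks per node.
subprocess.run(["hdfs", "dfsadmin", "-refreshNodes"], check=True)
# ...patch and reboot the batch, then rewrite the file without the
# maintenance flags and run -refreshNodes again to return them to service.
```

Keep the window short: if a node does not come back before maintenanceExpireTimeInMS, the NameNode treats it like a dead node and starts re-replicating its blocks. With rack awareness in place, at least one replica of every block lives off-rack, which is what makes a one-rack-at-a-time batch tolerable.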

As always, be cautious with a production cluster: test and document the procedure in DEV, UAT, or pre-PROD first. Don't say you weren't warned 🙂
Happy Hadooping!!
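
Whatever mechanism you end up using, it is worth gating each batch on cluster health so you never take the next set of nodes down while blocks are still catching up. A small sketch of that check, assuming the `hdfs` CLI is on the PATH (the exact report label can vary between Hadoop versions):

```python
import re
import subprocess

def under_replicated_blocks():
    """Parse the under-replicated block count from 'hdfs dfsadmin -report'."""
    out = subprocess.run(["hdfs", "dfsadmin", "-report"],
                         capture_output=True, text=True, check=True).stdout
    m = re.search(r"Under replicated blocks:\s*(\d+)", out)
    return int(m.group(1)) if m else None

# Gate the next batch on a healthy cluster: wait until replication
# has caught up before taking more DataNodes offline.
if under_replicated_blocks() == 0:
    print("Safe to start the next patching batch.")
else:
    print("Wait for replication to catch up before the next batch.")
```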


Mentor

@Deepak Vivaramneni

If this answer addressed your question, please take a moment to log in and click the "accept" link on the answer.