YARN used 2000% of memory
Labels: Apache YARN
Created on ‎02-16-2016 11:24 PM - edited ‎08-19-2019 12:56 AM
Hi,
We have a 5-node cluster currently running HDP 2.0. Recently we observed that YARN is reporting 2000% memory usage.
We allocated 2 GB of memory to YARN, but the metrics show 40 GB in use by our current job (hence the 2000%). All nodes are still "alive". Will this be a problem? Should we increase the memory allocated to the YARN cluster?
Created ‎02-16-2016 11:27 PM
You should look into upgrading your system; both HDP and Ambari need to be upgraded.
You can also increase the memory allocated to YARN, then restart YARN to see if the alert goes away.
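As a sketch of what "increase yarn memory" means in practice (the property names below are the standard ones from yarn-site.xml; the 8192 MB value is purely illustrative, not a sizing recommendation for your cluster):

```xml
<!-- yarn-site.xml: total memory the NodeManager may hand out to containers.
     Raising this from 2 GB toward the host's actual available RAM removes
     the mismatch between configured and used memory. Value is illustrative. -->
<property>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>8192</value>
</property>

<!-- Largest single container YARN will grant; should not exceed the value above. -->
<property>
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>8192</value>
</property>
```

On an HDP cluster these settings are normally changed through Ambari (YARN > Configs) rather than by editing the file directly; either way, restart YARN for the new values to take effect.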
