09-06-2017 04:26 PM
@Julien Champ I think you are on the right track. This is how we do things in Hortonworks Data Cloud: we have three types of nodes there (masters, workers, and computes), and we scale the computes up and down (see the sketch below). One thing you should investigate is where you want to store your data; I would suggest a cloud object store that is supported by Hortonworks, i.e. S3 on AWS or ADLS on Azure. Br, R
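For illustration only, a hypothetical instance-group layout in that spirit (the group names, counts, and field names are made up for this sketch; the actual cluster-request schema depends on your Cloudbreak / HDCloud version):

    {
      "instanceGroups": [
        { "group": "master",  "nodeCount": 1 },
        { "group": "worker",  "nodeCount": 3 },
        { "group": "compute", "nodeCount": 2 }
      ]
    }

Only the compute group gets scaled up and down. If the data itself lives in S3 or ADLS rather than on the compute nodes' local disks, adding or removing compute nodes never requires moving or rebalancing data.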
08-30-2017 08:38 AM
Thanks @rkovacs, I've been able to find a useful solution thanks to you.

I used a slightly modified version of https://github.com/hortonworks/cloudbreak/blob/release-1.16/autoscale/src/main/resources/alerts/allocated_memory.ftl to trigger upscaling when the amount of allocated memory goes over 95% (this raises a critical alert). I modified the aforementioned .ftl file to something I think is more correct: we want an alert when the percentage of allocated memory goes over 95%, not when the percentage of remaining memory goes over 95%. So I changed

    "value": "{0}/({0} + {1}) * 100"

to

    "value": "{1}/({0} + {1}) * 100"

(see the sketch below).

I'm going to open a new question about scaling down now! 🙂 That is: how to scale down when a node is unneeded with respect to memory usage plus a margin. Thanks again!
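For reference, the relevant fragment of the alert definition looks roughly like the sketch below. Only the two "value" expressions are quoted from the template; the metric names and surrounding fields are paraphrased here and may differ in the actual .ftl file. Per my reading, {0} refers to the available-memory metric and {1} to the allocated-memory metric:

    {
      "source": {
        "ams": {
          "metric_list": [
            "available-memory metric",
            "allocated-memory metric"
          ],
          "value": "{1}/({0} + {1}) * 100"
        },
        "reporting": {
          "critical": { "value": 95 }
        }
      }
    }

Sanity check on the corrected expression: with 95 GB allocated and 5 GB available, {1}/({0} + {1}) * 100 = 95/(5 + 95) * 100 = 95, which crosses the critical threshold, whereas the original "{0}/({0} + {1}) * 100" would have reported only 5.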