
Are there any tools or recommendations available for setting proportionate heap memory in the blueprint based on system memory?


I am working on setting up a blueprint to deploy the cluster via Chef. Is there any general recommendation for setting heap values based on the available system memory?

For example, if the system memory is 100 GB, then KAFKA_HEAP_MEMORY is set to system_memory / 20.
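A minimal sketch of that rule in Python, assuming a Linux host where total memory can be read from /proc/meminfo (the /20 divisor is the example above, not an official recommendation):

# Sketch of the proposed rule: KAFKA_HEAP_MEMORY = system_memory / 20.
# Assumes Linux: total memory is read from /proc/meminfo (MemTotal is in kB).
def total_memory_gb():
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith("MemTotal:"):
                return int(line.split()[1]) / (1024 * 1024)
    raise RuntimeError("MemTotal not found in /proc/meminfo")

kafka_heap_gb = total_memory_gb() / 20  # e.g. 100 GB system -> 5 GB heap
print(f"KAFKA_HEAP_MEMORY={kafka_heap_gb:.0f}g")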

1 ACCEPTED SOLUTION


Not really. We are working on integrating StackAdvisor capability into blueprint-based deployment in the next release. That will automatically modify the given blueprint to set correct defaults. The Kafka team may have some recommendations.


REPLIES



Thanks @smohanty@hortonworks.com. Do we at least have the logic that will be used? I would like to use it. For Kafka, I guess anything above 5 GB is not of much benefit. I am mainly working on AMS, Storm, HBase, and HDFS.
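In the meantime, here is a hedged sketch of what such logic could look like: a proportional rule with a per-component cap. The divisors, caps (including the 5 GB Kafka ceiling guessed above), and property names are all illustrative assumptions, not StackAdvisor output:

# Hypothetical heap rules: (divisor of system memory, cap in GB) per component.
# All values below are illustrative guesses, not Ambari/StackAdvisor defaults.
HEAP_RULES = {
    "KAFKA_HEAP_MEMORY": (20, 5),   # 1/20 of RAM, capped at 5 GB as guessed above
    "HBASE_HEAP_MEMORY": (4, 32),
    "STORM_HEAP_MEMORY": (16, 4),
    "AMS_HEAP_MEMORY":   (16, 4),
    "HDFS_HEAP_MEMORY":  (8, 64),
}

def heap_sizes(system_memory_gb):
    # Apply the fraction-of-RAM rule, then clamp to the per-component cap.
    return {name: min(system_memory_gb / divisor, cap)
            for name, (divisor, cap) in HEAP_RULES.items()}

for name, gb in heap_sizes(100).items():
    print(f"{name}={gb:.0f}g")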

Expert Contributor

As Sumit mentioned, we currently do not use the StackAdvisor output in Blueprints, but will support this in a future release.

If you have the hardware to experiment with, you can try deploying a cluster with the UI (which will cause the recommendations to be applied), and then export the Blueprint from the running cluster. You can then use the Kafka config in the exported Blueprint as a starting point.

The Blueprints wiki contains the REST call used for exporting a Blueprint:

https://cwiki.apache.org/confluence/display/AMBARI/Blueprints#Blueprints-APIResourcesandSyntax
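For convenience, the export call described there can be made like this (a sketch assuming Ambari at localhost:8080, the default admin/admin credentials, and a cluster named mycluster; substitute your own values):

import base64
import urllib.request

# GET /api/v1/clusters/<cluster_name>?format=blueprint returns the blueprint
# of a running cluster, including the applied Kafka config.
url = "http://localhost:8080/api/v1/clusters/mycluster?format=blueprint"
req = urllib.request.Request(url)
creds = base64.b64encode(b"admin:admin").decode("ascii")  # placeholder credentials
req.add_header("Authorization", "Basic " + creds)

with urllib.request.urlopen(req) as resp:
    print(resp.read().decode("utf-8"))  # blueprint JSON to use as a starting point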