05-25-2017
02:20 PM
That query probably has multiple big joins and aggregations and needs more memory to complete. A very rough rule of thumb for minimum memory in releases CDH5.9-CDH5.12 is the following:

- For each hash join: the minimum of 150MB or the amount of data on the right side of the node (e.g. if you have a few thousand rows on the right side, maybe a MB or two).
- For each merge aggregation: the minimum of 300MB or the size of the grouped data in memory (e.g. if you only have a few thousand groups, maybe a MB or two).
- For each sort: about 50-60MB.
- For each analytic function: about 20MB.

If you add all those up and add another 25%, you'll get a ballpark number for how much memory the query will require to execute. I'm working on reducing those numbers and making the system give a clearer yes/no answer on whether it can run the query before it starts executing.
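As a rough sketch, the rule of thumb above can be expressed as a small calculation. This is purely illustrative: the function name, parameters, and the 55MB midpoint for sorts are my own assumptions, not part of Impala or any CDH tooling.

```python
# Ballpark minimum-memory estimate per the rule of thumb above
# (CDH5.9-CDH5.12). Illustrative only; not an actual Impala API.

MB = 1024 * 1024

def estimate_query_memory(hash_join_build_sizes, agg_group_sizes,
                          num_sorts, num_analytics):
    """Return a rough minimum-memory estimate in bytes.

    hash_join_build_sizes: in-memory size (bytes) of the right/build
        side of each hash join.
    agg_group_sizes: in-memory size (bytes) of the grouped data for
        each merge aggregation.
    """
    total = 0
    # Each hash join: the minimum of 150MB or the build-side data size.
    for size in hash_join_build_sizes:
        total += min(150 * MB, size)
    # Each merge aggregation: the minimum of 300MB or the grouped data size.
    for size in agg_group_sizes:
        total += min(300 * MB, size)
    # Sorts and analytic functions have roughly fixed costs.
    total += num_sorts * 55 * MB      # ~50-60MB each; 55MB assumed midpoint
    total += num_analytics * 20 * MB  # ~20MB each
    # Add another 25% for headroom.
    return int(total * 1.25)

# Example: one join with a tiny build side, one with a large build side,
# one big merge aggregation, one sort, one analytic function.
est = estimate_query_memory(
    hash_join_build_sizes=[2 * MB, 500 * MB],
    agg_group_sizes=[1024 * MB],
    num_sorts=1,
    num_analytics=1,
)
print(est / MB)  # estimate in megabytes
```

In this example the joins contribute min(150, 2) + min(150, 500) = 152MB, the aggregation caps at 300MB, and sorts/analytics add 75MB, so the estimate is (152 + 300 + 75) * 1.25 = 658.75MB.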