<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Urgent !!! - Presto Query Error: Query exceeded max memory size of 50GB in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Urgent-Presto-Query-Error-Query-exceeded-max-memory-size-of/m-p/167418#M129750</link>
    <description>&lt;P&gt;We are getting the error below; can anyone please help me debug the problem and identify the root cause of the error?&lt;/P&gt;&lt;P&gt;Presto
Query Error: Query exceeded max memory size of 50GB.&lt;/P&gt;&lt;P&gt;I know the simple approach is to increase the RAM
allocation from 50GB, but there are some concerns:&lt;/P&gt;&lt;P&gt;1. How can we determine the optimum memory/RAM allocation?&lt;/P&gt;&lt;P&gt;2. Even if we increase it to 100GB,
there is no guarantee that the user will not receive the out-of-memory
error again.&lt;/P&gt;&lt;P&gt;3. Is there any way to restrict users from launching such a
huge query, or any other preventive approach? (Or tell end users that we cannot process any
query that requires more than 100GB or 150GB.)&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;Bhupesh Khanna&lt;/P&gt;</description>
    <pubDate>Thu, 09 Feb 2017 14:37:39 GMT</pubDate>
    <dc:creator>bkhanna</dc:creator>
    <dc:date>2017-02-09T14:37:39Z</dc:date>
  </channel>
</rss>

