<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Unable to set hive.exec.max.dynamic.partitions while starting spark-shell (Support Questions)</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Unable-to-set-hive-exec-max-dynamic-partitions-while/m-p/294658#M217373</link>
    <description>&lt;P&gt;I'm using Spark 2.0.1 on CDH 5.3.2, and I have a Spark application that gives me the error below when it runs:&lt;/P&gt;&lt;PRE&gt;org.apache.hadoop.hive.ql.metadata.HiveException: Number of dynamic partitions created is 1221, which is more than 1000. To solve this try to set hive.exec.max.dynamic.partitions to at least 1221.&lt;/PRE&gt;&lt;P&gt;To overcome this I referred to &lt;A href="https://github.com/apache/spark/pull/17223" target="_blank" rel="noopener"&gt;https://github.com/apache/spark/pull/17223&lt;/A&gt; and, as mentioned there, I tried to raise hive.exec.max.dynamic.partitions to 2000 by adding &lt;SPAN&gt;--conf spark.hadoop.hive.exec.max.dynamic.partitions=2000&lt;/SPAN&gt; to the spark-submit command. However, I still get the exact same error; it appears the setting is not being applied to the Spark application. I do not want to change this value for the whole cluster, only for this one application. Can someone please help me with that?&lt;/P&gt;</description>
    <pubDate>Thu, 23 Apr 2020 23:57:57 GMT</pubDate>
    <dc:creator>hdp_usr</dc:creator>
    <dc:date>2020-04-23T23:57:57Z</dc:date>
    <item>
      <title>Unable to set hive.exec.max.dynamic.partitions while starting spark-shell</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Unable-to-set-hive-exec-max-dynamic-partitions-while/m-p/294658#M217373</link>
      <description>&lt;P&gt;I'm using Spark 2.0.1 on CDH 5.3.2, and I have a Spark application that gives me the error below when it runs:&lt;/P&gt;&lt;PRE&gt;org.apache.hadoop.hive.ql.metadata.HiveException: Number of dynamic partitions created is 1221, which is more than 1000. To solve this try to set hive.exec.max.dynamic.partitions to at least 1221.&lt;/PRE&gt;&lt;P&gt;To overcome this I referred to &lt;A href="https://github.com/apache/spark/pull/17223" target="_blank" rel="noopener"&gt;https://github.com/apache/spark/pull/17223&lt;/A&gt; and, as mentioned there, I tried to raise hive.exec.max.dynamic.partitions to 2000 by adding &lt;SPAN&gt;--conf spark.hadoop.hive.exec.max.dynamic.partitions=2000&lt;/SPAN&gt; to the spark-submit command. However, I still get the exact same error; it appears the setting is not being applied to the Spark application. I do not want to change this value for the whole cluster, only for this one application. Can someone please help me with that?&lt;/P&gt;</description>
      <pubDate>Thu, 23 Apr 2020 23:57:57 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Unable-to-set-hive-exec-max-dynamic-partitions-while/m-p/294658#M217373</guid>
      <dc:creator>hdp_usr</dc:creator>
      <dc:date>2020-04-23T23:57:57Z</dc:date>
    </item>
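<!-- Editor's sketch: the mechanism the question relies on is that Spark forwards any configuration key starting with "spark.hadoop." into the Hadoop/Hive Configuration with that prefix stripped, so spark.hadoop.hive.exec.max.dynamic.partitions becomes hive.exec.max.dynamic.partitions. A minimal, runnable Python illustration of that prefix-stripping rule follows; the function name is illustrative, not Spark's actual API. -->

```python
# Illustrative sketch of how "spark.hadoop.*" entries map onto the
# Hadoop/Hive configuration (prefix stripped). The function name is
# hypothetical; it is not Spark's real implementation.

def hadoop_props_from_spark_conf(spark_conf):
    """Return the Hadoop-level properties implied by a Spark conf dict."""
    prefix = "spark.hadoop."
    return {key[len(prefix):]: value
            for key, value in spark_conf.items()
            if key.startswith(prefix)}

conf = {
    "spark.master": "yarn",
    "spark.hadoop.hive.exec.max.dynamic.partitions": "2000",
}
print(hadoop_props_from_spark_conf(conf))
# {'hive.exec.max.dynamic.partitions': '2000'}
```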
    <item>
      <title>Re: Unable to set hive.exec.max.dynamic.partitions while starting spark-shell</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Unable-to-set-hive-exec-max-dynamic-partitions-while/m-p/294666#M217377</link>
      <description>&lt;P&gt;Can you add the property below to &amp;lt;spark_home&amp;gt;/conf/hive-site.xml and &amp;lt;hive-home&amp;gt;/conf/hive-site.xml?&lt;/P&gt;&lt;P&gt;hive.exec.max.dynamic.partitions=2000&lt;/P&gt;&lt;PRE&gt;  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hive.exec.max.dynamic.partitions&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;2000&amp;lt;/value&amp;gt;
    &amp;lt;description&amp;gt;&amp;lt;/description&amp;gt;
  &amp;lt;/property&amp;gt;&lt;/PRE&gt;&lt;P&gt;Hope this helps. Please accept the answer and vote up if it did.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Note&lt;/STRONG&gt;: If the change doesn't take effect, restart HiveServer2 and the Spark History Server.&lt;/P&gt;&lt;P&gt;-JD&lt;/P&gt;</description>
      <pubDate>Fri, 24 Apr 2020 02:57:26 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Unable-to-set-hive-exec-max-dynamic-partitions-while/m-p/294666#M217377</guid>
      <dc:creator>jagadeesan</dc:creator>
      <dc:date>2020-04-24T02:57:26Z</dc:date>
    </item>
    <item>
      <title>Re: Unable to set hive.exec.max.dynamic.partitions while starting spark-shell</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Unable-to-set-hive-exec-max-dynamic-partitions-while/m-p/294719#M217405</link>
      <description>&lt;P&gt;But wouldn't that change the property at the cluster level? I do not intend to do that; I want to apply this setting only at the session level, for this particular application.&lt;/P&gt;</description>
      <pubDate>Fri, 24 Apr 2020 20:08:25 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Unable-to-set-hive-exec-max-dynamic-partitions-while/m-p/294719#M217405</guid>
      <dc:creator>hdp_usr</dc:creator>
      <dc:date>2020-04-24T20:08:25Z</dc:date>
    </item>
    <item>
      <title>Re: Unable to set hive.exec.max.dynamic.partitions while starting spark-shell</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Unable-to-set-hive-exec-max-dynamic-partitions-while/m-p/297496#M218726</link>
      <description>&lt;P&gt;You can try &lt;EM&gt;spark-shell --conf spark.hadoop.hive.exec.max.dynamic.partitions=xxxxx&lt;/EM&gt;:&lt;/P&gt;&lt;PRE&gt;$ spark-shell --conf spark.hadoop.hive.exec.max.dynamic.partitions=30000&lt;BR /&gt;Setting default log level to "WARN".&lt;BR /&gt;To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).&lt;BR /&gt;Spark context Web UI available at http://hostname:port&lt;BR /&gt;Spark context available as 'sc' (master = yarn, app id = application_xxxxxxxxxxxx_xxxx).&lt;BR /&gt;Spark session available as 'spark'.&lt;BR /&gt;Welcome to&lt;BR /&gt;      ____              __&lt;BR /&gt;     / __/__  ___ _____/ /__&lt;BR /&gt;    _\ \/ _ \/ _ `/ __/ '_/&lt;BR /&gt;   /___/ .__/\_,_/_/ /_/\_\   version 2.x.x.x.x.x.x-xx&lt;BR /&gt;      /_/&lt;BR /&gt;&lt;BR /&gt;Using Scala version 2.11.12 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_112)&lt;BR /&gt;Type in expressions to have them evaluated.&lt;BR /&gt;Type :help for more information.&lt;BR /&gt;&lt;BR /&gt;scala&amp;gt; spark.sqlContext.getAllConfs.get("spark.hadoop.hive.exec.max.dynamic.partitions")&lt;BR /&gt;res0: Option[String] = Some(30000)&lt;/PRE&gt;&lt;P&gt;Ref: &lt;A href="https://issues.apache.org/jira/browse/SPARK-21574?focusedCommentId=16106650&amp;amp;page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-16106650" target="_blank" rel="noopener"&gt;SPARK-21574&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Sat, 06 Jun 2020 02:58:35 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Unable-to-set-hive-exec-max-dynamic-partitions-while/m-p/297496#M218726</guid>
      <dc:creator>jagadeesan</dc:creator>
      <dc:date>2020-06-06T02:58:35Z</dc:date>
    </item>
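<!-- Editor's sketch: the accepted answer passes the setting as "--conf key=value" to spark-shell. Such arguments are collected as repeated key=value pairs, split on the first '=' only, which is why the whole spark.hadoop.hive.exec.max.dynamic.partitions=30000 string becomes a single conf entry. A rough, hypothetical Python illustration of that parsing step (not Spark's actual source) follows. -->

```python
# Hypothetical sketch of collecting repeated "--conf key=value" arguments
# into a configuration dict. Splitting on the FIRST '=' only means values
# that themselves contain '=' survive intact. Not Spark's actual source.

def collect_conf_args(argv):
    """Gather --conf key=value pairs from a spark-submit-style argv list."""
    conf = {}
    args = iter(argv)
    for arg in args:
        if arg == "--conf":
            key, sep, value = next(args, "").partition("=")
            if sep:  # skip malformed entries with no '='
                conf[key] = value
    return conf

argv = ["--conf", "spark.hadoop.hive.exec.max.dynamic.partitions=30000"]
print(collect_conf_args(argv))
# {'spark.hadoop.hive.exec.max.dynamic.partitions': '30000'}
```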
  </channel>
</rss>

