
Unable to set hive.exec.max.dynamic.partitions while starting spark-shell

New Contributor

I'm using Spark 2.0.1 on CDH 5.3.2, and my Spark application fails with the error below when running:

 

org.apache.hadoop.hive.ql.metadata.HiveException: Number of dynamic partitions created is 1221, which is more than 1000. To solve this try to set hive.exec.max.dynamic.partitions to at least 1221.

To overcome this I referred to https://github.com/apache/spark/pull/17223 and, as suggested there, tried to raise hive.exec.max.dynamic.partitions to 2000 by adding --conf spark.hadoop.hive.exec.max.dynamic.partitions=2000 to the spark-submit command. However, I still get the exact same error, so it appears the setting is not being applied to the Spark application. I do not want to change this value for the whole cluster; I only want it applied to this Spark application. Can someone please help me with that?
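
For reference, a sketch of the full command; the class and jar names below are placeholders, not my real application:

$ spark-submit \
    --conf spark.hadoop.hive.exec.max.dynamic.partitions=2000 \
    --class com.example.MyApp \
    my-app.jar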

3 REPLIES

Master Collaborator

Can you add the property below to <spark_home>/conf/hive-site.xml and <hive-home>/conf/hive-site.xml?

hive.exec.max.dynamic.partitions=2000

In hive-site.xml that looks like:

    <property>
        <name>hive.exec.max.dynamic.partitions</name>
        <value>2000</value>
        <description>Maximum number of dynamic partitions allowed to be created in total.</description>
    </property>
Hope this helps. If it did, please accept the answer and vote it up.

Note: If the change doesn't take effect, restart HiveServer2 and the Spark History Server so the new configuration is picked up.
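
One way to sanity-check that a fresh Spark session sees the value (a sketch; this reads the Hadoop configuration the session was built with, and may return null if hive-site.xml isn't on the driver's classpath):

scala> spark.sparkContext.hadoopConfiguration.get("hive.exec.max.dynamic.partitions")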

 

-JD

New Contributor

But wouldn't that change the property at the cluster level? I do not intend to do that; I want to apply this setting only at this particular session level.

Master Collaborator

You can try with spark-shell --conf spark.hadoop.hive.exec.max.dynamic.partitions=xxxxx. 

 

$ spark-shell --conf spark.hadoop.hive.exec.max.dynamic.partitions=30000
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://hostname:port
Spark context available as 'sc' (master = yarn, app id = application_xxxxxxxxxxxx_xxxx).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.x.x.x.x.x.x-xx
      /_/

Using Scala version 2.11.12 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_112)
Type in expressions to have them evaluated.
Type :help for more information.

scala> spark.sqlContext.getAllConfs.get("spark.hadoop.hive.exec.max.dynamic.partitions")
res0: Option[String] = Some(30000)

 Ref: SPARK-21574
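
To see the higher limit in action, here is a minimal sketch of a dynamic-partition insert in the same shell; the table and column names (events, staging_events, id, payload, dt) are hypothetical:

scala> spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
scala> spark.sql("INSERT OVERWRITE TABLE events PARTITION (dt) SELECT id, payload, dt FROM staging_events")

With hive.exec.max.dynamic.partitions raised this way, the insert should succeed even when the query produces more than 1,000 distinct dt values.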