Member since: 11-02-2016
Posts: 3
Kudos Received: 0
Solutions: 0
04-24-2020 01:08 PM
But wouldn't that change this property at the cluster level? I do not intend to do that; I want to apply this setting only at this particular session level.
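A minimal sketch of a session-scoped alternative, assuming a Spark 2.x SparkSession built with Hive support: a SET issued through spark.sql applies only to that session, not to the cluster.

```scala
import org.apache.spark.sql.SparkSession

// In a real application you would reuse your existing session;
// it is built here only so the snippet is self-contained.
val spark = SparkSession.builder()
  .enableHiveSupport()
  .getOrCreate()

// Overrides the Hive property for this session only; cluster-wide
// hive-site.xml defaults are left untouched.
spark.sql("SET hive.exec.max.dynamic.partitions=2000")
```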
04-23-2020 04:57 PM
I'm using Spark 2.0.1 on CDH 5.3.2, and I have a Spark application that fails with the error below when running:

org.apache.hadoop.hive.ql.metadata.HiveException: Number of dynamic partitions created is 1221, which is more than 1000. To solve this try to set hive.exec.max.dynamic.partitions to at least 1221.

To overcome this, I referred to https://github.com/apache/spark/pull/17223 and, as described there, tried to raise hive.exec.max.dynamic.partitions by adding --conf spark.hadoop.hive.exec.max.dynamic.partitions=2000 to the spark-submit command. However, I still get the exact same error; it appears the config is not being applied to the Spark application. I do not want to change this value for the whole cluster, only for this Spark application. Can someone please help me with that?
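For reference, a minimal sketch of setting the property per application rather than cluster-wide, assuming a Scala Spark 2.x job built with Hive support (the object and app names are placeholders; whether spark.hadoop.-prefixed settings reach the embedded Hive client can depend on the Spark/CDH version):

```scala
import org.apache.spark.sql.SparkSession

object DynamicPartitionsJob {  // hypothetical job name
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("dynamic-partitions-job")
      .enableHiveSupport()
      // Same effect as passing
      //   --conf spark.hadoop.hive.exec.max.dynamic.partitions=2000
      // on spark-submit: the Hive property is scoped to this application only.
      .config("spark.hadoop.hive.exec.max.dynamic.partitions", "2000")
      .getOrCreate()

    // ... dynamic-partition insert logic goes here ...

    spark.stop()
  }
}
```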
Labels:
- Apache Spark