06-05-2020 07:58 PM
You can try passing the setting to spark-shell with --conf spark.hadoop.hive.exec.max.dynamic.partitions=xxxxx. For example:

$ spark-shell --conf spark.hadoop.hive.exec.max.dynamic.partitions=30000
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://hostname:port
Spark context available as 'sc' (master = yarn, app id = application_xxxxxxxxxxxx_xxxx).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.x.x.x.x.x.x-xx
      /_/

Using Scala version 2.11.12 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_112)
Type in expressions to have them evaluated.
Type :help for more information.

scala> spark.sqlContext.getAllConfs.get("spark.hadoop.hive.exec.max.dynamic.partitions")
res0: Option[String] = Some(30000)

Ref: SPARK-21574
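The same property can also be set when building the SparkSession programmatically in an application, instead of on the spark-shell command line. A minimal sketch, assuming a standalone Spark 2.x app with Hive support (the object name, app name, and the value 30000 are placeholders, not from the original post):

import org.apache.spark.sql.SparkSession

object MaxDynamicPartitionsExample {
  def main(args: Array[String]): Unit = {
    // Apply the Hadoop/Hive property at session build time; size the value
    // to the number of dynamic partitions your job actually writes.
    val spark = SparkSession.builder()
      .appName("max-dynamic-partitions-example")
      .config("spark.hadoop.hive.exec.max.dynamic.partitions", "30000")
      .enableHiveSupport()
      .getOrCreate()

    // Verify the setting the same way as in the shell session above.
    println(spark.sqlContext.getAllConfs.get("spark.hadoop.hive.exec.max.dynamic.partitions"))

    spark.stop()
  }
}

Setting it at build time matters because, per SPARK-21574, changing hive.exec.max.dynamic.partitions with SET after the session is created may not take effect.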