I have a PySpark job with the following configuration:
from pyspark.sql import SparkSession

# Session with Hive support; the configs below enable dynamic
# partitioning and raise its partition-count limits.
self.spark = SparkSession.builder.appName("example") \
    .config("hive.exec.dynamic.partition", "true") \
    .config("hive.exec.dynamic.partition.mode", "nonstrict") \
    .config("hive.exec.max.dynamic.partitions", "5000000") \
    .config("hive.exec.max.dynamic.partitions.pernode", "1000000") \
    .enableHiveSupport() \
    .getOrCreate()
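For reference, I know these Hive options can also be changed at runtime rather than only at builder time, so I am not tied to the snippet above. A minimal sketch, reusing the self.spark session created there:

# Hive options can also be set on an existing session through
# Spark SQL's SET command, after the session has been created.
self.spark.sql("SET hive.exec.dynamic.partition=true")
self.spark.sql("SET hive.exec.max.dynamic.partitions=5000000")

But I could not find any key to pass here that controls row size.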
I cannot find anywhere how to set a configuration that increases the maximum row size to 150 MB. The only such setting I have found is Impala's MAX_ROW_SIZE query option, which does not apply here.
Can you please help me?
Thanks in advance.