Created on 07-08-2021 01:50 AM - edited 07-08-2021 01:52 AM
I have a pyspark job with these configs:
self.spark = SparkSession.builder.appName("example") \
    .config("hive.exec.dynamic.partition", "true") \
    .config("hive.exec.dynamic.partition.mode", "nonstrict") \
    .config("hive.exec.max.dynamic.partitions", "5000000") \
    .config("hive.exec.max.dynamic.partitions.pernode", "1000000") \
    .enableHiveSupport() \
    .getOrCreate()
I cannot find anywhere how to set a configuration that increases the maximum row size to 150 MB; I found the corresponding option only in Impala.
Can you please help me?
Thanks in advance.
Created 07-11-2021 06:37 PM
Hello,

The max_row_size parameter is an Impala-specific query option; there is no equivalent setting in Hive or Spark.
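For reference, a minimal sketch of how that option is set in Impala itself (run in impala-shell, not from a Spark/Hive session; the 150mb value mirrors the 150 MB mentioned above and assumes your Impala version accepts size suffixes):

```sql
-- In impala-shell: MAX_ROW_SIZE is a per-session Impala query option,
-- not a Hive or Spark configuration property.
SET MAX_ROW_SIZE=150mb;
```

Setting it this way only affects queries issued through Impala; it has no effect on the PySpark job shown above.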