Error metadata.Hive: MetaException

Hi,

I have opened a PySpark shell that runs a single query, and every time, after ~15 minutes, it throws the error below:

ERROR metadata.Hive: MetaException(message:Timeout when executing method: get_partitions; 1788247ms exceeds 600000ms)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_partitions_result$get_partitions_resultStandardScheme.read(ThriftHiveMetastore.java)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_partitions_result$get_partitions_resultStandardScheme.read(ThriftHiveMetastore.java)

These are my configs:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("home_office") \
    .config("hive.exec.dynamic.partition", "true") \
    .config("hive.exec.dynamic.partition.mode", "nonstrict") \
    .config("hive.exec.compress.output", "false") \
    .config("spark.unsafe.sorter.spill.read.ahead.enabled", "false") \
    .config("spark.debug.maxToStringFields", 1000) \
    .enableHiveSupport() \
    .getOrCreate()
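
Would something like the below be the right direction? This is only a guess on my side: hive.metastore.client.socket.timeout is the setting I found that seems to match the 600000ms (600s) limit in the message, and I am not sure whether it can even be raised from the Spark session or has to be changed on the metastore server itself:

# just a sketch of my guess: raise the metastore timeout above the 600s from the message
spark = SparkSession.builder.appName("home_office") \
    .config("hive.metastore.client.socket.timeout", "1800") \
    .enableHiveSupport() \
    .getOrCreate()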

Can someone explain what this error means and why I get it every time?

Thanks in advance.