
org.apache.spark.SparkException: Requested partitioning does not match the <table>



Hi team,

I am trying to load data into Hive, but I am getting the error below. Any help is appreciated.

 

While running the code snippet below via spark-submit in YARN cluster mode,

 

df.write.format("hive").mode("append").partitionBy("c1", "c2", "c3").saveAsTable("table1")

 

we get the error below:

 

org.apache.spark.SparkException: Requested partitioning does not match the <table> table:
Requested partitions:
Table partitions: c1,c2,c3
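
For context, here is a minimal end-to-end sketch of the job (a sketch only: it assumes table1 already exists as a Hive table partitioned by c1, c2, c3, and the staging_table1 source is hypothetical):

import org.apache.spark.sql.SparkSession

// Hive support is required for format("hive") writes
val spark = SparkSession.builder()
  .appName("hive-partitioned-append")
  .enableHiveSupport()
  .getOrCreate()

// Hypothetical source; the real df comes from our ingestion pipeline
val df = spark.table("staging_table1")

// The failing write: append into the existing partitioned Hive table
df.write
  .format("hive")
  .mode("append")
  .partitionBy("c1", "c2", "c3")
  .saveAsTable("table1")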

 

This was working fine before we upgraded from HDP 2.6.4 to 2.6.5; after the upgrade to 2.6.5, it no longer works.
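
Would DataFrameWriter.insertInto be expected to behave differently here? A sketch of what we could try (the v1/v2 data column names are hypothetical; insertInto resolves columns by position, so the partition columns must come last):

// Allow partition values to be taken from the data itself
spark.sql("SET hive.exec.dynamic.partition=true")
spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")

// Reorder so the partition columns (c1, c2, c3) come last, matching the table layout
df.select("v1", "v2", "c1", "c2", "c3")
  .write
  .mode("append")
  .insertInto("table1")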

 

Thanks