Create a month column with the withColumn function, using the literal value 12. Since the Hive table is already partitioned by month, write with insertInto — note that Spark's insertInto cannot be combined with partitionBy (the partition columns come from the table definition), and it resolves columns by position, so month must be the last column of the DataFrame:
df.withColumn("month", lit(12)).write.mode("<append or overwrite>").insertInto("<hive_table_name>")
Using a SQL query:
df.createOrReplaceTempView("temp_table")
spark.sql("insert into <partition_table> partition(`month`=12) select * from temp_table")
Thanks, but I am already doing this in HDP 2 and it worked. In HDP 3.0 with Hive 3.0, it gives a "permission denied" error when I write a Spark DataFrame into a Hive partitioned table. Any help?