When I write a DataFrame to PostgreSQL using Spark Scala, I have noticed that the row count on the PostgreSQL side is always higher than what I get in Spark. The count of the Spark DataFrame itself is correct and as expected.
I have even tried loading the data in monthly parts, but the count in PostgreSQL is still higher than the Spark DataFrame count.
import java.util.Properties

val prop = new Properties()
// Credentials are part of the JDBC URL; only the driver is set here.
prop.setProperty("driver", "org.postgresql.Driver")

df.write
  .mode("overwrite")
  .jdbc(url = connection, table = "adb.aschema.TABLE", connectionProperties = prop)
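For reference, this is roughly how I compare the two counts (a sketch, assuming `spark` is the active SparkSession, `connection` is the same JDBC URL used for the write, and `prop` holds the same connection properties):

```scala
// Count the DataFrame on the Spark side.
val sparkCount = df.count()

// Read the same table back over JDBC and count it on the PostgreSQL side.
val pgCount = spark.read
  .jdbc(connection, "adb.aschema.TABLE", prop)
  .count()

println(s"Spark count: $sparkCount, PostgreSQL count: $pgCount")
```

The PostgreSQL count from the read-back is consistently higher than the Spark count.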