Member since: 01-13-2018
Posts: 2
Kudos Received: 3
Solutions: 0
01-14-2018 09:30 AM · 2 Kudos
Hi, thanks for the answer, it works. Thanks, Vignesh Asokan
01-13-2018 08:51 PM · 1 Kudo
I am trying to read a dataset from an existing non-partitioned Hive table and insert it into a partitioned Hive external table. How do I do that in PySpark SQL? Any help would be appreciated. I am currently using the command below; the Hive external table has multiple partitions. df.write.mode("overwrite").partitionBy("col1","col2").insertInto("Hive external Partitioned Table") The Spark job runs successfully, but no data is written to the HDFS partitions of the Hive external table.
Labels:
- Apache Hive