
PySpark - HiveContext: Cannot write into the same table it is read from

Hi Techies,

I am trying to implement SCD (Slowly Changing Dimensions) using PySpark. I am following a SQL-based approach: I read the source and target tables, join them, insert the new records, and update the existing ones using a CASE expression.
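For context, here is roughly what I am doing. This is a simplified sketch; the table and column names are placeholders, not my real schema:

from pyspark import SparkContext
from pyspark.sql import HiveContext

sc = SparkContext(appName="scd_load")
sqlContext = HiveContext(sc)

# Read the incoming changes and the current dimension table.
sqlContext.table("staging.customer_src").registerTempTable("src")
sqlContext.table("dw.customer_dim").registerTempTable("tgt")

# Full outer join: rows only in src are the inserts, matched rows take
# the updated values from src via CASE, unmatched tgt rows pass through.
merged = sqlContext.sql("""
    SELECT
        COALESCE(s.customer_id, t.customer_id) AS customer_id,
        CASE WHEN s.customer_id IS NOT NULL THEN s.name    ELSE t.name    END AS name,
        CASE WHEN s.customer_id IS NOT NULL THEN s.address ELSE t.address END AS address
    FROM tgt t
    FULL OUTER JOIN src s
      ON t.customer_id = s.customer_id
""")

# This is the step that fails, because dw.customer_dim is also an input:
merged.write.insertInto("dw.customer_dim", overwrite=True)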

Now when I write the result back into the same target table, I get an error saying that it cannot write into a table it is also reading from. Is there any solution for this? I do not want to write to a different physical table and then overwrite the target from it.
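For completeness, this is the two-step workaround I would like to avoid, since it needs a second physical table (the staging table name is a placeholder):

# Write the merged result to a separate physical table first.
merged.write.mode("overwrite").saveAsTable("dw.customer_dim_tmp")

# Re-read the copy, overwrite the real target from it, then clean up.
sqlContext.table("dw.customer_dim_tmp").write.insertInto("dw.customer_dim", overwrite=True)
sqlContext.sql("DROP TABLE IF EXISTS dw.customer_dim_tmp")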

Please guide me if there is any other workaround. Is there any way to remove the dependency on the previous operations, given that I am using registerTempTable? If the data can be persisted in such a way that it is no longer read from the operations above, that would help!
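For example, something along these lines is what I have in mind. This is only a sketch of the idea and I have not verified that it works; the checkpoint directory is a placeholder:

# Checkpoint the underlying RDD so the plan no longer references
# dw.customer_dim, then rebuild a DataFrame from the checkpointed data.
sc.setCheckpointDir("/tmp/spark_checkpoints")

rdd = merged.rdd
rdd.checkpoint()
rdd.count()  # force the checkpoint to materialize

clean_df = sqlContext.createDataFrame(rdd, merged.schema)
clean_df.write.insertInto("dw.customer_dim", overwrite=True)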

Thanks,

Manu