Created on 01-07-2019 08:21 PM - edited 09-16-2022 07:02 AM
Team,
I have 1000 tables in my source RDBMS and I would like to migrate them to Hive using PySpark.
I read through the documentation and found that the two commands below would help. Is there a way to loop these two commands 1000 times if I have the list of all table names in a Python list?
arr = ["table1", "table2"]
for x in arr:
    df = spark.read.format("jdbc").blah.blah
    df.write.saveAsTable.blah.blah
If someone has a working solution for this, could you please share it? I tried, but it neither throws any error nor writes anything.
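A minimal sketch of such a loop, for reference. The JDBC URL, credentials, driver class, and database names below are placeholders, not values from this thread; replace them with your own connection details:

```python
def target_name(hive_db, table):
    """Qualified Hive table name for a source table."""
    return f"{hive_db}.{table}"

def migrate_tables(spark, tables, jdbc_url, jdbc_props, hive_db="default"):
    """Read each source table over JDBC and save it as a Hive table."""
    for t in tables:
        # spark.read.jdbc pulls the whole table into a DataFrame
        df = spark.read.jdbc(url=jdbc_url, table=t, properties=jdbc_props)
        # overwrite so reruns replace any partially written table
        df.write.mode("overwrite").saveAsTable(target_name(hive_db, t))

if __name__ == "__main__":
    from pyspark.sql import SparkSession

    # Hive support is required for saveAsTable to land in the metastore
    spark = SparkSession.builder.enableHiveSupport().getOrCreate()
    jdbc_url = "jdbc:mysql://dbhost:3306/sourcedb"      # placeholder host/db
    props = {"user": "etl_user", "password": "secret",
             "driver": "com.mysql.cj.jdbc.Driver"}      # placeholder creds
    migrate_tables(spark, ["table1", "table2"], jdbc_url, props)
```

One common cause of "no error but nothing written" is running the loop without Hive support enabled on the SparkSession, in which case saveAsTable writes to a local warehouse directory instead of the Hive metastore.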
Thanks
Meher
Created 01-08-2019 09:11 AM
I'm able to get this working. Will close this post.
Thanks,
Meher
Created 01-09-2019 05:57 AM
@Meher I am happy to see that you resolved your issue. Would you mind sharing how you solved it in case someone else encounters the same situation?