Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

Can Spark SQL replace Sqoop for Data Ingestion?

Frequent Visitor
 
1 ACCEPTED SOLUTION

Master Collaborator

If the question is academic in nature, then certainly you can.

If it's instead a real use case and I had to choose between Sqoop and Spark SQL, I'd stick with Sqoop. The reason is that Sqoop ships with a lot of native connectors it can use directly, while Spark will typically go in via plain old JDBC, which is substantially slower and puts more load on the source database. You may also run into partition-size constraints while extracting the data. So performance and manageability would certainly be key factors in deciding on a solution.

Good luck, and let us know which one you finally preferred and how your experience was. Thx
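As a rough sketch of what the two approaches look like side by side (the host, database, table, credentials, and bounds below are all hypothetical):

```shell
# Sqoop: parallel import using Sqoop's native connector for the database.
# Parallelism comes from --num-mappers, split on the --split-by column.
sqoop import \
  --connect jdbc:mysql://db-host/sales \
  --username etl_user -P \
  --table orders \
  --split-by id \
  --num-mappers 8 \
  --target-dir /data/orders

# Spark SQL equivalent: a plain JDBC read, parallelized via partitionColumn.
# Without the four partitioning options, Spark pulls the whole table
# through a single connection.
pyspark <<'EOF'
df = (spark.read.format("jdbc")
      .option("url", "jdbc:mysql://db-host/sales")
      .option("dbtable", "orders")
      .option("user", "etl_user")
      .option("partitionColumn", "id")
      .option("lowerBound", "1")
      .option("upperBound", "10000000")
      .option("numPartitions", "8")
      .load())
df.write.parquet("/data/orders")
EOF
```

Note that Spark's `lowerBound`/`upperBound` only control how the partition ranges are cut, so skewed or unknown key ranges can still leave you with unbalanced partitions — one of the management concerns mentioned above.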


2 REPLIES


Frequent Visitor
Thanks a lot!

In the end, we went with Sqoop. 🙂