Support Questions

Find answers, ask questions, and share your expertise

Can Spark SQL replace Sqoop for data ingestion?

Contributor
 
1 ACCEPTED SOLUTION

Master Collaborator

If the question is academic in nature, then certainly, you can.

If it's a real use case, and I had to choose between Sqoop and Spark SQL, I'd stick with Sqoop. The reason is that Sqoop ships with a lot of native connectors that give it direct access to many databases, while Spark typically goes in via plain old JDBC, which is substantially slower and puts more load on the source database. You can also run into partition-size constraints while extracting data. So performance and manageability should certainly be key factors in deciding on a solution.
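The partition-size constraint mentioned above comes from Spark's JDBC read options (`partitionColumn`, `lowerBound`, `upperBound`, `numPartitions`): Spark splits the read into one query per partition using range predicates on a numeric column. As a rough sketch (this is a simplification, not Spark's exact implementation; the column name `id`, the bounds, and the connection details are made-up examples), the generated per-partition WHERE clauses look like this:

```python
def jdbc_partition_predicates(column, lower, upper, num_partitions):
    """Approximate the per-partition WHERE clauses a Spark JDBC read
    generates from lowerBound/upperBound/numPartitions. Each predicate
    is sent to the database as a separate parallel query."""
    stride = (upper - lower) // num_partitions
    predicates = []
    bound = lower
    for i in range(num_partitions):
        # First partition has no lower bound, last has no upper bound,
        # so rows outside [lower, upper) are still covered.
        lo = f"{column} >= {bound}" if i > 0 else None
        bound += stride
        hi = f"{column} < {bound}" if i < num_partitions - 1 else None
        predicates.append(" AND ".join(p for p in (lo, hi) if p))
    return predicates


# A read over id in [0, 100) split 4 ways yields 4 range queries:
for pred in jdbc_partition_predicates("id", 0, 100, 4):
    print(pred)

# The equivalent (hypothetical) Spark read would be something like:
# df = (spark.read.format("jdbc")
#       .option("url", "jdbc:mysql://dbhost/sales")   # example URL
#       .option("dbtable", "orders")
#       .option("partitionColumn", "id")
#       .option("lowerBound", 0)
#       .option("upperBound", 100)
#       .option("numPartitions", 4)
#       .load())
```

If the split column's values are skewed, some partitions end up far larger than others, and the parallel queries themselves add load on the database, which is part of why Sqoop's purpose-built connectors often behave better for bulk extraction.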

 

Good luck, and let us know which one you finally chose and how your experience went. Thx



Contributor
Thanks a lot!

Finally, Sqoop. 🙂