
Can Spark SQL replace Sqoop for Data Ingestion?

Explorer
 
1 ACCEPTED SOLUTION

Expert Contributor

If the question is academic in nature, then yes, you certainly can.

If it's instead a real use case and I had to choose between Sqoop and Spark SQL, I'd stick with Sqoop. Sqoop ships with many database-specific connectors it can use directly, whereas Spark typically goes in over plain JDBC, which is substantially slower and puts more load on the source database. You can also run into partition-sizing constraints while extracting data. So performance and manageability would certainly be key factors in deciding on a solution.
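To make the partition-sizing point concrete: Spark's JDBC source splits a numeric partition column into `numPartitions` equal strides between `lowerBound` and `upperBound`, and rows outside those bounds still land in the first or last partition, so a skewed column yields skewed partitions. Below is a minimal, simplified sketch (not Spark's actual implementation) of how such WHERE-clause predicates get generated:

```python
def jdbc_partition_predicates(column, lower_bound, upper_bound, num_partitions):
    """Simplified sketch of stride-based JDBC partitioning.

    Generates one WHERE-clause predicate per partition. Note that the
    first and last predicates are open-ended, so any rows below
    lower_bound or above upper_bound pile into those partitions.
    Assumes num_partitions >= 2 for illustration.
    """
    stride = (upper_bound - lower_bound) // num_partitions
    predicates = []
    current = lower_bound
    for i in range(num_partitions):
        if i == 0:
            # first partition: everything below the first boundary
            predicates.append(f"{column} < {current + stride}")
        elif i == num_partitions - 1:
            # last partition: everything at or above the last boundary
            predicates.append(f"{column} >= {current}")
        else:
            predicates.append(
                f"{column} >= {current} AND {column} < {current + stride}"
            )
        current += stride
    return predicates


# e.g. splitting an id column from 0 to 100 into 4 partitions:
# ["id < 25", "id >= 25 AND id < 50", "id >= 50 AND id < 75", "id >= 75"]
print(jdbc_partition_predicates("id", 0, 100, 4))
```

If most of your ids fall in one narrow range, most rows end up in one partition and one executor does nearly all the work — which is exactly the management overhead Sqoop's connectors often handle better out of the box.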

 

Good luck, and let us know which one you finally chose and how your experience went. Thx


2 REPLIES 2


Explorer
Thanks a lot!

Finally, Sqoop. 🙂