Support Questions


Hortonworks Data Flow (Apache Nifi)

Rising Star

What is the best mechanism to ingest data from relational sources into HDP? Should we use a combination of the ExecuteSQL and PutHDFS processors, or use Sqoop to deliver the data to HDP?

Thanks

1 ACCEPTED SOLUTION

Master Mentor
@Greenhorn Techie

You can definitely use Sqoop, and it's part of the HDP stack.

You can also leverage HDF (NiFi) to ingest data into HDP. In that case you would need to be on support for both HDF and HDP.
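As an illustration, a basic Sqoop import from an RDBMS into HDFS looks like the sketch below. The connection string, credentials, table name, and target directory are all hypothetical placeholders, not values from this thread.

```shell
# Hypothetical example: import the "orders" table from a MySQL database
# into HDFS. Adjust the JDBC connect string, credentials, table name,
# and target directory for your environment.
sqoop import \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /data/raw/orders
```

`-P` prompts interactively for the password. Sqoop runs the transfer as a MapReduce job on the cluster, so the data moves in parallel rather than being funneled through a single node.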


6 REPLIES


Rising Star

@Neeraj Sabharwal Thanks. My question is: which tool is best placed to handle data loading from an RDBMS? I understand both of them support it, but I would like to understand which one is more capable and advantageous over the other.

Thanks

Vijay

Master Mentor

@Greenhorn Techie Sqoop is the most widely used tool in the industry today for this use case.

Rising Star

Thanks @Neeraj Sabharwal for validating my understanding 🙂

Guru

I would use Sqoop for ingesting RDBMS data, as Sqoop parallelizes the ETL job, while NiFi simply runs it on the thread the processor is running on. To do the same thing with NiFi, you would have to create multiple instances of the ExecuteSQL processor, each going after a separate partition of the data.
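To make that parallelism concrete, Sqoop's `--split-by` and `--num-mappers` options control how the import is partitioned across map tasks. The database, table, and column names below are hypothetical.

```shell
# Hypothetical example: split the import into 8 parallel map tasks,
# partitioning the table on the numeric key "order_id". Each mapper
# issues its own bounded SELECT over a range of order_id values.
sqoop import \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username etl_user -P \
  --table orders \
  --split-by order_id \
  --num-mappers 8 \
  --target-dir /data/raw/orders
```

This range-based partitioning is exactly what you would otherwise have to build by hand in NiFi with multiple ExecuteSQL processors, each querying a different slice of the table.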

Master Mentor

The NiFi 0.6 release adds the ability to handle simple change-capture cases with the QueryDatabaseTable processor, which maintains maximum-value timestamps between runs. This might start turning the needle toward NiFi and away from Sqoop: https://cwiki.apache.org/confluence/display/NIFI/Release+Notes
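For comparison, Sqoop covers a similar change-capture case with its incremental import mode, which imports only rows beyond the last recorded value of a check column. The names and values below are hypothetical.

```shell
# Hypothetical example: append-mode incremental import picks up only rows
# whose order_id is greater than the last value imported previously.
sqoop import \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /data/raw/orders \
  --incremental append \
  --check-column order_id \
  --last-value 1000000
```

The difference is that QueryDatabaseTable tracks its state automatically between runs, whereas with plain Sqoop you must track `--last-value` yourself (or use a Sqoop saved job to do it for you).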