Support Questions


Hortonworks DataFlow (Apache NiFi)

What is the best mechanism to ingest data from relational sources into HDP: a combination of the ExecuteSQL and PutHDFS processors in NiFi, or Sqoop?

Thanks

1 ACCEPTED SOLUTION

@Greenhorn Techie

You can definitely use Sqoop; it is part of the HDP stack.

You can also leverage HDF (NiFi) to ingest data into HDP. In this case you would need support for both HDF and HDP.


6 REPLIES


@Neeraj Sabharwal Thanks. My question is which tool is better placed to handle data loading from an RDBMS. I understand both of them support it, but I would like to understand which one is more capable and advantageous over the other.

Thanks

Vijay

@Greenhorn Techie Sqoop is the most widely used tool in the industry today for this use case.

Thanks @Neeraj Sabharwal for validating my understanding 🙂

Guru

I would use Sqoop for ingesting RDBMS data, as Sqoop parallelizes the import job across multiple mappers, while NiFi simply runs the query on the thread the processor is running on. To get the same parallelism with NiFi, you would have to create multiple instances of the ExecuteSQL processor and have each one go after a different partition of the data.
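As a sketch of what that parallelism looks like on the Sqoop side (the connection string, credentials, table, and column names here are hypothetical), a single command splits the import across mappers:

```shell
# Import a table with 4 parallel mappers, splitting the work on a
# numeric primary key; each mapper pulls its own range of "id" values.
sqoop import \
  --connect jdbc:mysql://dbhost/sales \
  --username etl_user \
  --table orders \
  --split-by id \
  --num-mappers 4 \
  --target-dir /data/orders
```

Each mapper issues its own bounded SELECT against the source table, which is exactly what you would otherwise have to reproduce by hand with multiple ExecuteSQL processors in NiFi.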

Mentor

The NiFi 0.6.0 release adds the ability to handle simple change-data-capture cases with the QueryDatabaseTable processor, which maintains the maximum observed value of a column (such as a timestamp) between runs. This might start turning the needle toward NiFi and away from Sqoop: https://cwiki.apache.org/confluence/display/NIFI/Release+Notes
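For comparison, Sqoop covers a similar incremental case with its lastmodified import mode; a sketch with hypothetical table and column names:

```shell
# Pull only rows whose "updated_at" is newer than the last run's
# high-water mark; Sqoop reports the new --last-value to use next time.
sqoop import \
  --connect jdbc:mysql://dbhost/sales \
  --table orders \
  --incremental lastmodified \
  --check-column updated_at \
  --last-value "2016-03-01 00:00:00" \
  --target-dir /data/orders_delta
```

The difference is operational: QueryDatabaseTable stores the maximum value as processor state automatically, while with Sqoop you manage the --last-value yourself (or let a saved Sqoop job track it for you).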