
Execute Sqoop on NiFi

Rising Star

Is there any NiFi processor I can use to execute Sqoop?

If not, is there a data flow I can use to get my table and save it to HDFS?

Thanks.

1 ACCEPTED SOLUTION

Hi @regie canada,

If you really want to use Sqoop, you would need to use something like the ExecuteStreamCommand / ExecuteProcess processors. However, I wouldn't recommend this unless you need features that only Sqoop provides.
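
For illustration only, a command along these lines is roughly what you would end up configuring in one of those processors; the JDBC URL, credentials file, table name, target directory, and mapper count are all placeholders to adapt to your environment:

    sqoop import \
      --connect jdbc:mysql://db-host:3306/mydb \
      --username myuser \
      --password-file /etc/sqoop/db.password \
      --table mytable \
      --target-dir /data/mytable \
      --num-mappers 4

With ExecuteStreamCommand, the sqoop binary typically goes into the Command Path property and the options into Command Arguments (split on the configured Argument Delimiter). Also keep in mind that the Sqoop client and its configuration must be available on every NiFi node allowed to run the processor.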

If you want a solution fully provided by NiFi, then depending on your source database you can use the JDBC processors to read your table and then something like PutHDFS to write the data into HDFS. A common approach is GenerateTableFetch on the primary node and QueryDatabaseTable on all nodes: the first processor generates SQL queries that fetch the data in "pages" of a specified size, and the second actually retrieves the data. This way, every node of your NiFi cluster can be used to pull data from the database.
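
As a rough sketch of that flow (table and column names below are placeholders), it could look like:

    GenerateTableFetch (primary node only)
      -> Remote Process Group pointing back at an input port on the cluster
        -> QueryDatabaseTable / ExecuteSQL (all nodes)
          -> PutHDFS

Each flow file coming out of GenerateTableFetch carries one paged query, roughly of this form (the exact SQL depends on your database adapter, partition size, and maximum-value column):

    SELECT * FROM mytable WHERE id <= 20000 ORDER BY id LIMIT 10000 OFFSET 10000

Depending on your NiFi version, ExecuteSQL may be the processor you route these generated queries to, since it executes the SQL it receives in incoming flow files.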

You can have a look at the documentation here:

https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi.processors.standard.GenerateTableF...

https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi.processors.standard.QueryDatabaseT...

There are additional SQL/JDBC processors available depending on your needs.

This article should get you started:

https://community.hortonworks.com/articles/51902/incremental-fetch-in-nifi-with-querydatabasetable.h...

Hope this helps.


3 REPLIES

Rising Star

@Pierre Villard: "A common approach is GenerateTableFetch on the primary node and QueryDatabaseTable on all nodes: the first processor generates SQL queries that fetch the data in "pages" of a specified size, and the second actually retrieves the data. This way, every node of your NiFi cluster can be used to pull data from the database.":

Will I need to add a (local) RPG (Remote Process Group) after GenerateTableFetch to get them running in parallel?

Any experience with performance when doing full RDBMS table dumps this way versus Sqoop?

Explorer

Hi @regie canada, check out my blog post on this subject:

How to run Sqoop from NiFi

Boris