Created 01-24-2017 06:20 AM
Is there any NiFi processor I can use to execute Sqoop?
If not, is there a data flow I can use to get my table and save it to HDFS?
Thanks.
Created 01-24-2017 09:30 AM
Hi @regie canada,
If you really want to use Sqoop, you would need to use something like the ExecuteStreamCommand or ExecuteProcess processors. However, this is not something I'd recommend unless you need the features provided by Sqoop.
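For illustration, here is a minimal sketch of an ExecuteStreamCommand configuration that shells out to Sqoop (the path, connection string, table name, and target directory below are hypothetical, and the Sqoop client would have to be installed on every NiFi node that might run the processor):

Command Path: /usr/bin/sqoop
Command Arguments: import;--connect;jdbc:mysql://dbhost:3306/mydb;--table;mytable;--target-dir;/data/mytable;-m;4
Argument Delimiter: ;

This is equivalent to running sqoop import --connect jdbc:mysql://dbhost:3306/mydb --table mytable --target-dir /data/mytable -m 4 from a shell. Note that ExecuteStreamCommand only runs when an incoming flow file triggers it (e.g. from GenerateFlowFile), whereas ExecuteProcess runs on its own schedule.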
If you want a solution fully provided by NiFi, then depending on your source database, you can use the JDBC-based processors to get the data out of your table and then use something like PutHDFS to send the data into HDFS. A common approach is GenerateTableFetch on the primary node and ExecuteSQL on all nodes. The first processor generates SQL queries that fetch the data in "pages" of a specified size, and the second actually executes those queries to retrieve the data. This way, all nodes of your NiFi cluster share the work of pulling data from the database.
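As a rough sketch, such a flow could look like the following (the processor and property names are standard NiFi, but every value shown is a hypothetical example):

GenerateTableFetch (scheduled on the primary node only)
    Table Name: mytable
    Maximum-value Columns: id
    Partition Size: 10000
        |
Remote Process Group -> Input Port (redistributes the generated queries across the cluster)
        |
ExecuteSQL (runs on all nodes; executes each incoming query and emits the results as Avro)
        |
PutHDFS
    Hadoop Configuration Resources: /etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
    Directory: /data/mytable

Each flow file emitted by GenerateTableFetch contains one paged query, along the lines of SELECT * FROM mytable ORDER BY id LIMIT 10000 OFFSET 20000 (the exact paging syntax depends on the configured database type).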
You can have a look at the documentation here:
There are additional SQL/JDBC processors available depending on your needs.
This article should get you started:
Hope this helps.
Created 07-09-2018 10:25 AM
@Pierre Villard: "A common approach is GenerateTableFetch on the primary node and ExecuteSQL on all nodes. The first processor generates SQL queries that fetch the data in 'pages' of a specified size, and the second actually executes those queries to retrieve the data. This way, all nodes of your NiFi cluster share the work of pulling data from the database.":
Will I need to add a (local) Remote Process Group after GenerateTableFetch to get the queries running in parallel across the cluster?
Does anyone have experience with the performance of full RDBMS table dumps using this method versus Sqoop?