Support Questions
Find answers, ask questions, and share your expertise

Execute sqoop on NiFi

Solved

Is there any NiFi processor I can use to execute Sqoop?

If not, is there a data flow I can use to get my table and save it to HDFS?

Thanks.

1 ACCEPTED SOLUTION


Re: Execute sqoop on NiFi

Hi @regie canada,

If you really want to use Sqoop, you would need something like the ExecuteStreamCommand or ExecuteProcess processors to invoke it as an external command. However, I wouldn't recommend this unless you need features that only Sqoop provides.
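For illustration, here is a minimal sketch of the kind of `sqoop import` command line such a processor could be configured to run. All connection details, table names, and paths below are hypothetical placeholders, not values from this thread:

```python
# Sketch: assemble the argument list for a 'sqoop import' invocation,
# as an ExecuteProcess processor might be configured to run it.
# Every connection detail here is a placeholder.
def build_sqoop_import(jdbc_url, username, table, target_dir, num_mappers=4):
    """Return the command-line arguments for importing one table to HDFS."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,        # e.g. jdbc:mysql://dbhost:3306/mydb
        "--username", username,
        "--table", table,
        "--target-dir", target_dir,   # HDFS directory for the imported data
        "--num-mappers", str(num_mappers),
    ]

cmd = build_sqoop_import(
    "jdbc:mysql://dbhost:3306/mydb", "etl_user", "customers", "/data/customers"
)
print(" ".join(cmd))
```

In ExecuteProcess you would put the executable name in the Command property and the rest in Command Arguments; the Sqoop client and its drivers must be installed on every NiFi node that can run the processor.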

If you want a solution fully provided by NiFi, then depending on your source database, you can use the JDBC processors to fetch your table's data and then something like PutHDFS to send the data into HDFS. A common approach is GenerateTableFetch on the primary node combined with QueryDatabaseTable on all nodes: the first processor generates SQL queries that fetch the data in "pages" of a specified size, and the second actually retrieves the data. This way, all nodes of your NiFi cluster can be used to pull the data from the database in parallel.
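To make the paging idea concrete, here is a small sketch (plain Python, outside NiFi) of paged SELECT statements similar in spirit to what GenerateTableFetch emits. The table and column names are hypothetical, and the real processor additionally tracks a maximum-value column so it only fetches new rows on subsequent runs:

```python
# Sketch: generate one paged SELECT per "page" of rows, illustrating
# how a large table fetch can be split across cluster nodes.
# Table and column names are placeholders.
def paged_queries(table, order_col, row_count, page_size):
    """Yield one SELECT statement per page of 'page_size' rows."""
    for offset in range(0, row_count, page_size):
        yield (
            f"SELECT * FROM {table} "
            f"ORDER BY {order_col} "
            f"LIMIT {page_size} OFFSET {offset}"
        )

queries = list(paged_queries("customers", "id", row_count=25, page_size=10))
for q in queries:
    print(q)
```

Each generated query becomes an independent unit of work, which is what lets every node in the cluster fetch a different page of the same table at the same time.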

You can have a look at the documentation here:

https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi.processors.standard.GenerateTableF...

https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi.processors.standard.QueryDatabaseT...

There are additional SQL/JDBC processors available depending on your needs.

This article should get you started:

https://community.hortonworks.com/articles/51902/incremental-fetch-in-nifi-with-querydatabasetable.h...

Hope this helps.


3 REPLIES



Re: Execute sqoop on NiFi


@Pierre Villard: "A common approach is something like GenerateTableFetch on the primary node and QueryDatabaseTable on all nodes. The first processor will generate SQL queries to fetch the data by 'page' of specified size, and the second will actually get the data. This way, all nodes of your NiFi cluster can be used to get the data from the database."

Will I need to add a (local) Remote Process Group (RPG) after GenerateTableFetch to get the fetches running in parallel?

Any experience on performance for making full RDBMS table dumps using this method vs Sqoop?


Re: Execute sqoop on NiFi


Hi @regie canada, check my blog post on this subject:

How to run Sqoop from NiFi

Boris
