Member since 08-14-2019
8 Posts
0 Kudos Received
1 Solution
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 4222 | 08-23-2019 08:25 AM |
12-06-2019
07:01 AM
Bill, I received some sort of graphic on my NiFi question in the Cloudera Community board that said you were the person to go to. https://community.cloudera.com/t5/Support-Questions/Need-help-with-Database-migration-with-NiFi-1-9-2/td-p/284907. Would you mind taking a look and letting me know what I'm missing? Thanks, Doug
12-05-2019
10:57 AM
I'm trying to migrate data from an Oracle database to a Postgres database. I'm using an example I found online, but it's very vague about how the processors need to be set up. The red text under the graphic is the instructions from the diagram.
Get Table List
ListDatabaseTables: There may be a need to migrate all tables or only some of them, so the first task is to list the tables to migrate. Use the ListDatabaseTables processor to fetch the list of tables to be migrated. Edit the properties of the processor to connect to the source database, and use the Catalog, Schema Pattern, Table Name Pattern, and Table Types properties to filter the list of tables. The processor outputs a single flowfile for each table in the list and adds the following attributes to each flowfile: db.table.name, db.table.catalog, and db.table.schema. These attributes tie the source database and table to the flowfile.
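For example (the values here are my own illustration, not from the blog), leaving Catalog and Schema Pattern blank, setting Table Name Pattern to %LU%, and setting Table Types to TABLE would emit one flowfile for each table whose name contains 'LU'.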
UpdateAttribute: The ListDatabaseTables processor adds the table attributes for the source database system. To add the destination database properties, use the UpdateAttribute processor. Add the following attributes: ‘destination_table_catalog’ and ‘destination_table_schema’. The table name will be the same as in the source database (db.table.name).
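As a concrete illustration (the values are assumptions about the target layout, not part of the blog), for a Postgres destination you might set destination_table_schema to public and destination_table_catalog to the destination database name; UpdateAttribute simply adds these as flowfile attributes alongside the db.table.* attributes from the previous step.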
Create Table
ExecuteSQL: Once each flowfile carries the information about the source database, the destination database, and the table to migrate, the next step is to create the table in the destination database. Use the ExecuteSQL processor to fetch the table schema from the source database. The SQL statement for fetching the table schema varies with the type of source database. ExecuteSQL outputs an Avro flowfile containing the table schema.
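As a sketch of such a statement for an Oracle source (an assumption on my part, not necessarily the blog's query), a query against the ALL_TAB_COLUMNS dictionary view along the lines of SELECT column_name, data_type, data_length, nullable FROM all_tab_columns WHERE table_name = '${db.table.name}' returns one row per column; ExecuteSQL evaluates the ${db.table.name} expression against each incoming flowfile and serializes the result set to Avro.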
ConvertAvroToJSON: The Avro flowfile is converted to JSON format using the ConvertAvroToJSON processor.
ExecuteScript: Use the ExecuteScript processor to generate the Data Definition Language (DDL) statement that creates the table in the destination database. With ExecuteScript, a scripting language such as Python, Groovy, or Ruby can be used to read the JSON table schema and build the DDL statement.
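A minimal sketch of such a script, assuming the Jython (python) script engine in ExecuteScript and assuming the incoming JSON is a flat array of objects with COLUMN_NAME and DATA_TYPE fields (the actual field names depend entirely on the schema query used upstream, so treat them as placeholders):

```python
# ExecuteScript (Jython) sketch: read JSON column metadata from the incoming
# flowfile and replace its content with a CREATE TABLE statement.
# COLUMN_NAME / DATA_TYPE are assumed field names; adjust to match the
# output of your schema query.
import json
from org.apache.commons.io import IOUtils
from java.nio.charset import StandardCharsets
from org.apache.nifi.processor.io import StreamCallback


class BuildDdl(StreamCallback):
    def __init__(self, flowfile):
        # Attributes added by ListDatabaseTables and UpdateAttribute
        self.schema = flowfile.getAttribute('destination_table_schema')
        self.table = flowfile.getAttribute('db.table.name')

    def process(self, inputStream, outputStream):
        text = IOUtils.toString(inputStream, StandardCharsets.UTF_8)
        columns = json.loads(text)
        # Naive type handling: a real script would map Oracle types to
        # PostgreSQL types (NUMBER -> numeric, VARCHAR2 -> varchar, ...).
        cols = ', '.join('%s %s' % (c['COLUMN_NAME'], c['DATA_TYPE']) for c in columns)
        ddl = 'CREATE TABLE %s.%s (%s)' % (self.schema, self.table, cols)
        outputStream.write(bytearray(ddl.encode('utf-8')))


flowFile = session.get()
if flowFile is not None:
    flowFile = session.write(flowFile, BuildDdl(flowFile))
    session.transfer(flowFile, REL_SUCCESS)
```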
PutSQL: To execute the DDL statement and create the table in the destination database, use the PutSQL processor.
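A usage note on why the script above writes the DDL into the flowfile content: as far as I understand PutSQL, it executes the SQL statement it finds in the content of each incoming flowfile (using the configured JDBC Connection Pool for the destination database), so the CREATE TABLE text produced by ExecuteScript becomes the statement that gets run.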
Extract Data
ExecuteSQL: Use the ExecuteSQL processor to fetch the table data from the source database. Add the SQL select statement ‘select * from ${db.table.catalog}.${db.table.schema}.${db.table.name};’. This processor outputs an Avro flowfile with all the data of the source table. Alternatively, the GenerateTableFetch and QueryDatabaseTable processors can be used to extract data, which will be discussed in the next blog post.
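One detail that may not be obvious from the diagram (my reading of the processor, not the blog's wording): the ${db.table.catalog}, ${db.table.schema}, and ${db.table.name} parts are NiFi Expression Language references to the flowfile attributes set earlier, so a single ExecuteSQL processor serves all of the listed tables and produces one Avro flowfile of data per table.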
Ingest Data
PutDatabaseRecord: The PutDatabaseRecord processor reads the Avro data, creates a batch insert, and uploads the data into the table created in the destination database. Data ingestion can also be performed with ConvertAvroToJSON and PutSQL; however, PutDatabaseRecord is a faster and more direct way to upload the data. The PutSQL route can be useful if the extracted data requires some transformation first.
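To make that last step concrete (these settings are my reading of PutDatabaseRecord, not the blog's screenshots): it is typically configured with a Record Reader pointing to an AvroReader controller service, Statement Type set to INSERT, the destination Database Connection Pooling Service, and Schema Name / Table Name driven by the attributes added earlier, for example ${destination_table_schema} and ${db.table.name}.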
First of all, both of my DB connections are working; I've verified them. I have my ListDatabaseTables set up to get 4 tables, and I can view the table names in the processor's view state. The UpdateAttribute processor has had 2 attributes added per the instructions; I don't really know what else to do with it. 4 flowfiles are created, though, and flow through successfully. The next processor, ExecuteSQL, uses the following SQL select query: SELECT DBMS_METADATA.GETDDL('TABLE',u.table_name) FROM USER_ALL_TABLES u WHERE u.nested='NO' AND (u.iot_type is null or u.iot_type='IOT')
I don't see where I'm getting any output, and when I run the SQL in SQL Developer, I get nothing. Again, though, 4 flowfiles successfully pass through. For the ConvertAvroToJSON processor, I just used the defaults; again, 4 flowfiles successfully pass through. I don't know what I'm doing, I'm very new to this. I'm looking at the data provenance and see things happening, but I really don't know what is happening, and I can't find out how to view the contents of the flowfiles to see what is being passed through. It has the appearance of working, but how do I prove it, and what am I doing wrong? I haven't gotten past the ConvertAvroToJSON portion of the example because I've stopped there. However, the ExecuteScript processor has an error in it because I have no script; again, still learning!
Labels:
Apache NiFi
11-26-2019
05:18 AM
I'm having an issue where my ListDatabaseTables processor is not returning anything. I know my DBCPConnectionPool is set up properly because I have another processor using it and am able to get output. I'm using NiFi 1.9.2 set up locally on my Windows 10 machine, and it's connected to an Oracle 18c database. Catalog and Schema Pattern are blank; Table Name Pattern: %LU% (I want all tables with 'LU' in the name); Table Types: Table; Include Count: False; Refresh Interval: 0 sec.
Labels:
Apache NiFi
09-10-2019
07:12 AM
I too am attempting to migrate from Oracle to Postgres. I am using the template this blogger recommends but am unable to get it to work with NiFi 1.9.2: https://blog.pythian.com/database-migration-using-apache-nifi/. Any help would be much appreciated!
08-23-2019
08:25 AM
Thanks to all for your insights. What I ended up doing was installing NiFi locally on my machine, still connected to my database server on the network. I had to mess with the DB Connection Pool a bit to get the correct combination of Database Connection URL, user id, and password, but I got it to write successfully to my local hard drive. Fortunately, I had a limit of 45 files set for the directory in my PutFile processor, because I got 45 flowfiles in a heartbeat. So, if you're testing this out, MAKE SURE YOU LIMIT YOUR NUMBER OF FILES before you attempt to run your processes!!!
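(If it helps anyone following along, I believe the property that enforces this limit in PutFile is Maximum File Count.)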
08-19-2019
12:44 PM
More info: NiFi is installed on a VM that is not connected to the internet, and a copy of the Oracle test DB is on the VM as well. Should I use NiFi scripts to point to a local drive? I need an admin account to grant access to the drive.
08-15-2019
03:46 PM
Brand new NiFi user here. I created a process group and have the QueryDatabaseTable → PutFile processors linked. I funnel the output for failures and successes from the PutFile processor. I'm getting data passed through without issue; however, it is not writing the output file to C:\users\<username>\nifi_output on my local machine. I have changed the configuration of PutFile to try \\<machine-name>\C$\users\<username>\nifi_output for the output directory, and that hasn't worked either. I haven't received any errors. Like I said, I'm a newbie, so I don't know where to look further. Thanks!
Labels:
Apache NiFi