I'm trying to migrate data from an Oracle database to a Postgres database. I'm using an example I found online, but it's very vague about how the processors need to be set up. The red text under the graphic gives the instructions for the diagram.
First of all, both of my DB connections are working; I've verified them. My ListDatabaseTables processor is set up to get 4 tables, and I can see the table names in the processor's view state. The UpdateAttribute processor has had 2 attributes added per the instructions; I don't really know what else to do with it. Still, 4 FlowFiles are created and flow through successfully. The next processor, ExecuteSQL, uses the following SQL select query: SELECT DBMS_METADATA.GETDDL('TABLE',u.table_name) FROM USER_ALL_TABLES u WHERE u.nested='NO' AND (u.iot_type is null or u.iot_type='IOT')
I don't see any output anywhere, and when I run the SQL in SQL Developer, I get nothing. Even so, 4 FlowFiles pass through successfully. For the ConvertAvroToJSON processor, I just used the defaults; again, 4 FlowFiles pass through successfully. I'm very new to this and don't really know what I'm doing. I'm looking at the data provenance and can see things happening, but I don't understand what is happening, and I can't find a way to view the contents of the FlowFiles to see what is being passed through. It has the appearance of working, but how do I prove it, and what am I doing wrong? I haven't gotten past the ConvertAvroToJSON portion of the example because I've stopped there. The ExecuteScript processor also has an error because I have no script in it yet; again, still learning!
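One possible reason the query returns nothing in SQL Developer: Oracle's metadata package exposes this function as DBMS_METADATA.GET_DDL (with an underscore), not GETDDL. A sketch of the same query with the documented function name, which you could try as a comparison:

```sql
-- Same query as above, but using the documented function name
-- DBMS_METADATA.GET_DDL (note the underscore).
SELECT DBMS_METADATA.GET_DDL('TABLE', u.table_name)
FROM USER_ALL_TABLES u
WHERE u.nested = 'NO'
  AND (u.iot_type IS NULL OR u.iot_type = 'IOT');
```

If this version also returns no rows, the WHERE clause may simply be filtering out all of your tables, which is worth checking separately.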
I received some sort of graphic on my NiFi question on the Cloudera Community board that said you were the person to go to. https://community.cloudera.com/t5/Support-Questions/Need-help-with-Database-migration-with-NiFi-1-9-.... Would you mind taking a look and letting me know what I'm missing?
Assuming you're referring to the icon to the right of the text "last edited on 12-05-2019 10:35 PM by", all it is telling you is that I (a moderator) edited your post. In this case, it was to add the label at the top so that our internal search indexer knows that your question is related to NiFi.
You can download and view the data from any connection to inspect it as it passes through your dataflow. You can also do the same by looking at each reported provenance event; however, viewing or downloading the content of a FlowFile via a provenance event is only possible if that content still exists in the NiFi content_repository.
Here is my suggestion:
1. Stop all the processors in your dataflow
2. Start only the first processor and you will see data queue on the connection leading to the next processor.
3. Right click on the connection with the queued data and select "List Queue" from the context menu that is displayed.
4. Click the "view details" icon on the far left side of any FlowFile listed in the table displayed.
5. From the "FlowFile" UI that is displayed you can select the "Download" or "View" buttons to get access to the content as it exists at this point in your dataflow.
When you are done examining the content, repeat steps 3-5 after starting the next processor in your dataflow. This allows you to see how your content is changing as it progresses through your dataflow, one processor at a time.
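If clicking through the UI for every connection gets tedious, the same queue listing and content download can also be done against NiFi's REST API. The sketch below assumes an unsecured NiFi on localhost:8080; the connection id is a placeholder you would replace with the id of the connection you want to inspect (shown in its configuration dialog).

```shell
#!/bin/sh
# Sketch: inspect queued FlowFile content via the NiFi REST API.
# NIFI and CONN_ID are placeholders -- substitute your own host and
# the id of the connection whose queue you want to inspect.
NIFI="http://localhost:8080/nifi-api"
CONN_ID="your-connection-id"

# 1. Ask NiFi to build a listing of the FlowFiles queued on the connection.
REQ=$(curl -s -X POST "$NIFI/flowfile-queues/$CONN_ID/listing-requests")

# 2. Pull the listing-request id out of the JSON response
#    (python3 used here for JSON parsing; jq works just as well).
REQ_ID=$(echo "$REQ" | python3 -c \
  'import sys, json; print(json.load(sys.stdin)["listingRequest"]["id"])')

# 3. Fetch the completed listing; each entry includes the FlowFile's uuid.
curl -s "$NIFI/flowfile-queues/$CONN_ID/listing-requests/$REQ_ID"

# 4. Download the content of one FlowFile by its uuid.
curl -s "$NIFI/flowfile-queues/$CONN_ID/flowfiles/<flowfile-uuid>/content" \
  -o flowfile-content.bin
```

This is only a convenience; the UI steps above accomplish exactly the same thing, and for four FlowFiles the UI is probably quicker.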
Hope this helps you,