A colleague and I noticed similar behaviour with BLOBs and a legacy Oracle database we were using (Community link here). To extract the data we built an external Java program (JDBC + ResultSets) that reads the BLOB bytes and writes files to disk for NiFi to ingest. Alternatively, the entire process can be managed inside NiFi using the ExecuteScript processor with Groovy's database helper libraries (or whatever scripting language you are comfortable with). Whichever of these techniques you choose, you will still have to maintain your own business logic to track your delta records. Not the most elegant solution, but we just could not get the bundled NiFi processors to play nicely with the Oracle BLOBs.
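The external Java approach mentioned above might look something like the sketch below. It is only an illustration, not our actual program: the class name, the helper method, and the table/column names in the usage comment are all hypothetical. The core idea is to stream each BLOB's bytes straight to a file rather than buffering them in memory.

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.sql.Blob;
import java.sql.SQLException;

public class BlobExporter {

    // Stream a BLOB's bytes to a file on disk for NiFi to pick up.
    public static void writeBlobToFile(Blob blob, Path target)
            throws SQLException, IOException {
        try (InputStream in = blob.getBinaryStream()) {
            Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
        }
    }

    // Typical JDBC usage (hypothetical query and column names):
    //
    // try (ResultSet rs = stmt.executeQuery("SELECT id, payload FROM docs")) {
    //     while (rs.next()) {
    //         writeBlobToFile(rs.getBlob("payload"),
    //                         outDir.resolve(rs.getString("id") + ".bin"));
    //     }
    // }
}
```

NiFi can then watch the output directory with a ListFile/FetchFile pair and ingest the extracted files.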
We used both QueryDatabaseTable and ExecuteSQL. The CLOB is large, and our quick-and-dirty workaround was to create a view, but I had to limit the column to 1000 characters (see below) because the processors still threw an error even at the 4000-character maximum that Oracle supports when converting a CLOB.
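A view along these lines illustrates the workaround (the table and column names here are hypothetical, not the original schema); `DBMS_LOB.SUBSTR` truncates the CLOB so the driver sees a plain VARCHAR2:

```sql
-- Hypothetical names; truncates the CLOB to its first 1000 characters
-- so the NiFi processors receive an ordinary VARCHAR2 column.
CREATE OR REPLACE VIEW docs_for_nifi AS
SELECT id,
       DBMS_LOB.SUBSTR(payload_clob, 1000, 1) AS payload_text
FROM   docs;
```

QueryDatabaseTable or ExecuteSQL can then be pointed at the view instead of the base table.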