Member since: 04-05-2016
Posts: 188
Kudos Received: 19
Solutions: 11

My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
| | 914 | 10-30-2017 07:05 AM |
| | 1197 | 10-12-2017 07:03 AM |
| | 4860 | 10-12-2017 06:59 AM |
| | 7127 | 03-01-2017 09:56 AM |
| | 21304 | 01-26-2017 11:52 AM |
01-26-2017 06:09 AM
Hi everyone, I was trying to insert into an Oracle 12c DB and got the error "cannot create PoolableConnectionFactory (ORA-28000: the account is locked)". I tried running the generated insert statement manually and it worked fine, but NiFi gives the account-is-locked error. What could possibly be the cause of this error?
Labels:
- Apache NiFi
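ORA-28000 is raised by the database, not by NiFi: the account's profile (typically its FAILED_LOGIN_ATTEMPTS limit) has locked the user, so every connection the pool opens is rejected even though the SQL itself is valid. A minimal sketch of how a DBA might confirm and clear the lock, assuming the pool connects as a hypothetical user NIFI_USER:

```sql
-- Check the account status (NIFI_USER is a hypothetical name):
SELECT username, account_status, lock_date
FROM   dba_users
WHERE  username = 'NIFI_USER';

-- If ACCOUNT_STATUS reports LOCKED, unlock it:
ALTER USER NIFI_USER ACCOUNT UNLOCK;
```

If the lock keeps recurring, the credentials configured in the flow's DBCPConnectionPool controller service are worth re-checking, since each failed pool connection counts toward the same limit.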
01-11-2017 12:57 PM
I used the fragment.index attribute to remove the headers and successfully loaded into Oracle DB. Thank you @Matt and @Matt Burgess.
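For later readers, a minimal sketch of the fragment.index approach mentioned above, assuming one record per split and a hypothetical RouteOnAttribute property named header (SplitText numbers its splits from 1, so the first split carries the CSV header):

```
RouteOnAttribute
  Routing Strategy: Route to Property name
  header: ${fragment.index:equals('1')}   # matches the header split

# Auto-terminate the 'header' relationship and send 'unmatched'
# onward to the rest of the flow (ExtractText -> ReplaceText -> PutSQL).
```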
01-11-2017 10:01 AM
@Matt I have not found a way to remove the headers in the CSV file.
01-10-2017 06:28 PM
@Matt Please, how do I extract the headers without using the SplitText processor?
01-10-2017 11:25 AM
Hi @Matt Burgess, the first SQL statement is the issue. I can see it is picking up the headers from the CSV file even though I skipped the headers with the Header Line Count parameter in SplitText:

2017-01-10 12:07:13,538 ERROR [Timer-Driven Process Thread-40] o.apache.nifi.processors.standard.PutSQL PutSQL[id=79bf32ff-e154-1a02-b109-ebd298dfab2e] Failed to update database due to a failed batch update. There were a total of 1 FlowFiles that failed, 0 that succeeded, and 99 that were not execute and will be routed to retry;
01-10-2017 10:15 AM
Hi @Matt, there is no OOME error in the NiFi app log. Also, the SplitText processor splits the files successfully; the issue seems to be at the PutSQL processor. Although I set the Header Line Count parameter in SplitText to skip 2 lines, the insert statements reaching PutSQL have the CSV file headers as the values.
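As a hedged aside, this behavior matches SplitText's documented semantics: Header Line Count does not discard header lines, it treats them as a header that is copied into every split, which is why the header values keep reaching PutSQL. A minimal sketch of the properties involved:

```
SplitText
  Line Split Count: 1    # one CSV record per split FlowFile
  Header Line Count: 0   # a non-zero value copies the header lines into
                         # EVERY split; it does not remove them
```

With Header Line Count at 0, the header simply becomes the first split, which can then be routed away via fragment.index (see the 01-11-2017 post above).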
01-09-2017 10:04 AM
I have a flow that picks up zipped CSV files and loads them into a table in Oracle 12c. It was tested successfully with 20 records, but when I try it with a larger CSV file (over 40k records) I get the error message "Failed to update DB due to a failed batch update. There were a total of 1 flowfiles that failed, 0 that succeeded and 99 that were not executed and will be routed to retry;". I currently have the batch size set to 100 on the PutSQL processor. Is there any setting I am missing?

GetSFTP---UnpackContent---SplitText---ExtractText---ReplaceText---PutSQL
Labels:
- Apache NiFi
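The counts in the error are consistent with the Batch Size of 100: one FlowFile in the batch holds a bad INSERT (very likely the header row, per the follow-up posts above) and the remaining 99 are rolled back and routed to retry. For context, a minimal sketch of the ExtractText/ReplaceText pattern this flow implies, with hypothetical attribute, table, and column names:

```
ExtractText
  csv: ^([^,]*),([^,]*),([^,]*)$   # capture groups become csv.1, csv.2, csv.3

ReplaceText
  Replacement Strategy: Always Replace
  Replacement Value: INSERT INTO MY_TABLE (COL_A, COL_B, COL_C)
                     VALUES ('${csv.1}', '${csv.2}', '${csv.3}')
```

A header split reaching PutSQL through this path would try to insert the column names themselves as values, which is one plausible way a single FlowFile in a batch could fail.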
12-30-2016 08:20 PM
@Chris Nauroth Then there must be an issue: after 7 hours there is still no reduction in HDFS storage, and it still keeps the /trash/finalized folder...
12-30-2016 02:58 PM
@Chris Nauroth Hi, how soon does the normal block deletion start once the upgrade is finalized?
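As a hedged aside on the mechanics: after the upgrade is finalized, the NameNode instructs DataNodes to delete their retained pre-upgrade block copies, and that deletion runs asynchronously in the background rather than completing immediately. A minimal sketch of how one might finalize and then watch usage fall, assuming shell access as the HDFS superuser:

```
# Finalize the upgrade; DataNodes then begin purging their
# retained pre-upgrade block copies in the background:
hdfs dfsadmin -finalizeUpgrade

# Watch "DFS Used" shrink as the purge proceeds:
hdfs dfsadmin -report | grep "DFS Used"
```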
12-22-2016 01:35 PM
@David Kjerrumgaard I found out the issue was with my CSV file; it had to do with the column names I was ingesting in the CSV file. Also, turning the Obtain Generated Keys property to false worked in my case.
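For completeness, a hedged sketch of the setting mentioned above (whether it bites depends on the Oracle JDBC driver in use):

```
PutSQL
  Obtain Generated Keys: false   # asking the driver to return generated
                                 # keys can cause batch updates to fail
                                 # with some Oracle JDBC drivers
```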