Member since: 11-16-2015
Posts: 902
Kudos Received: 664
Solutions: 249
My Accepted Solutions
Title | Views | Posted
---|---|---
| 148 | 09-30-2025 05:23 AM
| 617 | 06-26-2025 01:21 PM
| 452 | 06-19-2025 02:48 PM
| 696 | 05-30-2025 01:53 PM
| 9708 | 02-22-2024 12:38 PM
10-12-2018
03:32 AM
1 Kudo
You can use PutMongoRecord for this; the JsonTreeReader can accept "one JSON per line" input (as of NiFi 1.7.0 via NIFI-4456, also in HDF 3.2).
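For clarity, "one JSON per line" just means each record is a complete JSON object on its own line, e.g. (sample data made up):

```
{"first_name": "Jane", "age": 34}
{"first_name": "John", "age": 41}
```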
10-10-2018
03:46 PM
session.get() only returns a single FlowFile reference (or null if one is not available). Instead you want something like session.get(9), which returns a List<FlowFile> that is guaranteed not to be null but may have size < 9, so your check should work in this case. Having said that, are you sure you want to transfer them to failure? If you're planning on just routing them back to the ExecuteScript (until 9 are available), then you could just do session.rollback() or session.transfer(flowFile, Relationship.SELF).
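A minimal Jython sketch of that pattern in an ExecuteScript body (the batch size of 9 and the routing choices are just illustrations of the options above):

```python
# Illustrative ExecuteScript (Jython) body; assumes the standard session/REL_SUCCESS bindings
from org.apache.nifi.processor import Relationship

flowFiles = session.get(9)        # List<FlowFile>: never null, but may hold fewer than 9
if flowFiles.isEmpty():
    pass                          # nothing available this run
elif flowFiles.size() < 9:
    # not a full batch yet: route them back to the incoming queue instead of failing them
    session.transfer(flowFiles, Relationship.SELF)
else:
    # a full batch of 9: do your processing here, then send them downstream
    session.transfer(flowFiles, REL_SUCCESS)
```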
10-05-2018
05:32 PM
The ReplaceText puts the CLOB data directly into the content, which is why the error message occurs. You could use a PreparedStatement to get around this, but you very likely don't want to use an attribute, as the CLOB is large. The alternative is PutDatabaseRecord: you can provide a JsonTreeReader (with a schema) and it will generate the prepared statement, but it uses the content to get the CLOB data rather than an attribute.
If they won't know the schema (but it only contains primitive types), they could use InferAvroSchema. However, it is preferable to provide the schema directly via the Record Reader and/or a Schema Registry.
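As a sketch, the reader's schema could look something like this (the record/field names below are made up; the point is that the CLOB column is simply declared as a string, so the reader pulls it from the flow file content):

```json
{
  "type": "record",
  "name": "clob_table_row",
  "fields": [
    { "name": "id", "type": "long" },
    { "name": "clob_column", "type": "string" }
  ]
}
```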
09-24-2018
02:24 PM
You should be able to use ValidateRecord for this, although your reader schema might need to treat the incoming field as an int/long so it will read in negative numbers successfully. Then your write schema can have that field with its correct type (e.g. date). You can also use QueryRecord for this; it allows you to query the flow file fields using SQL, such as "SELECT * FROM FLOWFILE WHERE columnDate >= 0" or whatever is appropriate.
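As a rough sketch of that reader/writer split (field name taken from the query above, everything else assumed), the relevant field declarations could look like:

```
Reader schema field:  { "name": "columnDate", "type": "int" }
Writer schema field:  { "name": "columnDate", "type": { "type": "int", "logicalType": "date" } }
```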
09-21-2018
03:02 AM
Can you explain more about your use case? Why do you need to extract ~100 fields, and how are they being used downstream? I wonder if a record-based processor or something would be more appropriate...
09-19-2018
05:54 PM
2 Kudos
If you know the fields, you can use JoltTransformJSON on the original input JSON so you don't have to use SplitJson. Here's a spec that will do the (explicit) field name conversion:

[
  {
    "operation": "shift",
    "spec": {
      "data": {
        "Table": {
          "*": {
            "Age": "data.Table[#2].age",
            "FirstName": "data.Table[#2].first_name",
            "LastName": "data.Table[#2].last_name"
          }
        }
      }
    }
  }
]

You could also use UpdateRecord; you'd just need separate schemas for the JsonTreeReader and JsonRecordSetWriter.
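To illustrate, assuming the incoming JSON has a data.Table array of objects with those three fields (sample values made up), the spec produces output roughly like this:

```
Input:
{ "data": { "Table": [ { "Age": 34, "FirstName": "Jane", "LastName": "Doe" } ] } }

Output:
{ "data": { "Table": [ { "age": 34, "first_name": "Jane", "last_name": "Doe" } ] } }
```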
09-19-2018
05:17 PM
1 Kudo
You can use JoltTransformJSON to convert this JSON into just the "name: value" pairs used by PutDatabaseRecord. Here is a spec that will do it:

[{
  "operation": "shift",
  "spec": {
    "columns": {
      "*": {
        "@(value)": "[#1].@(1,name)"
      }
    }
  }
}]

If you find that PutDatabaseRecord is slow, it's likely because you're putting one record at a time into the DB. Instead, consider using MergeRecord to bundle together more CDC events; then you can use the following spec to transform all of them for use in PutDatabaseRecord as a "micro-batch":

[{
  "operation": "shift",
  "spec": {
    "*": {
      "columns": {
        "*": {
          "@(value)": "[#4].@(1,name)"
        }
      }
    }
  }
}]
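For context, this assumes each CDC event carries a "columns" array of name/value entries, something like the sketch below (column names/values are made up and other CDC fields are omitted); the specs lift each value out under its column name so PutDatabaseRecord sees ordinary field/value records:

```
{
  "columns": [
    { "name": "id",         "value": 1 },
    { "name": "first_name", "value": "Jane" }
  ]
}
```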
09-06-2018
02:27 AM
Maybe try without the string() function around it?! I'm not sure, since I used the transform above and it worked...
08-29-2018
05:59 PM
3 Kudos
I believe your "total" variable is local to the method, so you won't be able to refer to it later. Try changing the references to "self.total"; that makes the variable a member of the class instance, so you can get at it later with ConvertFilesData.total. Also note that putAttribute() expects a String for the value of the attribute, so you'll need str(ConvertFilesData.total) there.
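A minimal Jython sketch of those two points (the class name comes from your script; the method, attribute name, and instance variable here are made up):

```python
# Illustrative Jython/ExecuteScript sketch, not your full script
class ConvertFilesData(object):
    def __init__(self):
        self.total = 0            # lives on the instance, not just inside one method

    def add(self, count):
        self.total += count       # other methods can update it via self

converter = ConvertFilesData()
converter.add(5)

flowFile = session.get()
if flowFile is not None:
    # putAttribute() wants a String value, hence the str(...) conversion
    flowFile = session.putAttribute(flowFile, 'files.total', str(converter.total))
    session.transfer(flowFile, REL_SUCCESS)
```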
08-27-2018
07:27 PM
The NiFi 1.5.0 distribution should only include 1.5.0 NARs; I'm not sure where that 1.1.0 NAR came from, but it should certainly be replaced with the official 1.5.0 release version. You can get the nifi-hive-nar-1.5.0.nar from your NiFi distribution, or you can download it separately from the Apache repository.