Member since: 11-18-2021
Posts: 6
Kudos Received: 2
Solutions: 1

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 1190 | 12-15-2021 05:23 AM |
10-08-2022 10:15 PM
1 Kudo
The ReplaceText processor has a prepend/append mode which might be of help.
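Roughly, the properties involved look like this (names from memory, so double-check them against your NiFi version):

```
Replacement Strategy : Prepend          (or Append)
Replacement Value    : <text to add>
Evaluation Mode      : Entire text      (Line-by-Line applies it to every line instead)
```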
02-17-2022 09:05 AM
I have a flow that uses the PutBigQueryBatch processor, and I would like it to truncate the table before inserting the data. I defined an Avro schema and previously created the table in BigQuery, specifying how I wanted the fields.

I am aware that setting the "Write Disposition" property to "WRITE_TRUNCATE" will truncate the table. However, when I use this option, the schema of the table in BigQuery ends up being deleted, which I would not like to happen, and a new schema is created to record the data. I understand that the "Create Disposition" property exists, and that if "CREATE_NEVER" is selected the schema should be respected and not deleted. When I run the processor with "Write Disposition" set to "WRITE_APPEND", the schema I created in BigQuery is respected, but with "WRITE_TRUNCATE" it is not.

Is there any way to use the "WRITE_TRUNCATE" option without the table schema being deleted? Am I doing something wrong? Below is the configuration I am using in the PutBigQueryBatch processor:
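For context on what the two dispositions mean at the BigQuery load-job level (which is what a batch load like this ultimately issues), here is a hedged sketch using the google-cloud-bigquery Python client. The table and field names are made up, and this is an illustration of the API semantics, not the processor configuration itself:

```python
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.AVRO,
    # WRITE_TRUNCATE rewrites the destination table: both its rows and its
    # schema come from this load job (for Avro, from the file-embedded
    # schema), not from the table that existed before the load.
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    # CREATE_NEVER only makes the job fail if the table is missing; it does
    # not stop a truncating load from replacing an existing table's schema.
    create_disposition=bigquery.CreateDisposition.CREATE_NEVER,
    # Passing the intended schema on the job is one way to pin it down;
    # for self-describing formats like Avro it is worth verifying that the
    # explicit schema is honored on your BigQuery/NiFi versions.
    schema=[
        bigquery.SchemaField("id", "INTEGER", mode="REQUIRED"),
        bigquery.SchemaField("created_at", "TIMESTAMP"),
    ],
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/data/*.avro",        # hypothetical source files
    "my-project.my_dataset.my_table",    # hypothetical destination table
    job_config=job_config,
)
load_job.result()  # wait for the load to finish
```

One pattern sometimes used when the schema must stay exactly as created is to keep WRITE_APPEND and clear the table with a separate TRUNCATE TABLE statement before the load.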
Labels:
- Apache NiFi
- Schema Registry
12-15-2021 05:23 AM
2 Kudos
I managed to solve it; I used ${field.value:substring(0,23):toDate("yyyy-MM-dd'T'HH:mm:ss.SSS"):format("yyyy-MM-dd HH:mm:ss.SSS")}
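To spell out what that expression chain does, here is an equivalent sketch in Python with a made-up input value (the substring keeps millisecond precision so the Java-style date pattern matches):

```python
from datetime import datetime

# Hypothetical input attribute value: an ISO-8601 timestamp with more
# than millisecond precision.
raw = "2021-12-15T05:23:41.123456"

trimmed = raw[:23]  # substring(0, 23): keep up to milliseconds
parsed = datetime.strptime(trimmed, "%Y-%m-%dT%H:%M:%S.%f")  # toDate(...)

# format(...): same instant, space instead of 'T', milliseconds kept
formatted = parsed.strftime("%Y-%m-%d %H:%M:%S.") + f"{parsed.microsecond // 1000:03d}"
print(formatted)  # -> 2021-12-15 05:23:41.123
```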