Member since: 01-27-2023
Posts: 229
Kudos Received: 73
Solutions: 45
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1334 | 02-23-2024 01:14 AM |
| | 1706 | 01-26-2024 01:31 AM |
| | 1125 | 11-22-2023 12:28 AM |
| | 2775 | 11-22-2023 12:10 AM |
| | 2859 | 11-06-2023 12:44 AM |
07-11-2023
11:58 PM
Have you extracted the JDBC zip (not the ODBC one) correctly? Have you pointed NiFi to the full path of the folder? Have you defined the Database Connection URL, Database Driver Class Name, and Database Driver Location correctly?
07-11-2023
04:21 AM
A screenshot of what? I have no template which I can export, so I cannot share any XML files 🙂
07-11-2023
04:08 AM
Yes, you do 🙂 You will point the JAR location to that entire folder and let NiFi do its job. The JAR files (all of them) can be downloaded directly from Google: https://cloud.google.com/bigquery/docs/reference/odbc-jdbc-drivers There is only a single version for JDBC: SimbaJDBCDriverforGoogleBigQuery42_1.3.3.1004
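For illustration, the layout after extracting the zip could look roughly like the sketch below. The path and the jar name are placeholders, not the exact contents of your download:

```
# Hypothetical location on the NiFi server -- adjust to your environment
/opt/nifi/drivers/SimbaJDBCDriverforGoogleBigQuery42_1.3.3.1004/
    GoogleBigQueryJDBC42.jar    # the Simba driver jar (exact name may differ per release)
    *.jar                       # all dependency jars from the zip stay in the same folder

# Database Driver Location in the DBCPConnectionPool then points at the whole folder:
# /opt/nifi/drivers/SimbaJDBCDriverforGoogleBigQuery42_1.3.3.1004/
```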
07-10-2023
11:59 PM
@Sivaluxan, I am not quite sure you have the correct Database Driver Class Name. I am extracting data out of BigQuery using the combination of GenerateTableFetch and ExecuteSQLRecord and I receive no error message at all. In terms of configuration I have the following:
- Database Connection URL: jdbc:bigquery://https://www.googleapis.com/bigquery/v2:443;ProjectId=your-project-id-here;OAuthType=0;OAuthServiceAcctEmail=your-email-for-service-account-here;OAuthPvtKeyPath=path_on_nifi_server_where_the_service_account_json_is_located;
- Database Driver Class Name: com.simba.googlebigquery.jdbc.Driver
- Database Driver Location: full_path_to_jars_location/
07-10-2023
08:44 AM
@steven-matison thanks for your answer 🙂 You can download a template here: download. Instead of GenerateFlowFile, I have another processing section, but nevertheless, the relevant part starts with AttributesToJSON and goes up to the PutBigQuery processors 🙂
07-10-2023
12:15 AM
@stevenmatison, @MattWho, @SAMSAL : have you ever encountered such a behavior? 😁
07-07-2023
07:05 AM
Hi guys,

Please help me out with a strange behavior when using PutBigQuery. I am using Apache NiFi 1.19.1. My flow is as follows:

Step a: I have a GenerateTableFetch and an ExecuteSQLRecord, which extract some data out of a database.

Step b: The data gets loaded into a GCS bucket, using PutGCSObject.

Step c: Once the data has been saved into the GCS bucket, I have an UpdateAttribute processor linked to the success queue. Within this UpdateAttribute processor, I have defined the following 3 attributes:
- TABLE_NAME = ${generatetablefetch.tableName:toUpper()}
- EXECUTION_DATE = ${now():toNumber()}
- MESSAGE = 1

Step d: The success queue is linked afterwards to an AttributesToJSON processor. I have modified the properties as follows:
- Destination = flowfile-content
- Attributes List = TABLE_NAME, EXECUTION_DATE, MESSAGE

Step e: Via success, I link to a ConvertRecord, where I change from JSON to AVRO. The JSON Reader and the AVRO Writer are both defined with the following schema:

{
  "namespace": "example.avro",
  "type": "record",
  "name": "DOMAIN.LOGGING_STATUS_EXECUTION",
  "fields": [
    { "name": "TABLE_NAME", "type": "string" },
    { "name": "EXECUTION_DATE", "type": [ "null", { "type": "long", "logicalType": "local-timestamp-millis" } ] },
    { "name": "MESSAGE", "type": "int" }
  ]
}

Step f: My first test was with PutBigQueryBatch. I have defined my Dataset, my Table Name, Load File Type = AVRO, Create Disposition = CREATE_IF_NEEDED and Write Disposition = WRITE_APPEND. When executing the processor on the AVRO file (from step e), the data gets loaded correctly into my BigQuery table.

My second test was with PutBigQuery. I have defined my Dataset, my Table Name, the Record Reader as an AVRO Reader using the embedded AVRO schema, and Transfer Type = BATCH. When executing the processor on the AVRO file (from step e), the data gets loaded into my BigQuery table, but all the values are NULL ... and no matter how long I wait, they remain NULL.

Here is a screenshot of how the data looks in the same table, where row 1 = PutBigQuery and row 2 = PutBigQueryBatch, using the same flow on the same data.

The table has the following column data types and no partitioning:
- TABLE_NAME = STRING
- EXECUTION_DATE = DATETIME
- MESSAGE = INTEGER

Has anybody else experienced this behavior and if yes, how did you solve it? Thank you 🙂
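For what it's worth, one way to double-check the Avro content before it reaches PutBigQuery is a quick script like the sketch below. It assumes Python with the fastavro package is available and that the flowfile content from step e has been saved locally; the file name is a placeholder:

```python
# Inspect the Avro flowfile produced by ConvertRecord (step e).
# "logging_status_execution.avro" is a placeholder for the downloaded flowfile content.
from fastavro import reader

with open("logging_status_execution.avro", "rb") as fo:
    avro_reader = reader(fo)
    print(avro_reader.writer_schema)   # the embedded schema the AVRO Reader will see
    for record in avro_reader:
        print(record)                  # e.g. {'TABLE_NAME': ..., 'EXECUTION_DATE': ..., 'MESSAGE': ...}
```

If the records print with proper values here, the NULLs are being introduced on the PutBigQuery side rather than earlier in the flow.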
Labels:
- Apache NiFi
07-07-2023
12:23 AM
@Amrutham, How I would do it (and am actually doing it in production): before inserting your data into your database, you could add an UpdateRecord processor, where you define your new columns. In your UpdateRecord you will define the following:
- a Record Reader: with default settings.
- a Record Writer: no matter which format you choose, you will have to define the schema of the file, containing the newly added columns.
- a property named "/user": where you either extract the username from an attribute (assuming that you have it) or write the desired username directly in the value field.
- a property named "/insert_date": where you use NiFi's Expression Language to extract the run time in whatever format you would like. An example would be: ${now():toNumber():minus(86400000):format("yyyy-MM-dd HH:mm:ss.SSS", "Europe/Bucharest")}
Now, make sure that:
- the property name begins with "/", otherwise it will not work as you expect.
- the Replacement Value Strategy is set to Literal Value.
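To make the before/after concrete, here is a small sketch. The record fields, the username value, and the timestamp are made up for illustration; only the /user and /insert_date properties mirror the configuration above:

```
Incoming record (illustrative):
  {"id": 1, "amount": 99.5}

UpdateRecord properties (Replacement Value Strategy = Literal Value):
  /user        = ${username}     (or a hard-coded value such as etl_user)
  /insert_date = ${now():format("yyyy-MM-dd HH:mm:ss.SSS", "Europe/Bucharest")}

Outgoing record, written with the extended schema:
  {"id": 1, "amount": 99.5, "user": "etl_user", "insert_date": "2023-07-07 10:23:41.123"}
```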
07-05-2023
02:03 AM
1 Kudo
@c3turner7,
For 1: that is actually not an error and you can ignore it. Why it gets written as ERROR I cannot say, because as far as I know that line is an INFO and it causes no issue/problem.
For 2: based on my tests, you can ignore that message on Windows, as I received it constantly and NiFi works without any issue.
For 3: Try recording your screen and see if any error gets printed in the CMD window before it closes. When this happened to me, I had an issue with Java and it got printed in the logs or within the CMD window. Nevertheless, which commands are you trying to execute and why? They all have a purpose, and if you start running them blindly you might cause other problems and make your debugging much harder than it already is.
For 4: those are INFO lines, meaning that they are not affecting your application and you can ignore them.
Now, circling back to your problem, what do you mean when saying that you do not get past the loading screen? Are you staring at a blank page or is the logo displayed constantly? A screenshot might help. In addition, I strongly suggest you extract all the ERROR lines from your logs (nifi-app.log and nifi-bootstrap.log) and paste them here. How did you configure nifi.properties and bootstrap.conf?
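If it helps, a quick way to pull only the ERROR lines out of the two log files on Windows is shown below. This assumes the default logs folder inside the NiFi installation directory; adjust the paths if your logs live elsewhere:

```
REM Run from the NiFi installation directory (Windows CMD)
findstr /i "ERROR" logs\nifi-app.log
findstr /i "ERROR" logs\nifi-bootstrap.log
```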
07-03-2023
11:51 PM
Well, in this case, if everything works with the sample data, it means that there might be a problem with your data. I suggest you compare the structure of the files (yours and the sample one) and see what the differences are. Maybe your files contain some invalid characters which end up being parsed incorrectly by NiFi, or your files contain too many lines, and so on.
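If you want to rule out invalid characters, a throwaway check along these lines could help. It assumes Python is available and that the file is text; the file name and encoding are placeholders:

```python
# Scan a text file for control characters that could confuse a record reader.
# "your_file.csv" and the encoding are placeholders -- adjust to your data.
with open("your_file.csv", "r", encoding="utf-8", errors="replace") as f:
    for line_no, line in enumerate(f, start=1):
        bad = [ch for ch in line.rstrip("\r\n") if ord(ch) < 32 and ch != "\t"]
        if bad:
            print(f"line {line_no}: unexpected control characters {bad!r}")
```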