Member since: 07-07-2022
Posts: 41
Kudos Received: 2
Solutions: 8
My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
| | 73 | 07-27-2022 12:05 PM |
| | 88 | 07-25-2022 05:01 AM |
| | 44 | 07-21-2022 02:03 AM |
| | 43 | 07-21-2022 01:52 AM |
| | 81 | 07-19-2022 10:48 AM |
08-04-2022
03:19 AM
@araujo Changing a datatype in the source DB in a production environment would not be a good solution. Is there any other approach to handle the MySQL JSON datatype in NiFi when reading from the MySQL binary log? Thanks,
08-03-2022
10:39 PM
Hi, I connected two MySQL servers and tried to replicate both servers' databases. Both servers' data is received by their respective CaptureChangeMySQL processors. One server's data is sent to the destination DB through the EnforceOrder processor, but the processor throws an error for the second server's FlowFiles. Can anyone suggest how to resolve this issue? Error log: 2022-08-04 10:19:01,530 WARN [Timer-Driven Process Thread-3] o.a.n.processors.standard.EnforceOrder EnforceOrder[id=9c7bbf3f-df40-3d1a-6cc0-087c9d2052ce] Skipped, FlowFile order was 10 but current target is 897. StandardFlowFileRecord[uuid=ac8e8594-17b8-4e2a-872b-44af8dfea43b,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1659588192914-1, container=default, section=1], offset=3466, length=384],offset=0,name=ac8e8594-17b8-4e2a-872b-44af8dfea43b,size=384]
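EnforceOrder tracks one target order per group, so FlowFiles from two independent CDC sources cannot share a single ordering sequence. A hedged sketch of the processor settings that would keep each server in its own ordering group (the `source.server` attribute is a hypothetical one you would set, e.g. with an UpdateAttribute processor after each CaptureChangeMySQL; `cdc.sequence.id` is, to my understanding, the sequence attribute CaptureChangeMySQL writes on its output FlowFiles):

```
# EnforceOrder -- sketch, not verified against this flow
Group Identifier : ${source.server}
Order Attribute  : cdc.sequence.id
```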
Labels:
- Apache NiFi
08-03-2022
04:15 AM
Hi, I want to connect two different MySQL database services (MySQL 5.7 and MySQL 8.0). I am using two CaptureChangeMySQL processors for this. MySQL 5.7 is installed on my local machine, and its CaptureChangeMySQL processor connects successfully. MySQL 8.0 is installed in a Docker container mapped to host port 3310 (sudo docker run --name mysql-8.0-container -p 127.0.0.1:3310:3306 -e MYSQL_ROOT_PASSWORD=password -d mysql:8.0). The CaptureChangeMySQL processor connecting to MySQL 8.0 gives an error. Can anyone suggest why I am getting this error and how to resolve it? Note: I am able to connect to MySQL 8.0 using MySQL Workbench. Error: 2022-08-03 15:59:58,863 ERROR [Timer-Driven Process Thread-4] o.a.n.c.m.processors.CaptureChangeMySQL CaptureChangeMySQL[id=352daba9-0182-1000-a02a-faf160ab1329] Processing failed org.apache.nifi.processor.exception.ProcessException: Could not connect binlog client to any of the specified hosts due to: Client does not support authentication protocol requested by server; consider upgrading MySQL client at org.apache.nifi.cdc.mysql.processors.CaptureChangeMySQL.setup(CaptureChangeMySQL.java:664) at org.apache.nifi.cdc.mysql.processors.CaptureChangeMySQL.onTrigger(CaptureChangeMySQL.java:678) at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1283) at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:214) at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:103) at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) Thanks,
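The "Client does not support authentication protocol requested by server" message usually means the binlog client library does not support caching_sha2_password, the default authentication plugin in MySQL 8.0. A sketch of a common workaround, assuming the account used by NiFi is 'root'@'%' with the password from the docker run command above, and that the legacy plugin is acceptable in your environment:

```sql
-- Run against the MySQL 8.0 instance (e.g. via the mysql client on port 3310).
-- Switches the account to the older plugin that binlog clients understand.
ALTER USER 'root'@'%' IDENTIFIED WITH mysql_native_password BY 'password';
FLUSH PRIVILEGES;
```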
Labels:
- Apache NiFi
07-28-2022
12:05 AM
Hi, I want to move source DB data to a destination DB. The source table has a 'metadat' column with dataType='JSON'. I am using the CaptureChangeMySQL processor to read change data from the MySQL binary log and put the data into the destination DB. The processor reads the change data, but the value of the 'metadat' JSON column shows up in an encoded format. Can anyone suggest how to decode the encoded data in the CaptureChangeMySQL FlowFile output using another NiFi processor? Original data in db: {"mappedProducts": ["20-4040", "20-4041", "20-4042", "20-4043"]} Flowfile output: [ { "id" : 18, "name" : "Transitional Licenses - Provider", "external_id" : "101", "product_tag" : null, "tenant_id" : 1, "metadata" : "\u0000\u0001\u0000I\u0000\u000B\u0000\u000E\u0000\u0002\u0019\u0000mappedProducts\u0004\u00000\u0000\f\u0010\u0000\f\u0018\u0000\f \u0000\f(\u0000\u000720-4040\u000720-4041\u000720-4042\u000720-4043", "duration_in_days" : null, "offline_completion_duration" : null, "product_code" : "cps_provider_7_ed", "external_code" : null, "external_url" : null, "short_description" : "This course is for Providers", "long_description" : "This course is for Providers", "meta_keyword" : null, "meta_description" : null, "meta_title" : null, "status" : "Active", "event_type" : "ESSENTIALS", "event_participation_type" : "REGISTER_ILE", "terms" : null, "product_type_id" : 1, "is_perpetual" : null, "sort_order" : "20", "transitional_sort_order" : null, "completion_cert_type" : null, "ecard_cert_type" : null, "ecard_validity" : null, "ecard_name" : null, "is_deleted" : "{}", "created_at" : "Thu Mar 04 13:15:53 IST 2021", "updated_at" : "Thu Mar 04 13:15:53 IST 2021", "created_by" : null, "updated_by" : null, "is_legacy" : "{0}", "has_ce" : "{}", "region" : "CA", "self_registration_override" : "{}", "allow_assignment" : "{0}", "default_license_type" : null, "dispatch_confirmation" : "{}", "multi_package" : "{}" } ]
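The \u0000... bytes are MySQL's internal binary JSON encoding, which the binlog client passes through undecoded. One possible workaround, assuming you are able to alter the source table (the table name below is hypothetical; 'metadat' is the JSON column from the post): add a STORED generated column that casts the JSON to text. Stored generated columns are materialized in the row, so they appear in row-based binlog events as an ordinary string that NiFi can handle.

```sql
-- Hypothetical table name; adjust to the real source table.
ALTER TABLE source_table
  ADD COLUMN metadat_text TEXT
  GENERATED ALWAYS AS (CAST(metadat AS CHAR)) STORED;
```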
Labels:
- Apache NiFi
07-27-2022
12:05 PM
@cnelson2 "Are you specifying the database to use in your DBCP connection pool service?" -> Yes. I solved this issue: I set Catalog Name to the destination DB name and left the Schema Name value empty in the PutDatabaseRecord processor. Now I am able to perform insert, update and delete actions from source to the destination 'user' table. Thanks,
07-27-2022
06:59 AM
Hi, I am getting the error "Record does not have a value for the PrimaryKey column 'HOST'" when I update data in a source DB table. This happens when the destination DB table's name is 'user' or 'users'. It works fine when the destination table's name is changed to 'report_user' (i.e., when we change the name from 'user' or 'users' to something that is not used as a MySQL internal table). I am not sure why it tries to insert or update the internal MySQL table, as I am giving the destination DB name and table name. Can anyone suggest how we can fix this issue without changing the destination table name? Error log: PutDatabaseRecord[id=ff8ca68c-8252-3652-8f55-1044ad3f1bab] Failed to put Records to database for StandardFlowFileRecord[uuid=43e64992-a3d0-4a75-bb85-8d70c1f556d7,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1658747398462-4, container=default, section=4], offset=20549, length=1214],offset=0,name=43e64992-a3d0-4a75-bb85-8d70c1f556d7,size=1214]. Routing to failure. 
org.apache.nifi.serialization.MalformedRecordException: Record does not have a value for the PrimaryKey column 'HOST' at org.apache.nifi.processors.standard.PutDatabaseRecord.normalizeKeyColumnNamesAndCheckForValues(PutDatabaseRecord.java:1367) PutDatabaseRecord processor FlowFile output: [ { "id" : 1, "first_name" : "john", "middle_name" : "s_test", "last_name" : "asjndjnas", "user_name" : "deleted_8AB-0034", "is_terms_accepted" : "true", "is_retail" : "true", "tenant_id" : 1, "email" : "sreejith.karonnan2@mailinator.com", "alternate_email" : null, "gender" : "M", "dob" : "2008-01-01", "sso_id" : null, "phone_number" : null, "status_id" : 2, "time_zone_id" : 1, "is_activated" : "true", "activated_at" : "2021-01-07 16:29:25", "is_deleted" : "false", "import_id" : null, "created_at" : "2021-01-07 16:29:25", "updated_at" : "2021-11-16 18:38:42", "created_by" : 123, "updated_by" : 3, "last_login_date" : null, "agreement_acceptance_date" : "2021-03-22 20:44:23", "user_ref_id" : "589RngS1", "is_admin" : "false", "is_distributor" : "true", "is_self_register" : "false", "ce_job_id" : null, "region" : "US", "recovery_status" : null, "hlc_user_name" : null, "allow_assignment" : false, "has_ce" : false, "product_launched_at" : null, "is_perpetual" : false, "learning_completed_at" : null, "enrolled_at" : null, "dispatch_confirmation" : false, "subscription_end_at" : null, "ecard_date" : null, "offline_completion_date" : null, "subscription_start_at" : null, "product_completed_at" : null, "online_completion_date" : null, "is_legacy" : false, "last_activity_at" : null, "multi_package" : false, "metadata" : null, "self_registration_override" : false } ] Thanks,
Labels:
- Apache NiFi
07-26-2022
10:17 PM
Hi, I have installed a NiFi setup and designed the flow to move data from a source MySQL DB to a destination MySQL DB. Now I want to know how I can set it up in dev or QA environments. I was going through the NiFi admin guide and learned about the NiFi Registry and NiFi Toolkit features. I have the doubts below: 1. Is it mandatory to install the NiFi Toolkit along with NiFi to configure clustering and manage the cluster nodes, or can we configure and manage a cluster without installing the NiFi Toolkit? 2. Is it mandatory to install NiFi Registry along with NiFi to manage flow change restrictions?
Labels:
- Apache NiFi
- NiFi Registry
07-25-2022
05:01 AM
@ckumar Are you suggesting creating two flow templates, one per MySQL database service? Or can this be achieved by just using two CaptureChangeMySQL processors in one template, each configured for its own MySQL database service? Attaching the current flow screenshot, and also one with two CaptureChangeMySQL processors as per your suggestion. Please confirm whether my understanding of your suggestion is correct.
07-25-2022
12:22 AM
Hi, Query 1: How do I connect to two different source DB instances using the CaptureChangeMySQL processor? Query 2: Is it possible to configure two different source DB instances with CaptureChangeMySQL processors so data from both instances is captured and put into a destination DB? If yes, please suggest the configuration approach. Below is my current configuration screenshot, reading data from a single instance connected to two source DBs.
Labels:
- Apache NiFi
07-22-2022
09:24 PM
Hi, how do I define a JSON datatype for an attribute in the Avro Schema Registry? Thanks, Abhishek Singh
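Avro has no native JSON type. A common approach, shown here as a sketch, is to declare the field as a nullable string in the Avro Schema Registry and let downstream processors parse the embedded JSON when needed:

```json
{ "name" : "metadata", "type" : [ "null", "string" ], "default" : null }
```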
Labels:
- Apache NiFi
07-22-2022
09:02 AM
Hi, I want to move source DB data to a destination DB. The source table has a 'metadat' column with dataType='JSON'. I am using the CaptureChangeMySQL processor to read change data from the MySQL binary log and put the data into the destination DB. The processor reads the change data, but the value of the 'metadat' JSON column shows up in an encoded format. 1. Can anyone suggest why the JSON datatype value comes out encoded like below? 2. Which format is it encoded in, and how can we get it back as JSON data to save in the destination DB? Flowfile output: [ { "id" : 18, "name" : "Transitional Licenses - Provider", "external_id" : "101", "product_tag" : null, "tenant_id" : 1, "metadata" : "\u0000\u0001\u0000I\u0000\u000B\u0000\u000E\u0000\u0002\u0019\u0000mappedProducts\u0004\u00000\u0000\f\u0010\u0000\f\u0018\u0000\f \u0000\f(\u0000\u000720-4040\u000720-4041\u000720-4042\u000720-4043", "duration_in_days" : null, "offline_completion_duration" : null, "product_code" : "cps_provider_7_ed", "external_code" : null, "external_url" : null, "short_description" : "This course is for Providers", "long_description" : "This course is for Providers", "meta_keyword" : null, "meta_description" : null, "meta_title" : null, "status" : "Active", "event_type" : "ESSENTIALS", "event_participation_type" : "REGISTER_ILE", "terms" : null, "product_type_id" : 1, "is_perpetual" : null, "sort_order" : "20", "transitional_sort_order" : null, "completion_cert_type" : null, "ecard_cert_type" : null, "ecard_validity" : null, "ecard_name" : null, "is_deleted" : "{}", "created_at" : "Thu Mar 04 13:15:53 IST 2021", "updated_at" : "Thu Mar 04 13:15:53 IST 2021", "created_by" : null, "updated_by" : null, "is_legacy" : "{0}", "has_ce" : "{}", "region" : "CA", "self_registration_override" : "{}", "allow_assignment" : "{0}", "default_license_type" : null, "dispatch_confirmation" : "{}", "multi_package" : "{}" } ] Original data in db: {"mappedProducts": ["20-4040", "20-4041", "20-4042", "20-4043"]}
Labels:
- Apache NiFi
07-21-2022
01:52 AM
1 Kudo
After enabling GTID on the MySQL service and setting the CaptureChangeMySQL processor's "Use Binlog GTID" property to true, this issue was resolved. Script to enable GTID in MySQL:

SET @@GLOBAL.ENFORCE_GTID_CONSISTENCY = WARN;
SET @@GLOBAL.ENFORCE_GTID_CONSISTENCY = ON;
SET @@GLOBAL.GTID_MODE = OFF_PERMISSIVE;
SET @@GLOBAL.GTID_MODE = ON_PERMISSIVE;

After enabling the above properties, run the query below to check the status:

SHOW STATUS LIKE 'ONGOING_ANONYMOUS_TRANSACTION_COUNT';

If the count is 0, enable GTID with:

SET @@GLOBAL.GTID_MODE = ON;

To check the GTID status:

SELECT @@GLOBAL.GTID_MODE;
SELECT @@GLOBAL.ENFORCE_GTID_CONSISTENCY;
07-20-2022
04:26 AM
Hi, I want to modify the table name in a DDL query. I am using the ReplaceText processor to modify the table name in the FlowFile's query. Currently I can achieve this because I have one source DB and only one table name to change in the destination DB, like sourceDBName: nrpsubscription, tableName: status -> destinationDBName: nrpreport, tableName: subscription_status. Now I have two source DBs (1. nrpsubscription, 2. nrpuserorg) and both have the same tableName: status. In the destination DB nrpreport, source DB 2's table should become user_org_status. I am stuck here: I don't see how to define a condition to replace the table name in the DDL query based on the schema name. It should map like below: 1. sourceDBName: nrpsubscription, tableName: status -> destinationDBName: nrpreport, tableName: subscription_status. FlowFile input: { "type" : "ddl", "timestamp" : 1658314685000, "binlog_gtidset" : "e75d07af-eb37-11ec-9d1d-a86daa745b08:1-206", "database" : "nrpsubscription", "table_name" : null, "table_id" : null, "query" : "ALTER TABLE `status` ADD COLUMN `is_test` VARCHAR(255) NULL AFTER `code`" } Expected output: { "type" : "ddl", "timestamp" : 1658314685000, "binlog_gtidset" : "e75d07af-eb37-11ec-9d1d-a86daa745b08:1-206", "database" : "nrpsubscription", "table_name" : null, "table_id" : null, "query" : "ALTER TABLE `subscription_status` ADD COLUMN `is_test` VARCHAR(255) NULL AFTER `code`" } 2. sourceDBName: nrpuserorg, tableName: status -> destinationDBName: nrpreport, tableName: user_org_status. FlowFile input: { "type" : "ddl", "timestamp" : 1658314686000, "binlog_gtidset" : "e75d07af-eb37-11ec-9d1d-a86daa745b08:1-207", "database" : "nrpuserorg", "table_name" : null, "table_id" : null, "query" : "ALTER TABLE `status` ADD COLUMN `is_test` VARCHAR(255) NULL AFTER `code`" } Expected output: { "type" : "ddl", "timestamp" : 1658314686000, "binlog_gtidset" : "e75d07af-eb37-11ec-9d1d-a86daa745b08:1-207", "database" : "nrpuserorgdb", "table_name" : null, "table_id" : null, "query" : "ALTER TABLE `user_org_status` ADD COLUMN `is_test` VARCHAR(255) NULL AFTER `code`" }
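One hedged approach: promote the 'database' field to a FlowFile attribute (e.g. with EvaluateJsonPath), then let a single ReplaceText pick the destination table name per source database using the Expression Language `ifElse` function. The property values below are a sketch for the two mappings described above, not a verified configuration:

```
# ReplaceText -- hypothetical configuration; assumes a 'database' attribute
Search Value         : ALTER TABLE `status`
Replacement Value    : ALTER TABLE `${database:equals('nrpsubscription'):ifElse('subscription_status','user_org_status')}`
Replacement Strategy : Regex Replace
```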
Labels:
- Apache NiFi
07-19-2022
10:48 AM
Hi, I have modified the SQL statement using the ReplaceText processor, but I still get the error below when I alter a table and add a new column. PutDatabaseRecord processor input JSON: { "type" : "ddl", "timestamp" : 1658252209000, "binlog_gtidset" : "e75d07af-eb37-11ec-9d1d-a86daa745b08:1-147", "database" : "nrpsubscriptiondb2306", "table_name" : null, "table_id" : null, "query" : "ALTER TABLE `nrpreportdb`.`user_org_status` ADD COLUMN `is_test` VARCHAR(255) NULL AFTER `code`" } Error log: Record had no (or null) value for Field Containing SQL: query, FlowFile StandardFlowFileRecord[uuid=bc7d51b3-e71f-4eb2-834b-fecbcff572be,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1658226664320-3, container=default, section=3], offset=37727, length=277],offset=0,name=bc7d51b3-e71f-4eb2-834b-fecbcff572be,size=277]
07-19-2022
10:32 AM
@rafy The link above says to use one more processor, PutSQL, before PutDatabaseRecord to execute a statement. We already have the PutDatabaseRecord processor to execute statements on the DB, so why should we add a PutSQL processor? Also, if any action other than DDL (insert, update, delete) is performed in the source DB, the flow will fail, because the PutSQL processor will expect a SQL statement to execute first and only on success will the next processor be called. In this case the suggestion in the provided link will not work.
07-19-2022
05:35 AM
Hi, I want to manipulate the string below in the UpdateAttribute processor. I tried the replace function etc., but it is not working for me. Can anyone suggest how to modify the string below to change the DB name and table name as per the expected output? query = ${query:replace('nrpuserorgdb2306.status', 'nrpreportdb.user_org_status')} query : "ALTER TABLE `nrpuserorgdb2306`.`status` \nADD COLUMN `is_deleted` TINYINT NULL AFTER `code`" Expected output: query : ALTER TABLE `nrpreportdb`.`user_org_status` ADD COLUMN `is_test` TINYINT NULL AFTER `code`
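A hedged guess at why the replace fails: in the actual attribute value the database and table names are wrapped in backticks, so the literal search string 'nrpuserorgdb2306.status' never occurs in the text. Including the backticks in the search string should make the literal replace match (a sketch, not verified against this flow):

```
query = ${query:replace('`nrpuserorgdb2306`.`status`', '`nrpreportdb`.`user_org_status`')}
```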
Tags:
- NiFi
- string value
Labels:
- Apache NiFi
07-18-2022
12:52 AM
I have configured a NiFi flow that reads the MySQL binary log. The insert, update and delete flows work fine for me, but when I alter a table I get an error. sourceDBName: nrpsubscriptiondb2306, tableName: product_types; destinationDBName: nrpreportdb, tableName: product_types. Alter-table NiFi error: Record had no (or null) value for Field Containing SQL: query, FlowFile StandardFlowFileRecord[uuid=8ca029c9-70aa-4e53-aec3-d37c15c8cd07,claim=StandardContentClaim [resourceClaim=StandardResourceClaim
Labels:
- Apache NiFi
07-17-2022
11:32 PM
@alim, I have implemented the same flow, reading the MySQL binary log. The insert, update and delete flows work fine for me, but when I alter a table I get an error. sourceDBName: nrpsubscriptiondb2306, tableName: product_types; destinationDBName: nrpreportdb, tableName: product_types. Alter-table NiFi error: Record had no (or null) value for Field Containing SQL: query, FlowFile StandardFlowFileRecord[uuid=8ca029c9-70aa-4e53-aec3-d37c15c8cd07,claim=StandardContentClaim [resourceClaim=StandardResourceClaim
07-17-2022
11:07 PM
@alim, I am using the CaptureChangeMySQL processor to read the MySQL binary log. This flow works fine when I configure the processor freshly, but when I stop all the processors and start them again I get the error below. Please suggest if you have any idea about this issue. 2022-07-13 20:17:16,524 ERROR [Timer-Driven Process Thread-2] o.a.n.c.m.processors.CaptureChangeMySQL CaptureChangeMySQL[id=f5df756e-0181-1000-c210-1ab5c779f675] Processing failed org.apache.nifi.processor.exception.ProcessException: java.io.IOException: COMMIT event received while not processing a transaction (i.e. no corresponding BEGIN event). This could indicate that your binlog position is invalid. at org.apache.nifi.cdc.mysql.processors.CaptureChangeMySQL.onTrigger(CaptureChangeMySQL.java:721) at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1283) at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:214) at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:103) at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: java.io.IOException: COMMIT event received while not processing a transaction (i.e. no corresponding BEGIN event). This could indicate that your binlog position is invalid. 
at org.apache.nifi.cdc.mysql.processors.CaptureChangeMySQL.outputEvents(CaptureChangeMySQL.java:992) at org.apache.nifi.cdc.mysql.processors.CaptureChangeMySQL.onTrigger(CaptureChangeMySQL.java:705) ... 10 common frames omitted 2022-07-13 20:17:16,740 INFO [blc-127.0.0.1:3306] c.g.shyiko.mysql.binlog.BinaryLogClient Connected to 127.0.0.1:3306 at mysql-bin.000025/12330 (sid:65535, cid:6)
07-17-2022
11:15 AM
Hi, I want to create and validate a schema registry dynamically. Can anyone suggest the right approach to do this? Attaching my NiFi screenshot for reference.
Labels:
- Apache NiFi
07-17-2022
11:04 AM
Hi, how can I alter a table using a NiFi processor where source dbName: org, tableName: status, and destination dbName: report, tableName: orgStatus? I am using the CaptureChangeMySQL processor to read the binary log file. Below is the JSON I am getting in the UpdateAttribute processor: {"type":"ddl","timestamp":1658078637000,"binlog_filename":"mysql-bin.000030","binlog_position":2727,"database":"nrpsubscriptiondb2306","table_name":null,"table_id":null,"query":"ALTER TABLE `nrpuserorgdb2306`.`status` \nADD COLUMN `is_deleted` TINYINT NULL AFTER `code`"} Please find my processor screenshot for this flow.
Labels:
- Apache NiFi
07-14-2022
12:44 AM
Hi, how can I define the Initial Binlog Position in the CaptureChangeMySQL processor configuration using Expression Language, so that it reads binlog entries greater than or equal to a given number?
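A sketch under the assumption that these properties support Expression Language resolved from the variable registry (verify this against your NiFi version's processor documentation; the variable names below are hypothetical):

```
# CaptureChangeMySQL -- hypothetical property values
Initial Binlog Filename : ${initial.binlog.filename}
Initial Binlog Position : ${initial.binlog.position}
```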
Labels:
- Apache NiFi
07-13-2022
09:47 AM
Can anyone suggest how to fix the above issue? It occurs when I stop all the processors and start them again; the error then appears at the CaptureChangeMySQL processor level. 2022-07-07 15:10:58,307 ERROR [Timer-Driven Process Thread-8] o.a.n.c.m.processors.CaptureChangeMySQL CaptureChangeMySQL[id=d7f0536f-0181-1000-db94-5950dddc0439] Processing failed org.apache.nifi.processor.exception.ProcessException: java.io.IOException: COMMIT event received while not processing a transaction (i.e. no corresponding BEGIN event). This could indicate that your binlog position is invalid. at Note: The issue is resolved if I delete the existing CaptureChangeMySQL processor and then create and configure it again.
07-13-2022
09:41 AM
@araujo Can you please suggest how I can fix the above issue? The CaptureChangeMySQL processor is giving the value "{}" for the BIT datatype. Processor output payload: { "type": "update", "timestamp": 1657194900000, "binlog_filename": "mysql-bin.000020", "binlog_position": 37029, "database": "nrpsubscriptiondb2306", "table_name": "user_subscription", "table_id": 137, "columns": [ { "id": 29, "name": "is_deleted", "column_type": -7, "last_value": "{}", "value": "{}" }, ] }
07-12-2022
10:35 PM
@araujo I want to convert it to a DateTime format like 2021-10-17 15:58:58, not a timestamp. Is it possible here to convert it to a datetime format like yyyy-MM-dd hh:mm:ss?
07-12-2022
10:19 AM
Hi, I want to convert some field values, like "enrolled_at":"Sun Oct 17 15:58:58 IST 2021" -> DateTime (2021-10-17 15:58:58) and "is_deleted":"{0}" -> boolean (true or false), from the JSON array below, so the converted data can be saved to the DB. How can I achieve this in NiFi? [{"id":234521,"tenant_id":1,"product_id":1,"version":"3","user_id":197172,"user_ref_id":"8TS-0516","org_id":910,"org_subscription_id":7294,"access_code":null,"order_ref_id":null,"access_code_id":null,"enrol_type":"Admin4","enrolled_at":"Sun Oct 17 15:58:58 IST 2021","subscription_start_at":"Wed Oct 13 15:58:58 IST 2021","subscription_end_at":"Sat Oct 23 15:58:58 IST 2021","product_launched_at":"Wed Oct 13 16:05:49 IST 2021","product_completed_at":"Wed Oct 13 16:15:44 IST 2021","learning_completed_at":"Wed Oct 13 16:15:44 IST 2021","subscription_status":"ACTIVE","learning_status":"COMPLETED","total_credits":null,"completion_certificate_id":3585,"ecard_id":35867,"certificate_number":"tocubg3sfbkzfgfqslnisnci","ecard_number":"1hr5w9jkpikonazrfhqquauh","online_completion_date":"Wed Oct 13 16:08:07 IST 2021","offline_completion_date":"Wed Oct 13 16:15:44 IST 2021","ecard_date":"Tue Oct 31 16:15:44 IST 2023","is_deleted":"{0}","created_by":1,"updated_by":14302,"org_Name":"International Society","assignment_id":961,"last_activity_at":"Wed Oct 13 16:15:44 IST 2021","updated_at":"Wed Oct 13 16:15:44 IST 2021","source_lms":null,"course_instance_id":"20-3583"}]
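In NiFi this would typically be done with record-oriented processors or Expression Language, but the parsing logic itself can be sketched in Python. The field names match the post; the assumption that "{0}" denotes a set bit (true) and "{}" an empty bit set (false) is mine:

```python
from datetime import datetime

def to_datetime(value: str) -> str:
    # Drop the zone name ("IST"), since %Z parsing is unreliable, then
    # reformat "Sun Oct 17 15:58:58 2021" -> "2021-10-17 15:58:58".
    cleaned = value.replace(" IST ", " ")
    parsed = datetime.strptime(cleaned, "%a %b %d %H:%M:%S %Y")
    return parsed.strftime("%Y-%m-%d %H:%M:%S")

def bit_to_bool(value: str) -> bool:
    # Assumption: "{0}" means bit 0 is set (true); "{}" means no bits set.
    return value == "{0}"

print(to_datetime("Sun Oct 17 15:58:58 IST 2021"))  # 2021-10-17 15:58:58
print(bit_to_bool("{0}"))  # True
```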
Labels:
- Apache NiFi
07-11-2022
11:06 AM
I tried the expressions below again: 1. $.{enrolled_at:replaceAll('(IST)\s',''):toDate('EEE MMM dd HH:mm:ss yyyy'):format('yyyy-MM-dd HH:mm:ss')} // got error: {enrolled_at:replaceAll does not exist. 2. ${enrolled_at:replaceAll('(IST)\s',''):toDate('EEE MMM dd HH:mm:ss yyyy'):format('yyyy-MM-dd HH:mm:ss')} // expression validation failing
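A hedged reading of the two failures: expression 1 begins with '$.{' (JsonPath syntax) instead of the Expression Language's '${', and in expression 2 the regex backslash in '\s' may need to be doubled inside the EL string literal. A corrected sketch (not verified against this flow):

```
${enrolled_at:replaceAll('(IST)\\s', ''):toDate('EEE MMM dd HH:mm:ss yyyy'):format('yyyy-MM-dd HH:mm:ss')}
```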
07-11-2022
10:18 AM
I tried the above conversion but I am getting the error below. Avro schema: { "name": "enrolled_at", "type": { "type":"int", "logicalType":"date"} }, Error log: org.apache.nifi.serialization.MalformedRecordException: Successfully parsed a JSON object from input but failed to convert into a Record object with the given schema at org.apache.nifi.json.AbstractJsonRowRecordReader.nextRecord(AbstractJsonRowRecordReader.java:162) at org.apache.nifi.serialization.RecordReader.nextRecord(RecordReader.java:50) at jdk.internal.reflect.GeneratedMethodAccessor653.invoke(Unknown Source) Caused by: com.jayway.jsonpath.InvalidPathException: Function with name: {literal does not exist.
07-11-2022
08:36 AM
I have got the solution for this.