
PutSQL time format issue

Explorer

Hello all,

I am new to NiFi.

I have an InvokeHTTP processor that fetches a JSON file. An example of the JSON format:

{"alertDate":"2023-12-03 11:01:17","title":"XXXX","data":"XXXX, XXXX, XXXX","category":1}

I am using the ConvertJSONToSQL processor to send the data to PutSQL, which inserts it into my SQL database.

My issue is this error:

PutSQL Failed to update database for FlowFile due to The value of the sql.args.1.value is '2023-12-02 10:54:38', which cannot be converted to a timestamp

 

I understand I need to use JoltTransformJSON, but I am not sure what the correct Jolt specification is.

My DB columns are like this:

[screenshot ariel12_0-1701601600538.png: DB column definitions]

Can you please help? Remember, I am new to NiFi 🙂

Thanks a lot!


 

2 ACCEPTED SOLUTIONS

Super Guru

Hi @ariel12,

This is happening because, when no format is provided, PutSQL expects the timestamp to be in the format "yyyy-MM-dd HH:mm:ss.SSS"; that is the default format used when the date string is parsed into a Java timestamp. Since the date in your input JSON is missing the milliseconds (".SSS") and no format is provided, the processor fails with an "Unparseable date" error.
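To see why the milliseconds matter, here is a minimal Python sketch of the same strict-pattern behavior. This is an illustration only, not NiFi's actual implementation: the Python pattern "%Y-%m-%d %H:%M:%S.%f" plays the role of Java's "yyyy-MM-dd HH:mm:ss.SSS".

```python
from datetime import datetime

# Stand-in for the default timestamp pattern PutSQL expects.
DEFAULT_PATTERN = "%Y-%m-%d %H:%M:%S.%f"

def parse_like_putsql(value: str) -> datetime:
    """Parse a timestamp string against the strict default pattern."""
    return datetime.strptime(value, DEFAULT_PATTERN)

# The value from the JSON has no milliseconds, so strict parsing fails:
try:
    parse_like_putsql("2023-12-02 10:54:38")
except ValueError:
    print("unparseable date, as PutSQL reports")

# With the milliseconds appended, the same value parses fine:
print(parse_like_putsql("2023-12-02 10:54:38.000"))
```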

To fix this, you can use different processors, as follows:

1- UpdateAttribute

Find which sql.args.N.value attribute generated by the ConvertJSONToSQL processor corresponds to the alertDate value; you can then rewrite that attribute with UpdateAttribute into the format PutSQL expects. For example, if you find that the alertDate value is stored in sql.args.1.value, you can update it with the following Expression Language:

${sql.args.1.value:toDate("yyyy-MM-dd HH:mm:ss"):format("yyyy-MM-dd HH:mm:ss.SSS")}

And the UpdateAttribute configuration will look like:

[screenshot SAMSAL_0-1701632318125.png: UpdateAttribute configuration]
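The toDate/format chain above does the equivalent of this Python sketch (illustrative only; the [:-3] trims Python's six microsecond digits down to the three-digit ".SSS"):

```python
from datetime import datetime

def reformat_timestamp(value: str) -> str:
    """Parse with the source pattern, re-emit with milliseconds appended,
    mirroring toDate(...):format(...) in the expression above."""
    parsed = datetime.strptime(value, "%Y-%m-%d %H:%M:%S")
    # %f yields six digits (microseconds); keep only three for .SSS
    return parsed.strftime("%Y-%m-%d %H:%M:%S.%f")[:-3]

print(reformat_timestamp("2023-12-02 10:54:38"))  # → 2023-12-02 10:54:38.000
```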

Alternatively, instead of updating the value, you can change the type in sql.args.N.type from timestamp (value 93) to varchar (value 12); that fixes the problem too. You can also supply a format via sql.args.N.format (not provided by default): set its value to "yyyy-MM-dd HH:mm:ss" and PutSQL will respect the provided format and use it.

 

2- UpdateRecord:

If you would rather not modify the generated attributes, or figure out which sql.args index the timestamp lands in (which becomes challenging with large data sets and many timestamp fields, where the index depends on a changing order), you can instead fix the source data before it gets converted to SQL. Use UpdateRecord: provide a JSON reader/writer and add a dynamic property whose name is the path to the target field and whose value sets it accordingly. In your case the UpdateRecord will look like this:

[screenshot SAMSAL_2-1701633132236.png: UpdateRecord configuration]

The Value field contains:

${field.value:toDate("yyyy-MM-dd HH:mm:ss"):format("yyyy-MM-dd HH:mm:ss.SSS")}

Note: field.value in the expression is provided by this processor to reference the original value at the given path. Please refer to the UpdateRecord documentation under Additional Details.

3- Jolt Transformation:

Since you asked about Jolt, you can use it as well to fix the data at the source. Jolt does not have date/time functions, so you simply use string concatenation to append a default value for the ".SSS" part, such as ".000". Here is the spec:

[
  {
    "operation": "modify-overwrite-beta",
    "spec": {
      "alertDate": "=concat(@(1,alertDate),'.000')"
    }
  }
]
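What the modify-overwrite-beta concat does to the record can be sketched in plain Python (illustrative only; Jolt applies this declaratively via the spec above):

```python
import json

# The example record from the question.
record = json.loads(
    '{"alertDate":"2023-12-03 11:01:17","title":"XXXX",'
    '"data":"XXXX, XXXX, XXXX","category":1}'
)

# Equivalent of: "alertDate": "=concat(@(1,alertDate),'.000')"
record["alertDate"] = record["alertDate"] + ".000"

print(record["alertDate"])  # → 2023-12-03 11:01:17.000
```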

If that helps, please accept the solution.

Thanks


Super Guru

@ariel12,

If I understood correctly, you need to check whether a record already exists in the DB, based on certain unique value(s), before inserting it, in order to prevent duplicates. If that is the case, you can use the LookupRecord processor, which can look up records from different sources (including a DB) and route the result to matched or unmatched, where you can decide the proper action.
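The routing idea can be sketched in a few lines of Python. This is a hypothetical illustration only (the field names and the in-memory set standing in for the DB are made up); in NiFi, LookupRecord does this through a configured lookup service:

```python
# Route each record based on whether its key already exists in the lookup
# source: found keys go to "matched", new ones to "unmatched".
def route_records(records, existing_alert_dates):
    matched, unmatched = [], []
    for record in records:
        if record["alertDate"] in existing_alert_dates:
            matched.append(record)      # already in the DB: skip the insert
        else:
            unmatched.append(record)    # new record: safe to insert
    return matched, unmatched

already_in_db = {"2023-12-03 11:01:17"}
incoming = [
    {"alertDate": "2023-12-03 11:01:17", "title": "XXXX"},
    {"alertDate": "2023-12-03 11:06:17", "title": "YYYY"},
]
matched, unmatched = route_records(incoming, already_in_db)
print(len(matched), len(unmatched))  # → 1 1
```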

For an example of how to set up LookupRecord and the lookup record service, you can watch the following video:

https://www.youtube.com/watch?v=17KG3kKQkOs

More info about LookupRecord:

https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.23.2/org.apach...

I hope that helps.

Thanks


5 REPLIES 5


Explorer

Thank you very much for your help and support, @SAMSAL. This definitely did the trick; it's working perfectly! (Now I have a new issue I hadn't thought about: since I am pulling the JSON file every X amount of time, the inserts into the DB contain the same values with the same time value. I need to find a way to add a filter or something to prevent multiple inserts of the same values.)

Thanks a lot!


Explorer

Hello @SAMSAL,

Sorry for the late response.

My issue is that the data in the JSON is always present; I need to insert into the DB only the data that was not inserted in the last run (the task runs every 5 seconds).

Right now, every 5 seconds the task inserts all the JSON data, so I end up with many rows containing the same values.

My end goal is that, every 5 seconds, only the data from the JSON that is not already in the DB gets inserted.

Sorry for the confusion.

Explorer

Thank you very much, @SAMSAL!