Member since: 11-16-2015
Posts: 898
Kudos Received: 659
Solutions: 248

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 526 | 06-26-2025 01:21 PM
 | 388 | 06-19-2025 02:48 PM
 | 612 | 05-30-2025 01:53 PM
 | 9444 | 02-22-2024 12:38 PM
 | 2017 | 02-02-2023 07:07 AM
03-24-2020
05:23 AM
Hello @mburgess, I am able to extract the attributes from the XML using the GetFile > SplitXml > EvaluateXQuery processors. Now, can you please tell me how I can store these dynamic attributes in the PostgreSQL database using the PutSQL processor? Is this the correct expression to put in PutSQL? INSERT INTO alstom_radioscopy_amsterdam_blue."xml_log"(block_id,kp_begin,kp_end) VALUES (${block_id.text()},${kp_begin.text()},${kp_end.text()});
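For reference, here is a minimal sketch of the parameterized pattern I understand PutSQL supports; the attribute names block_id, kp_begin and kp_end and the JDBC type code are assumptions based on my flow:

ReplaceText (Replacement Strategy = Always Replace):
  Replacement Value = INSERT INTO alstom_radioscopy_amsterdam_blue."xml_log" (block_id, kp_begin, kp_end) VALUES (?, ?, ?)

UpdateAttribute:
  sql.args.1.type  = 12            (java.sql.Types.VARCHAR; adjust to the real column types)
  sql.args.1.value = ${block_id}
  sql.args.2.type  = 12
  sql.args.2.value = ${kp_begin}
  sql.args.3.type  = 12
  sql.args.3.value = ${kp_end}

PutSQL:
  JDBC Connection Pool = a DBCPConnectionPool pointing at the PostgreSQL database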
03-08-2020
09:45 AM
Hi @Kart,
As this is a thread which was marked 'Solved' over three years ago, you would have a better chance of receiving a resolution by posting a new question. This will also present you with the opportunity to include details specific to your environment that could aid other members in providing a more relevant answer to your question.
03-03-2020
01:25 PM
1 Kudo
@asfou NiFi does not contain any processors that support Hive version 2.x. The latest versions of Apache NiFi offer Hive 1.x and Hive 3.x client-based processor components. To support Hive 2.x, you may need to build your own custom processors using the Hive 2.x client. Matt
02-18-2020
08:03 AM
Hi All,
I am still facing the issue after adding the property below to both hive-site.xml and core-site.xml:
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
I am getting the following error:
org.apache.commons.dbcp.SQLNestedException: Cannot create JDBC driver of class 'org.apache.hive.jdbc.HiveDriver' for connect URL jdbc:hive2://ux329tas101.ux.hostname.net:10000/default;principal=<principal name>;ssl=true
Could you please help me with this?
Regards,
Swadesh Mondal
02-14-2020
05:55 AM
For small XML files, you could use ExtractText to get the entire content into an attribute, then UpdateRecord with an XMLReader, adding a property for your new field (let's say "/content") and whatever writer you wish. You will have to specify the output schema for the writer, to include all the fields parsed by the XMLReader in addition to a CLOB/BLOB/String "content" field. If you want to exclude fields from the XML then just exclude them from the output schema. Then you can use a similar Reader (AvroReader if you use an AvroRecordSetWriter, e.g.) in PutDatabaseRecord. If this doesn't work for your use case, you may need to use SplitXml and work on individual records. This will degrade the performance of PutDatabaseRecord unless you merge the records back together later (using MergeRecord for example).
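For illustration, a minimal output schema for the writer could look like the following; "id" and "name" are placeholders for whatever fields the XMLReader actually parses from your XML, and "content" is the extra String field that holds the whole document:

{
  "type": "record",
  "name": "XmlRecord",
  "fields": [
    { "name": "id",      "type": ["null", "string"], "default": null },
    { "name": "name",    "type": ["null", "string"], "default": null },
    { "name": "content", "type": ["null", "string"], "default": null }
  ]
}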
02-11-2020
06:00 AM
2 Kudos
That error indicates that your CSVReader is not treating the first line as a header. Even if you specify an explicit schema, if you don't set Treat First Line As Header to true, the reader will treat the header line as a data line, and when it tries to parse it as data (numbers, for example) it will fail.
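As a made-up illustration: suppose the explicit schema declares id as an int and name as a string, and the incoming file is

id,name
1,alice

With Treat First Line As Header left at false, the reader tries to parse the literal string "id" as an int and fails with exactly this kind of error; with it set to true, the first line is skipped and only "1,alice" is read as data.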
02-11-2020
04:07 AM
Hello @mburgess, suppose you have an input string but don't know what the value might be, e.g. it could be "8" or "8.4", and you want to convert it into a number. Is there a way to convert the "8" to the int 8 and the "8.4" to the float 8.4? Currently I am only able to convert both to ints (8 and 8) or both to floats (8.0 and 8.4). For context, I am using ValidateRecord to validate that the number is an int, so I do not want float values to pass validation. In other words, when an input string is converted to a number, I would like to know whether it is a decimal or an integer. Are you able to assist? Many thanks!
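For illustration, here is a minimal Groovy sketch (e.g. for ExecuteScript) of the distinction I am after; the attribute names number.in, number.out and number.type are made up for the example:

def flowFile = session.get()
if (!flowFile) return
def raw = flowFile.getAttribute('number.in')            // e.g. "8" or "8.4"
def isInt = raw ==~ /-?\d+/                              // true for "8", false for "8.4"
def converted = isInt ? (raw as Integer) : (raw as Float)
flowFile = session.putAttribute(flowFile, 'number.type', isInt ? 'int' : 'float')
flowFile = session.putAttribute(flowFile, 'number.out', converted.toString())
session.transfer(flowFile, REL_SUCCESS)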
02-04-2020
05:10 AM
Hi @mburgess, I tried the solution, but it is not working for other JSON paths such as $.key. I tried multiple JSON paths, and it seems to work only for $.cells.year. I followed the exact steps you provided. Please let me know your thoughts.
01-28-2020
02:49 AM
1 Kudo
@mburgess Matt, thank you so much for your quick response and help. It took me some time to figure out what you meant... but it works like a charm! If anyone can use the solution, here it is:
//======================================================================================================
// TEST java LocalDate.parse with groovy max-function
// FF-Attribute RESPONSE contains [{"id":"(1208)", "datbis":"20180219" }, { "id":"(1210)", "datbis":"20191231" }, { "id":"(1212)", "datbis":"20200128" }]
// FF-Attribute MAX_datbis returns 20200128
//======================================================================================================
import java.time.LocalDate

def flowFile = session.get()
if (!flowFile) return

try {
    // Parse the JSON array held in the RESPONSE attribute
    def objList = new groovy.json.JsonSlurper().parseText(flowFile.getAttribute('RESPONSE'))
    // Pick the entry with the latest 'datbis' date (Groovy's date/time extensions allow parse(text, pattern))
    def max = objList.max { LocalDate.parse(it.datbis, "yyyyMMdd") }
    flowFile = session.putAttribute(flowFile, 'MAX_datbis', max.datbis.toString())
    session.transfer(flowFile, REL_SUCCESS)
} catch (e) {
    log.error("Error while determining max", e)
    session.transfer(flowFile, REL_FAILURE)
}