Member since: 06-08-2017
Posts: 1049
Kudos Received: 518
Solutions: 312
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 11124 | 04-15-2020 05:01 PM |
|  | 7020 | 10-15-2019 08:12 PM |
|  | 3066 | 10-12-2019 08:29 PM |
|  | 11247 | 09-21-2019 10:04 AM |
|  | 4190 | 09-19-2019 07:11 AM |
11-27-2018
03:37 AM
@Julio Gazeta I don't think NiFi will keep the reference once we clear the state on the processor. Keep only one file in your "d:\\tmp\\input" directory and clear all state on the ListFile processor, then: 1. start the processor once and stop it, and 2. start the ListFile processor again; it will list the file from the directory. - If the answer helped resolve your issue, click the Accept button below to accept it; that would be a great help to community users looking for a quick solution to this kind of issue.
11-19-2018
09:27 PM
@Jacob Paul I believe your flowfiles have a source-date1 attribute with the value 20181119112100. Change your UpdateAttribute property values to source-date as ${source-date1:substring(0,8)} and source-time as ${source-date1:substring(8,14)}. UpdateAttribute then adds these attributes to every flowfile leaving the processor. In addition, you can perform the same kind of operation without extracting attributes by using a QueryRecord processor: configure/enable Record Reader/Writer controller services and use Apache Calcite's SUBSTRING function to create source-date and source-time columns in the flowfile.
11-20-2018
06:54 AM
@Shu Thanks for your answer! Using NiFi 1.6.0 --- I must have been blind - indeed I hadn't seen this search option, sorry. Bye!
11-15-2018
03:00 AM
@Julio Gazeta I think this thread is also hitting the same issue: maximum back pressure on the queue. The same fix as described here: https://community.hortonworks.com/questions/227489/apache-nifi-distribution-trouble-in-cluster-spark.html will be applicable for this thread as well.
11-12-2018
11:52 PM
@Varun Yadav I don't think we can upload multiple templates at one time, but you can keep all the templates in one folder, read the filenames, and pass each filename (in a loop) to a curl API call to upload the templates onto the NiFi canvas, as sketched below.
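A minimal shell sketch of that loop, assuming the templates live in /tmp/templates, NiFi listens on http://localhost:8080, and the templates should land in the root process group (all three values are placeholders for this example); the upload endpoint takes the template XML as a multipart field named template:

NIFI_URL="http://localhost:8080"   # placeholder NiFi base URL
PG_ID="root"                       # placeholder target process group id
TEMPLATE_DIR="/tmp/templates"      # placeholder folder holding the template XML files

# upload every template file in the folder, one curl call per file
for template in "$TEMPLATE_DIR"/*.xml; do
  echo "Uploading $template"
  curl -s -F "template=@${template}" \
       "${NIFI_URL}/nifi-api/process-groups/${PG_ID}/templates/upload"
done

On a secured cluster you would also have to pass authentication (for example a bearer token or a client certificate) with each curl call.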
11-10-2018
08:35 PM
I have done this by using the ExecuteScript processor. First I have taken the data model:
{ "EntityType": { "Type": "string", "Value": "" }, "EntityName": { "Type": "string", "Value": "" }, "EntityId": { "Type": "string", "Value": "" } }
Inside the ExecuteScript processor I have written the logic for binding our data {"EntityType":"person","EntityName":"Ankit","EntityId":"11"} into this data model. I am using ECMAScript for this task.
Script:
// read the incoming JSON, merge its values into the data model held in the
// "Datamodel" attribute, and write the merged model back as the new content
flowFile = session.get();
if (flowFile != null) {
    var StreamCallback = Java.type("org.apache.nifi.processor.io.StreamCallback")
    var IOUtils = Java.type("org.apache.commons.io.IOUtils")
    var StandardCharsets = Java.type("java.nio.charset.StandardCharsets")
    flowFile = session.write(flowFile, new StreamCallback(function(inputStream, outputStream) {
        var text = IOUtils.toString(inputStream, StandardCharsets.UTF_8)
        var Productiondata = JSON.parse(text) // this is the data, as key/value pairs
        var Datamodel = flowFile.getAttribute("Datamodel") // taking the data model from the Datamodel attribute
        var defaultJsonStrObj = JSON.parse(Datamodel)
        for (var defaultData in defaultJsonStrObj) { // iterate over the model and bind the values
            defaultJsonStrObj[defaultData]["Value"] = Productiondata[defaultData]
        }
        outputStream.write(JSON.stringify(defaultJsonStrObj, null, '\t').getBytes(StandardCharsets.UTF_8))
    }))
    flowFile = session.putAttribute(flowFile, "filename", flowFile.getAttribute('filename').split('.')[0] + '_translated.json')
    session.transfer(flowFile, REL_SUCCESS)
}
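With the example data above, the script should fill in the model's Value fields and write content like { "EntityType": { "Type": "string", "Value": "person" }, "EntityName": { "Type": "string", "Value": "Ankit" }, "EntityId": { "Type": "string", "Value": "11" } }, and the flowfile is renamed to its original name (up to the first dot) plus a _translated.json suffix.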
12-08-2018
09:18 PM
Thanks for your help, I got it working.
10-24-2018
02:20 PM
Thank you @Shu. Please find enclosed the screenshot of the two proposals... but the result is the same...
05-28-2019
03:40 PM
I have the same problem. I set the permission to 777 for all users.
[nifi@hdp-srv2 ~]$ hdfs dfs -ls /warehouse/tablespace/managed/hive/
Found 3 items
drwxrwxrwx+ - hive hadoop 0 2019-05-27 14:53 /warehouse/tablespace/managed/hive/information_schema.db
drwxrwxrwx+ - hive hadoop 0 2019-05-28 13:45 /warehouse/tablespace/managed/hive/sensor_data
drwxrwxrwx+ - hive hadoop 0 2019-05-27 14:53 /warehouse/tablespace/managed/hive/sys.db
The error still happens:
Caused by: org.apache.hadoop.hive.metastore.api.MetaException: java.security.AccessControlException: Permission denied: user=nifi, access=READ, inode="/warehouse/tablespace/managed/hive/sensor_data":hive:hadoop:drwxrwxrwx
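Since the directory listing above shows a '+' after the permission bits (extended ACL entries are set), one thing worth checking is the effective ACL on the table directory; a minimal check, assuming the same path as above:

# the '+' in the ls output means HDFS ACL entries exist, so the plain 777
# mode bits may not be what actually applies to the nifi user
hdfs dfs -getfacl /warehouse/tablespace/managed/hive/sensor_data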