Member since
10-30-2017
32
Posts
5
Kudos Received
0
Solutions
03-01-2019
09:57 PM
Hi, I have incoming flowfiles with JSON content containing multiple event types. I need to filter only certain events based on two of the JSON key/values.

Sample input data:

{"eventType":"Mobile","timestamp":1551280374552,"name":"ERROR"}
{"eventType":"Mobile","timestamp":1551280374552,"name":"APP_START"}
{"eventType":"Immobile","timestamp":1551280374552,"name":"ERROR"}
{"eventType":"Immobile","timestamp":1551280374552,"name":"ERROR"}
{"eventType":"Mobile","timestamp":1551280374552,"name":"PLAYBACK_ERROR"}
{"eventType":"Mobile","timestamp":1551280374552,"name":"Other"}
{"eventType":"MobileCrash","timestamp":1551280374552,"name":"ERROR"}

Filter condition to be applied: keep flowfile content where eventType = "Mobile" and name is one of (ERROR, APP_START, PLAYBACK_ERROR), or eventType = "MobileCrash".

Also, after the filter I have to rename the JSON keys as follows:
eventType = vendorEventType
timestamp = currentTimestamp
name = someName

Desired output:

{"vendorEventType":"Mobile","currentTimestamp":1551280374552,"someName":"ERROR"}
{"vendorEventType":"Mobile","currentTimestamp":1551280374552,"someName":"APP_START"}
{"vendorEventType":"Mobile","currentTimestamp":1551280374552,"someName":"PLAYBACK_ERROR"}
{"vendorEventType":"MobileCrash","currentTimestamp":1551280374552,"someName":"ERROR"}

Please help.
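In NiFi this kind of filter-and-rename is typically handled by a QueryRecord or JoltTransformJSON processor; as a plain sketch of the intended logic outside NiFi (all names here are illustrative, not part of any NiFi API):

```python
import json

# Key renaming required after the filter (from the post).
KEY_MAP = {"eventType": "vendorEventType",
           "timestamp": "currentTimestamp",
           "name": "someName"}
# name values to keep when eventType is "Mobile".
MOBILE_NAMES = {"ERROR", "APP_START", "PLAYBACK_ERROR"}

def keep(event):
    """Filter condition: Mobile with an allowed name, or any MobileCrash."""
    return (event.get("eventType") == "Mobile"
            and event.get("name") in MOBILE_NAMES) \
        or event.get("eventType") == "MobileCrash"

def rename(event):
    """Rename keys per the required mapping; unknown keys pass through."""
    return {KEY_MAP.get(k, k): v for k, v in event.items()}

lines = [
    '{"eventType":"Mobile","timestamp":1551280374552,"name":"ERROR"}',
    '{"eventType":"Immobile","timestamp":1551280374552,"name":"ERROR"}',
    '{"eventType":"MobileCrash","timestamp":1551280374552,"name":"ERROR"}',
]
out = [rename(e) for e in map(json.loads, lines) if keep(e)]
for event in out:
    print(json.dumps(event))
```

The Immobile record is dropped; the two surviving records come out with the renamed keys.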
Labels:
- Apache NiFi
02-19-2019
06:05 PM
Thanks @Matt Clarke. I went with option 2 and it worked. Thanks again for the quick reply.
02-19-2019
07:44 AM
Hi, I am trying to replace all characters after ";" (including the ";" itself) in the flowfile record. I was using the ReplaceText processor with ;.*$ as the Search Value and an empty Replacement Value, but it is not working out. Any inputs in this regard would be really helpful. Example input: Replace;text;processor. Expected output: Replace. Thanks, Bala.
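The regex itself does what the post intends, as a quick check outside NiFi shows (in ReplaceText, also make sure Replacement Strategy is "Regex Replace"; with Evaluation Mode set to "Entire text", `$` and `.` behave per-input rather than per-line, so "Line-by-Line" is usually the safer choice for multi-line content):

```python
import re

# The ReplaceText search pattern from the post: drop the first ";" and
# everything after it, replacing with an empty string.
pattern = r";.*$"
result = re.sub(pattern, "", "Replace;text;processor")
print(result)  # Replace
```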
Labels:
- Apache NiFi
04-16-2018
05:45 PM
@suresh g /usr/bin/kafka-consumer-groups --zookeeper zk01.example.com:2181 --describe --group <<consumer-group-name>>. This will list the consumers, partitions, and offsets for the given consumer group.
02-18-2018
09:51 PM
@Andrew Lim Below are the lines from the nifi-registry-app.log:

2018-02-18 16:46:01,149 INFO [NiFi Registry Web Server-36] o.a.n.r.w.m.ResourceNotFoundExceptionMapper org.apache.nifi.registry.exception.ResourceNotFoundException: No policy found for action='read', resource='/buckets/1f76510d-c48e-4ee6-b883-7edfcfe57e40'. Returning Not Found response.
2018-02-18 16:46:01,239 INFO [NiFi Registry Web Server-30] o.a.n.r.w.m.ResourceNotFoundExceptionMapper org.apache.nifi.registry.exception.ResourceNotFoundException: No policy found for action='write', resource='/buckets/1f76510d-c48e-4ee6-b883-7edfcfe57e40'. Returning Not Found response.
2018-02-18 16:46:01,331 INFO [NiFi Registry Web Server-15] o.a.n.r.w.m.ResourceNotFoundExceptionMapper org.apache.nifi.registry.exception.ResourceNotFoundException: No policy found for action='delete', resource='/buckets/1f76510d-c48e-4ee6-b883-7edfcfe57e40'. Returning Not Found response.

Is your NiFi Registry secured? Yes.
Do you have buckets in your Registry? Yes, I have created one bucket.
Does your user have privilege to access the buckets in your Registry? Yes, I have assigned a new policy for the bucket I created.

A couple of questions:
1) Where do you run your nifi-registry service? On the Ambari server or on the NiFi nodes?
2) Do we need to specify a Node Identity while configuring nifi-registry?
02-17-2018
12:16 AM
I am using HDF 3.1.0 with NiFi and NiFi Registry. When I try to use "Start version control" from NiFi processors, I end up getting: Unable to obtain listing of buckets: org.apache.nifi.registry.client.NiFiRegistryException:
Labels:
- Apache NiFi
- Cloudera DataFlow (CDF)
12-22-2017
09:31 PM
1 Kudo
Hi, now it is working fine. The issue was with the configuration of the EvaluateJsonPath processor: I missed a "." in the attribute to fetch. It should be "$.extract_date" instead of "$extract_date". I have accepted the answer. Thanks a lot for your help!
12-22-2017
08:47 PM
1 Kudo
@Shu, my bad. I missed configuring EvaluateJsonPath. Here is my EvaluateJsonPath processor config and workflow. However, while configuring the EvaluateJsonPath processor, I end up getting the error message given below. I hope my workflow sequencing is correct; please correct me if I am wrong.
12-22-2017
07:33 PM
@Shu, thanks for your quick reply. After I configured the above-mentioned processor [UpdateAttribute] exactly as directed, I end up with the directory structure /folder/year=/month=/day=/hour=/677944128880138. The year, month, day, and hour folders do not have the proper values populated. It should be created as /folder/year=2017/month=12/day=22/hour=19/(unknown). Note: the current run has "extract_date" as "2017-12-22 19:16:17.0". UpdateAttribute processor configuration:
12-22-2017
06:52 PM
Hi, I am ingesting data from a MySQL DB using ExecuteSQL --> ConvertAvroToJSON --> PublishKafka_0_10. The result has selected columns from the table plus an extract_date, e.g. {"col1": val1, "col2": "val2", "col3": "val3", "col4": val4, "extract_date": "2017-12-21 00:17:10.0"}, and is stored in a Kafka topic. I then have another workflow to consume from the Kafka topic and write into an HDFS folder [ConsumeKafka_0_10 --> PutHDFS]. My requirement: while consuming messages from the Kafka topic, use the hour value from the extract_date field and push each message to the corresponding hour folder in HDFS. For example, if the "extract_date" field has the value "2017-12-21 00:17:10.0", the message should be written into HDFS under /folder/year=2017/month=12/day=21/hour=00/(unknown). Is there a way to achieve this use case? Thanks, Bala
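In NiFi this is usually done by extracting extract_date into a flowfile attribute (EvaluateJsonPath) and building the PutHDFS Directory with expression language; as a plain sketch of the date-to-path logic itself (the function name and base path are illustrative):

```python
from datetime import datetime

def hdfs_dir(extract_date, base="/folder"):
    # extract_date format from the post, e.g. "2017-12-21 00:17:10.0";
    # drop the fractional-seconds suffix before parsing.
    ts = datetime.strptime(extract_date.split(".")[0], "%Y-%m-%d %H:%M:%S")
    # Build the Hive-style partition path with zero-padded fields.
    return f"{base}/year={ts:%Y}/month={ts:%m}/day={ts:%d}/hour={ts:%H}"

print(hdfs_dir("2017-12-21 00:17:10.0"))
# /folder/year=2017/month=12/day=21/hour=00
```

The same zero-padding matters in NiFi expression language too, so that hour 0 becomes "00" rather than "0".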
Labels:
- Apache NiFi