Member since: 05-31-2018
Posts: 31
Kudos Received: 2
Solutions: 0
10-02-2018
07:47 AM
Hi, I am using the RouteOnAttribute processor and I defined a property whose value contains multiple AND conditions, like the following:
Property Name: ROUTE1
Property Value: ${text.tag.type:equals('XXX'):and(${txt:contains('AAA')}):and(${txt:contains('BBB')})}
The issue I am facing is that after generating a flow file with attributes text.tag.type = XXX and txt = AAA,BBB, it always routes to unmatched, while it should route to ROUTE1.
Thanks ...
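One way to narrow this down (a debugging sketch only, assuming the expression itself is correct and the attribute values may carry stray whitespace; the check.* property names are hypothetical) is to add one RouteOnAttribute property per sub-condition alongside the combined one, and to trim the attribute before comparing:
check.type: ${text.tag.type:trim():equals('XXX')}
check.aaa: ${txt:contains('AAA')}
check.bbb: ${txt:contains('BBB')}
ROUTE1: ${text.tag.type:trim():equals('XXX'):and(${txt:contains('AAA')}):and(${txt:contains('BBB')})}
Whichever single-condition route the flow file fails to match points at the attribute (or sub-expression) that is not evaluating the way you expect.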
Labels:
- Apache NiFi
10-01-2018
05:07 PM
Hi, I am using the RouteOnAttribute processor and I defined a property whose value contains multiple AND conditions, like the following:
Property Name: ROUTE1
Property Value: ${text.tag.type:equals('XXX'):and(${txt:contains('AAA')}):and(${txt:contains('BBB')})}
The issue I am facing is that after generating a flow file with attributes text.tag.type = XXX and txt = AAA,BBB, it always routes to unmatched, while it should route to ROUTE1.
Thanks ...
Labels:
- Apache NiFi
09-30-2018
08:47 AM
This has been solved by creating multiple input ports inside the NiFi process group.
09-30-2018
08:24 AM
Hi, I moved part of my NiFi flow into a process group. The input to the process group is 3 relationships that all go to the input port of the process group. How can I get these 3 relationships separately inside my process group? Thanks.
Labels:
- Apache NiFi
09-25-2018
08:45 AM
Hi, I need to apply a regular expression to a flow file attribute, not to the flow file content, using the ExtractText processor. How can I achieve this? Thanks.
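ExtractText only evaluates the flow file content, so one common alternative (a sketch, not necessarily the solution used here; the attribute names my.attr and extracted.part are hypothetical) is UpdateAttribute with the Expression Language regex functions applied to the attribute:
extracted.part: ${my.attr:replaceAll('^.*?(\d+).*$', '$1')}
has.match: ${my.attr:find('pattern-to-look-for')}
replaceAll() applies a Java regular expression with capture-group replacement, and find() returns true/false depending on whether the regex matches part of the attribute value.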
Labels:
- Apache NiFi
09-18-2018
07:38 PM
2 Kudos
Hi, I am using the RouteOnAttribute processor and I defined a property whose value contains multiple OR and AND conditions, like the following:
Property Name: ROUTE1
Property Value: ${text.tag.type:equals('XXX'):and(${txt:contains('YES')}):or(${text.tag.type:equals('YYY')}):and(${txt:contains('NO')})}
The issue I am facing is that after generating a flow file with attributes text.tag.type = XXX and txt = YES CORRECT, it always routes to unmatched, while it should route to ROUTE1.
Thanks ...
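A plausible explanation (offered as an assumption, not a confirmed diagnosis): Expression Language boolean functions chain left to right with no AND-over-OR precedence, so the value above is evaluated as ((type equals XXX and txt contains YES) or type equals YYY) and txt contains NO, and the trailing and() fails for txt = YES CORRECT. If the intent is (XXX and YES) or (YYY and NO), the second pair can be nested inside the or() call, for example:
ROUTE1: ${text.tag.type:equals('XXX'):and(${txt:contains('YES')}):or(${text.tag.type:equals('YYY'):and(${txt:contains('NO')})})}
This is a sketch of the grouping only; the attribute names and literals are taken from the question.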
Labels:
- Apache NiFi
09-10-2018
04:43 PM
Hi, I used SAM to build a streaming application. The source is a Kafka topic and the destination (sink) is another Kafka topic, and in between I used a Rule to filter specific events. I also created Avro schemas using the schema manager (two schemas, one for the input stream and one for the output stream) and named them after the Kafka topics. Then I published Avro messages to the source Kafka topic.
After running the application, no messages are inserted into the sink Kafka topic, CPU usage is very high, and an error is shown on the streaming application box as shown in the pictures. I don't know how to investigate these errors (nothing is shown in streamline.log).
In the Storm UI, this error is shown for the spout:
Id: 3-KAFKA, Executors: 1, Tasks: 1, Emitted: 0, Transferred: 0, Complete latency (ms): 0.000, Acked: 0, Failed: 0
Error Host: sandbox-hdf.hortonworks.com, Error Port: 6700, Error Time: Mon, 10 Sep 2018 19:54:18
Last error (full stack trace, 2018-09-10T16:54:28.000Z):
com.hortonworks.registries.schemaregistry.exceptions.RegistryException: java.lang.RuntimeException: com.google.common.util.concurrent.UncheckedExecutionException: javax.ws.rs.NotFoundException: HTTP 404 Not Found
    at com.hortonworks.registries.schemaregistry.serde.AbstractSnapshotDeserializer.deserialize(AbstractSnapshotDeserializer.java:156)
    at com.hortonworks.streamline.streams.runtime.storm.spout.AvroKafkaSpoutTranslator.apply(AvroKafkaSpoutTranslator.java:61)
    at org.apache.storm.kafka.spout.KafkaSpout.emitOrRetryTuple(KafkaSpout.java:506)
    at org.apache.storm.kafka.spout.KafkaSpout.emitIfWaitingNotEmitted(KafkaSpout.java:474)
    at org.apache.storm.kafka.spout.KafkaSpout.nextTuple(KafkaSpout.java:341)
    at org.apache.storm.daemon.executor$fn__9551$fn__9566$fn__9599.invoke(executor.clj:660)
    at org.apache.storm.util$async_loop$fn__555.invoke(util.clj:484)
    at clojure.lang.AFn.run(AFn.java:22)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: com.google.common.util.concurrent.UncheckedExecutionException: javax.ws.rs.NotFoundException: HTTP 404 Not Found
    at com.hortonworks.registries.schemaregistry.client.SchemaRegistryClient.getSchemaVersionInfo(SchemaRegistryClient.java:616)
    at com.hortonworks.registries.schemaregistry.serde.AbstractSnapshotDeserializer.deserialize(AbstractSnapshotDeserializer.java:153)
    ... 8 more
Caused by: com.google.common.util.concurrent.UncheckedExecutionException: javax.ws.rs.NotFoundException: HTTP 404 Not Found
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2203)
    at com.google.common.cache.LocalCache.get(LocalCache.java:3937)
    at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3941)
    at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4824)
    at com.hortonworks.registries.schemaregistry.cache.SchemaVersionInfoCache.getSchema(SchemaVersionInfoCache.java:103)
    at com.hortonworks.registries.schemaregistry.client.SchemaRegistryClient.getSchemaVersionInfo(SchemaRegistryClient.java:612)
    ... 9 more
Caused by: javax.ws.rs.NotFoundException: HTTP 404 Not Found
    at org.glassfish.jersey.client.JerseyInvocation.convertToException(JerseyInvocation.java:1008)
    at org.glassfish.jersey.client.JerseyInvocation.translate(JerseyInvocation.java:816)
    at org.glassfish.jersey.client.JerseyInvocation.access$700(JerseyInvocation.java:92)
    at org.glassfish.jersey.client.JerseyInvocation$2.call(JerseyInvocation.java:700)
    at org.glassfish.jersey.internal.Errors.process(Errors.java:315)
    at org.glassfish.jersey.internal.Errors.process(Errors.java:297)
    at org.glassfish.jersey.internal.Errors.process(Errors.java:228)
    at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:444)
    at org.glassfish.jersey.client.JerseyInvocation.invoke(JerseyInvocation.java:696)
    at org.glassfish.jersey.client.JerseyInvocation$Builder.method(JerseyInvocation.java:420)
    at org.glassfish.jersey.client.JerseyInvocation$Builder.get(JerseyInvocation.java:316)
    at com.hortonworks.registries.schemaregistry.client.SchemaRegistryClient$16.run(SchemaRegistryClient.java:1083)
    at com.hortonworks.registries.schemaregistry.client.SchemaRegistryClient$16.run(SchemaRegistryClient.java:1080)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:360)
    at com.hortonworks.registries.schemaregistry.client.SchemaRegistryClient.getEntity(SchemaRegistryClient.java:1080)
    at com.hortonworks.registries.schemaregistry.client.SchemaRegistryClient.doGetSchemaVersionInfo(SchemaRegistryClient.java:639)
    at com.hortonworks.registries.schemaregistry.client.SchemaRegistryClient.access$100(SchemaRegistryClient.java:141)
    at com.hortonworks.registries.schemaregistry.client.SchemaRegistryClient$1.retrieveSchemaVersion(SchemaRegistryClient.java:243)
    at com.hortonworks.registries.schemaregistry.cache.SchemaVersionInfoCache$1.load(SchemaVersionInfoCache.java:71)
    at com.hortonworks.registries.schemaregistry.cache.SchemaVersionInfoCache$1.load(SchemaVersionInfoCache.java:63)
    at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527)
    at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319)
    at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282)
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197)
    ... 14 more
Thanks.
Labels:
- Apache Kafka
- Cloudera DataFlow (CDF)
08-10-2018
09:28 PM
Hi @Abdelkrim Hadjidj In our NiFi flow we need to store some flow files that represent incomplete messages in a Kafka topic. After that we need to query the Kafka topic from the NiFi flow to check whether a message has been completed or not; if it has, we continue the NiFi flow. So we need to use KSQL to query the Kafka topic for the completed messages and process them. Your feedback is highly appreciated.
08-10-2018
04:21 PM
Hi @Abdelkrim Hadjidj So for KSQL, should we implement a custom NiFi processor? Thanks,
08-10-2018
04:18 PM
Hi, We need to execute KSQL queries to filter our Kafka streams within NiFi. How can we achieve this? Should we implement a custom processor? Regards,
Labels:
- Apache Kafka
- Apache NiFi
08-10-2018
04:15 PM
Hi, We need to execute KSQL queries to filter our Kafka streams within NiFi. How can we achieve this? Should we implement a custom processor? Regards,
Labels:
- Apache Kafka
- Apache NiFi
08-05-2018
03:41 PM
Hi, I need to parse binary ASN.1 BER data. Please suggest how to do this in Apache NiFi. Regards,
Labels:
- Apache NiFi
07-17-2018
09:03 PM
Hi, After reading some articles I got confused: what is the difference between ZooKeeper and the NCM (NiFi Cluster Manager)? Which should we use for creating a NiFi cluster? Thanks.
Tags:
- ncm
Labels:
- Apache NiFi
06-28-2018
10:50 PM
Hi, We prepared 3 VMs with 16 GB RAM each (Ubuntu 16, 64-bit) and we need to deploy the HDF 3.1.1 Docker image to them to test an HDF cluster. Questions:
1. Should we install Docker on all three VMs, deploy the HDF 3.1.1 Docker image to each of them, and then create the NiFi cluster through Ambari?
2. Do we need to make one VM the master HDF node and the other two VMs slave nodes for the HDF cluster?
3. Can we install MySQL Cluster on 2 of those 3 VMs to store our flow data?
4. For production deployment, can we rely on the HDF Docker image, or do we need to deploy HDF from scratch by installing the HDF packages?
Thanks. Regards,
Labels:
- Cloudera DataFlow (CDF)
06-27-2018
02:18 PM
Hi, We are planning to deploy the HDF 3.1.0 sandbox to two VMs to have a clustered NiFi, and we have the following questions:
1. Does the HDF sandbox contain MySQL as part of the sandbox?
2. Can we have our data node (MySQL DB) inside the same VM that hosts HDF?
3. Can the HDF Docker image be deployed to clustered Linux VMs?
4. If we deploy HDF to two VMs and manage to create a NiFi cluster, can we have a MySQL cluster in this case?
5. What are the recommended VM specs in terms of RAM and CPU?
Thanks. Regards,
Labels:
- Cloudera DataFlow (CDF)
06-13-2018
08:44 PM
@Bryan Bende Thanks for your support. Yes, it is due to the back pressure.
06-13-2018
06:08 PM
Hi, I have the following data flow, and the WW processor is not pulling data from the queue. How can I force the WW processor to pull flow files from the queue between it and its predecessor (the XX processor)? (See attached image: optimized-nifi8.jpg.) Thanks,
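Given the later reply that this turned out to be back pressure, one thing worth checking (a sketch of the relevant connection settings, not a confirmed diagnosis of this particular flow) is the queue configuration on the connection leaving the WW processor; once that downstream queue reaches either threshold, NiFi stops scheduling WW, which looks like WW no longer pulling from its input queue:
Connection settings on WW's outgoing connection:
  Back Pressure Object Threshold: 10000 (the default)
  Back Pressure Data Size Threshold: 1 GB (the default)
Draining the downstream queue, or temporarily raising these thresholds, lets the scheduler run WW again.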
Labels:
- Cloudera DataFlow (CDF)
06-11-2018
06:48 PM
Thanks, it works perfectly.
06-11-2018
05:28 PM
Thanks, it works.
06-11-2018
05:27 PM
Hi, How can I move flow files from one queue/relationship to another through the NiFi web interface? Thanks,
Labels:
- Cloudera DataFlow (CDF)
06-10-2018
05:55 PM
Hi, I have 100K flow files generated by a custom processor and I need to store them in a MySQL DB. I need to process the 100K flow files with multiple ConvertJSONToSQL processors concurrently to speed up the insertion. Which processor should I use between the custom processor and the ConvertJSONToSQL processors (4 of them) in order to achieve that? Thanks,
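One common way to fan flow files out across several identical processors (a sketch of the general pattern, not necessarily the accepted answer here) is a DistributeLoad processor between the custom processor and the four ConvertJSONToSQL instances:
DistributeLoad properties:
  Number of Relationships: 4
  Distribution Strategy: round robin
Relationships 1 through 4 are then each connected to one ConvertJSONToSQL processor. A simpler alternative is a single ConvertJSONToSQL with Concurrent Tasks raised on its Scheduling tab.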
Labels:
- Cloudera DataFlow (CDF)
06-04-2018
07:49 AM
Excellent, it works. Thanks, Mr. Jay.
05-31-2018
02:09 PM
Hi, I downloaded HDF 3.1 and I get a NiFi java.lang.OutOfMemoryError: Java heap space after running my custom processor. I changed the default JVM memory in bootstrap.conf from -Xms512m -Xmx512m to -Xms1g -Xmx1g and restarted NiFi to pick up the new values, but when I check bootstrap.conf after restarting NiFi, the values are reset to -Xms512m -Xmx512m. Why does it reset to the old values instead of keeping the new ones? Your help is highly appreciated.
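A likely explanation (an assumption based on how Ambari-managed HDF behaves, not something confirmed in this thread) is that Ambari regenerates bootstrap.conf from its stored configuration on every restart, so edits made directly to the file on disk are overwritten. In that case the heap sizes need to be changed in Ambari (NiFi service > Configs, in the bootstrap/memory settings) so that the generated file ends up containing, for example:
# JVM heap settings as they appear in a standard NiFi bootstrap.conf
java.arg.2=-Xms1g
java.arg.3=-Xmx1g
After saving the change in Ambari and restarting NiFi from Ambari, the regenerated bootstrap.conf should keep the new values.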
Tags:
- hdf-3.1
Labels:
- Cloudera DataFlow (CDF)