Member since: 09-25-2015
Posts: 112
Kudos Received: 37
Solutions: 1

My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 1992 | 12-21-2016 09:31 AM |
08-11-2016 01:13 PM
1 Kudo
It should be in nifi-app.log... in the code it does:

context.getBulletinRepository().addBulletin(bulletin);
logger.warn(message);

The logger is a standard SLF4J logger, which ends up being controlled by Logback via the logback.xml in the conf directory:

Logger logger = LoggerFactory.getLogger(MonitorDiskUsage.class);
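If you want to adjust how that class logs, you can add a logger entry in conf/logback.xml. A minimal sketch, assuming the stock layout where the APP_FILE appender writes to nifi-app.log; the fully qualified class name for MonitorDiskUsage is my assumption, so verify it for your NiFi version:

```xml
<!-- Sketch only: the package for MonitorDiskUsage is assumed here, and APP_FILE is
     the stock appender that writes to nifi-app.log. -->
<logger name="org.apache.nifi.controller.MonitorDiskUsage" level="INFO" additivity="false">
    <appender-ref ref="APP_FILE"/>
</logger>
```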
08-10-2016 01:19 PM
1 Kudo
In MergeContent there is a Delimiter Strategy; choose "Text", which means it uses the values typed into Header, Demarcator, and Footer. The Demarcator is what gets put between each FlowFile that is merged together. You can enter a new line with shift+enter.
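For illustration only, here is a small Java sketch of what that merge produces; the method and values are hypothetical, not part of the NiFi API:

```java
import java.util.List;

public class MergeContentSketch {
    // Mimics MergeContent's "Text" delimiter strategy: header, then each FlowFile's
    // content joined by the demarcator, then the footer.
    static String merge(String header, String demarcator, String footer, List<String> contents) {
        return header + String.join(demarcator, contents) + footer;
    }

    public static void main(String[] args) {
        // A newline demarcator (entered in the UI with shift+enter) puts each merged FlowFile on its own line.
        System.out.println(merge("", "\n", "", List.of("record one", "record two", "record three")));
    }
}
```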
07-26-2016 01:01 PM
1 Kudo
You can use the embedded ZooKeeper in the NiFi process to create a ZooKeeper cluster, or you can use an external ZooKeeper cluster (either the one you're using for Kafka, or a different one). ZooKeeper synchronises the state within its cluster.
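As a rough sketch of the embedded option (the hostnames below are placeholders; double-check the property names against your NiFi version):

```properties
# nifi.properties on each node - start the embedded ZooKeeper and point NiFi at the quorum
nifi.state.management.embedded.zookeeper.start=true
nifi.zookeeper.connect.string=nifi-node1:2181,nifi-node2:2181,nifi-node3:2181

# conf/zookeeper.properties - the same server list on every node (hostnames are placeholders)
server.1=nifi-node1:2888:3888
server.2=nifi-node2:2888:3888
server.3=nifi-node3:2888:3888
```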
07-26-2016 11:15 AM
@Simon Elliston Ball Does NiFi store offsets in memory as well? Just wondering how it will make sure not to read the same offsets or duplicate messages from Kafka? Thanks
07-15-2016 10:08 AM
@Matt Burgess I think you missed this bit, but just checking with you again: is this an effective solution for bulk-importing batch data from SQL Server or MongoDB to Amazon S3 using NiFi, while also keeping it in sync with updates and deletes? Is NiFi designed for this purpose? We are looking to run these sync updates or deletes as overnight jobs in NiFi. Please reply, thank you.
07-25-2016 03:08 PM
@Shishir Saxena Thanks for the answer. How is error handling built into NiFi flows? What are the content and FlowFile repositories, i.e. which data does NiFi store in these repositories? Thank you
07-12-2016 02:06 PM
3 Kudos
This example uses site-to-site to send logs from one instance to another using a push model: https://github.com/bbende/nifi-streaming-examples The first instance has a Remote Process Group pointing to the URL of the second instance. The second instance has an Input Port to receive logs, which the Remote Process Group from the first instance is connected to. You also need to set nifi.remote.input.socket.port and nifi.remote.input.secure in nifi.properties on the node where an incoming connection will be made. In this example that is the node with the Input Port, which will receive an incoming connection from the Remote Process Group on the first node.
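A minimal sketch of those two properties on the receiving node (the port number is a placeholder; set nifi.remote.input.secure=true if the instances are secured):

```properties
# nifi.properties on the node hosting the Input Port
nifi.remote.input.socket.port=10443
nifi.remote.input.secure=false
```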
07-07-2016 03:36 PM
1 Kudo
In EvaluateJsonPath, you can choose "flowfile-attribute" as the Destination; then the original JSON will still be in the FlowFile content, and any extracted JSON elements will be in the FlowFile's attributes. That can go into RouteOnAttribute for "eventname". Then you can use ReplaceText (or ExecuteScript if you prefer) to create a CQL statement, either using Expression Language to insert the values from your attributes, or wrapping the entire JSON object in a CQL statement. I have a template that uses ReplaceText to put an entire JSON object into an "INSERT INTO myTable JSON" CQL statement; it is available as a Gist (here). It doesn't have a PutCassandraQL processor at the end; instead it's a LogAttribute processor, so you can see if the CQL looks right for what you're trying to do.
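A rough sketch of that ReplaceText configuration (the keyspace and table names are placeholders, and the exact property names may vary slightly between NiFi versions):

```
Replacement Strategy:  Regex Replace
Evaluation Mode:       Entire text
Search Value:          (?s)(^.*$)
Replacement Value:     INSERT INTO mykeyspace.mytable JSON '$1'
```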
07-07-2016 01:47 PM
@Simon Elliston Ball PutCassandraQL doesn't have a built-in property to write CQL. So I used one of the example templates from GitHub which uses the ExecuteScript processor, but the values inserted through ExecuteScript are hardcoded, and I want to use my JSON values, i.e. insert values from my JSON event rather than hardcoded values. Any suggestions please? https://github.com/hortonworks-gallery/nifi-templates/blob/master/templates/CassandraProcessors.xml Currently I am using this flow to put JSON into Cassandra: GetKafka -> EvaluateJsonPath -> RouteOnAttribute -> ExecuteScript -> PutCassandraQL.
07-24-2018 04:11 PM
Ha, thanks. I was using my SplitRecord as a ConvertRecord by using all rows in the SQL response. This is a better solution than mine above.