Member since: 01-11-2016
Posts: 355
Kudos Received: 230
Solutions: 74
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 8292 | 06-19-2018 08:52 AM
 | 3215 | 06-13-2018 07:54 AM
 | 3668 | 06-02-2018 06:27 PM
 | 3963 | 05-01-2018 12:28 PM
 | 5506 | 04-24-2018 11:38 AM
10-15-2017
08:46 AM
1 Kudo
@Dhamotharan P It looks like your flow is never triggered. InvokeHTTP is waiting for an incoming flow file that will be added to the body of your HTTP call. Add a GenerateFlowFile before your InvokeHTTP just to trigger it; you should then see flow files going to the failure relationship. Can you test that?
10-14-2017
07:40 PM
2 Kudos
@Data Addicts You can use the UpdateCounter processor, or the state feature of the UpdateAttribute processor, to do this: https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.4.0/org.apache.nifi.processors.standard.UpdateCounter/index.html https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-update-attribute-nar/1.4.0/org.apache.nifi.processors.attributes.UpdateAttribute/index.html
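A minimal sketch of the UpdateCounter configuration this refers to (the counter name here is illustrative; Counter Name and Delta are the processor's two properties):

```
UpdateCounter
  Counter Name : processed-files
  Delta        : 1
```

The counter value then shows up under the NiFi "Counters" menu, incremented once per flow file.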
10-14-2017
02:26 PM
@suresh krish What do you mean by handling clear-text passwords? If you want to protect passwords in configuration files rather than leaving them in clear text on disk, you can use the Credential Provider API: https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/CredentialProviderAPI.html You can create a JCEKS keystore, store your passwords in it, and reference them in your configuration files. The keystore can be stored on HDFS so it is accessible from all nodes, or kept locally (to avoid circular dependencies). Is this what you are looking for?
10-13-2017
12:03 PM
@xav webmaster Can you test RouteOnContent with a dynamic property configured to match "Allow"? This routes flow files based on whether the content contains "Allow" or not. You will have two relationships: Allow and unmatched.
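This is not NiFi itself, but a sketch of what RouteOnContent does with a dynamic property named Allow whose value is the regex "Allow" (the log lines are made up):

```python
import re

def route_on_content(content, routes):
    """Return the relationship names whose regex matches the content,
    or ["unmatched"] when nothing matches - mirroring RouteOnContent's
    "Route to each matching Property Name" behaviour."""
    matched = [name for name, pattern in routes.items()
               if re.search(pattern, content)]
    return matched if matched else ["unmatched"]

routes = {"Allow": r"Allow"}
print(route_on_content("2017-10-13 Allow 10.0.0.1", routes))  # ['Allow']
print(route_on_content("2017-10-13 Deny 10.0.0.2", routes))   # ['unmatched']
```

Each dynamic property you add to RouteOnContent becomes its own relationship, just like the dictionary keys here.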
10-13-2017
11:13 AM
@xav webmaster You need to generate flow file attributes based on the content of your data. These attributes will then be used for routing in RouteOnAttribute. To build these attributes you can use UpdateAttribute/ExtractText with the Expression Language. If you just want to check whether the line contains Allow or Deny, you can use contains: https://nifi.apache.org/docs/nifi-docs/html/expression-language-guide.html#contains If you want to check a particular field, you need to parse the line. There are several ways to do this; for instance, you can use the getDelimitedField function since it's CSV. Also look at the record-based processors. They bring a lot of optimizations that can help you. https://community.hortonworks.com/articles/102183/record-based-processors-in-apache-nifi-12.html https://blogs.apache.org/nifi/entry/record-oriented-data-with-nifi
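A sketch of the two checks above, assuming a comma-delimited log line where the action ("Allow"/"Deny") is the third field; this mirrors what the contains and getDelimitedField Expression Language functions would return (note NiFi's getDelimitedField indexes fields from 1):

```python
def get_delimited_field(line, index, delimiter=","):
    """Return the index-th delimited field, 1-based like NiFi's
    getDelimitedField EL function."""
    return line.split(delimiter)[index - 1]

line = "2017-10-13,10.0.0.1,Allow,443"

# Simple contains-style check, like ${line:contains('Allow')}:
print("Allow" in line)                 # True

# Field-level check, like ${line:getDelimitedField(3)}:
print(get_delimited_field(line, 3))    # Allow
```

The field-level check is safer when "Allow" could also appear elsewhere in the line, e.g. inside a URL.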
10-12-2017
08:22 PM
1 Kudo
@Patrick Maggiulli From the documentation of the AvroSchemaRegistry, it looks like the actual schema should be given to the registry: 'value' represents the textual representation of the actual schema following the syntax and semantics of Avro's Schema format. ${inferred.avro.schema} is an attribute of the flow file and doesn't make sense for the registry. To implement your use case, use "Use 'Schema Text' Property" as the Schema Access Strategy; it's more suitable for your use case with dynamic schemas. This way, the schema can be read from the flow file and used for the conversion. A schema registry is more for governance, where you add and manage schemas manually. Configure your CSVReader and your JsonRecordSetWriter accordingly. I tried it on your flow/data and it's working. Does this help?
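A sketch of the controller-service settings this approach implies (property labels as they appear in the NiFi UI; the attribute name comes from the inferred schema in the original flow):

```
CSVReader
  Schema Access Strategy : Use 'Schema Text' Property
  Schema Text            : ${inferred.avro.schema}

JsonRecordSetWriter
  Schema Access Strategy : Use 'Schema Text' Property
  Schema Text            : ${inferred.avro.schema}
```

Because Schema Text supports the Expression Language, each flow file's own inferred schema drives the conversion, with no registry entry needed.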
10-12-2017
01:13 PM
@Simon Jespersen You can use UpdateAttribute to create a new attribute with the following EL: ${filename:getDelimitedField(3,'_')}
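A sketch of what that EL evaluates to: split the filename on '_' and take the third field (getDelimitedField is 1-based). The filename below is made up for illustration:

```python
def get_delimited_field(value, index, delimiter=","):
    """1-based field extraction, like NiFi's getDelimitedField."""
    return value.split(delimiter)[index - 1]

filename = "sales_20171012_region7.csv"
print(get_delimited_field(filename, 3, "_"))  # region7.csv
```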
10-12-2017
09:21 AM
@spdvnz There's no Azure SQL DB-specific processor, but you can use it with the standard JDBC processors in NiFi. Configure a Database Connection Pooling service with the JDBC driver for Azure SQL DB, and use a processor like PutDatabaseRecord, or any other JDBC-based processor, to ingest data.
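A sketch of the DBCPConnectionPool settings for Azure SQL DB (server, database, credentials, and jar path are placeholders; the driver class name is the Microsoft JDBC Driver for SQL Server):

```
DBCPConnectionPool
  Database Connection URL     : jdbc:sqlserver://<your-server>.database.windows.net:1433;databaseName=<your-db>
  Database Driver Class Name  : com.microsoft.sqlserver.jdbc.SQLServerDriver
  Database Driver Location(s) : /path/to/mssql-jdbc.jar
  Database User               : <user>
  Password                    : <password>
```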
10-12-2017
08:30 AM
1 Kudo
@Timothy Spann You can change the schema between read and write with the UpdateRecord processor. You just need to use different schemas for your reader and writer, and reference the new fields in the processor to set their values. There's an example in the Additional Details section of the doc: https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.4.0/org.apache.nifi.processors.standard.UpdateRecord/additionalDetails.html In that example you can see that a gender field has been added.
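A sketch of the idea with illustrative Avro schemas (the record and field names other than gender are made up; gender is the field added in the linked example): the writer schema declares a field the reader schema lacks, and an UpdateRecord user-defined property populates it.

```
Reader schema:
{ "type": "record", "name": "person",
  "fields": [
    { "name": "name", "type": "string" }
  ] }

Writer schema:
{ "type": "record", "name": "person",
  "fields": [
    { "name": "name",   "type": "string" },
    { "name": "gender", "type": "string" }
  ] }

UpdateRecord user-defined property:
  /gender : <literal value or record path expression>
```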
10-11-2017
06:20 AM
1 Kudo
@Foivos A Why not use UnpackContent or CompressContent instead of ExecuteProcess? https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.4.0/org.apache.nifi.processors.standard.UnpackContent/index.html https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.4.0/org.apache.nifi.processors.standard.CompressContent/index.html Does UnpackContent suit your needs?