Member since: 04-11-2016
Posts: 471
Kudos Received: 325
Solutions: 118
My Accepted Solutions
Title | Views | Posted
---|---|---
| 2133 | 03-09-2018 05:31 PM
| 2696 | 03-07-2018 09:45 AM
| 2595 | 03-07-2018 09:31 AM
| 4466 | 03-03-2018 01:37 PM
| 2511 | 10-17-2017 02:15 PM
09-07-2016 06:32 PM
Are you sure you aren't ingesting the same file multiple times with ListSFTP/FetchSFTP?
09-07-2016 04:03 PM
Please add a GenerateFlowFile processor before the InvokeHTTP processor and connect GenerateFlowFile to InvokeHTTP. You need flow files going into the InvokeHTTP processor for requests to be performed against your API. You are correct about the method if your API expects a POST.
09-07-2016 03:50 PM
Did you start the processors? In your screenshot, all the processors are stopped.
09-07-2016 03:34 PM
With your current flow, you need to generate some flow files to initiate it. For this purpose you can use GenerateFlowFile with 0B of data and a timer-driven scheduling strategy to request your API every X seconds/minutes/hours, and connect it to the InvokeHTTP processor. A minimal configuration sketch is shown below.
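For illustration, here is a minimal sketch of the GenerateFlowFile settings; the 60-second interval is a placeholder, adjust it to your polling needs:

GenerateFlowFile
    File Size: 0B
    Scheduling Strategy: Timer driven    (on the Scheduling tab)
    Run Schedule: 60 sec                 (placeholder interval)

Then route the success relationship of GenerateFlowFile to InvokeHTTP.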
09-07-2016 01:08 PM
1 Kudo
Hi, using the InvokeHTTP processor, you just have to put your client_id in the Basic Authentication Username property and your client_secret in the Basic Authentication Password property; the processor takes care of everything else regarding basic authentication. Besides these properties, you have to give the URL of the API to request, and you should be good to go. The content of the response will be inside the flow files at the output of this processor. See the sketch below.
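For illustration, assuming your API expects a POST (the URL below is a placeholder):

InvokeHTTP
    HTTP Method: POST
    Remote URL: https://api.example.com/resource    (placeholder)
    Basic Authentication Username: <your client_id>
    Basic Authentication Password: <your client_secret>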
09-05-2016 02:36 PM
You may want to launch a spark-shell, check the version with sc.version, check the instantiation of the contexts/session, and run some SQL queries, for example:
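A quick sanity-check session might look like this (a minimal sketch assuming Spark 2.x with the built-in spark session; on Spark 1.x use sqlContext instead):

// inside spark-shell
sc.version                                    // version of the running shell
val df = spark.range(10)                      // exercises the SparkSession
df.createOrReplaceTempView("sanity_check")    // placeholder view name
spark.sql("SELECT COUNT(*) FROM sanity_check").show()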
09-05-2016 02:11 PM
Then I'd try the following: vertices.map(_.split(" ")).saveAsTextFile("my/hdfs/path/directory")
09-05-2016 02:06 PM
1 Kudo
Hi, what is the exact line of code you have? It makes me think that you would like to save the content of your RDD/DF/DS, but you are calling this method even though you are no longer manipulating such an object.
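To illustrate the distinction (a minimal sketch; the output path is a placeholder):

// saveAsTextFile is defined on RDDs, not on local collections
val rdd = sc.parallelize(Seq("a", "b", "c"))
rdd.saveAsTextFile("hdfs:///tmp/example_out")   // fine: rdd is an RDD[String]
// Seq("a", "b", "c").saveAsTextFile("...")     // does not compile: no such method on Seq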
09-05-2016 06:35 AM
1 Kudo
Hi, I think this is the information you are looking for: https://nifi.apache.org/docs/nifi-docs/html/administration-guide.html#provenance-repository
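In particular, the retention is controlled in nifi.properties; the values below are illustrative, tune them to your needs:

nifi.provenance.repository.max.storage.time=24 hours
nifi.provenance.repository.max.storage.size=1 GB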
09-05-2016 06:33 AM
4 Kudos
Hi, the recommended way is to use GenerateTableFetch instead of the QueryDatabaseTable processor: it generates flow files containing the SQL queries to execute (typically handed to ExecuteSQL). This way, it also allows you to balance the load if you are in a NiFi cluster. In this processor, you can set the Partition Size property to limit the number of rows of each request. The error you have suggests something different; maybe the JDBC driver is not fully implemented and does not support the properties of this processor. But it does sound to me like a memory issue. Have a try with GenerateTableFetch; a configuration sketch is shown below.
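As an illustration (the table name and partition size are placeholders):

GenerateTableFetch
    Database Connection Pooling Service: <your DBCPConnectionPool>
    Table Name: my_table            (placeholder)
    Partition Size: 10000           (maximum rows per generated query)

Each generated flow file then carries one SQL statement that ExecuteSQL can run.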