The more traditional approach in this situation is to use NiFi to read the incoming data and then add a NiFi processor to dump the data from the NiFi queue to either Storm or, in your case, Spark Streaming. Now you can build a Spark ML model, test it, and run it in Spark. You can push logic into NiFi, but an ML model inside NiFi is overkill.
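The shape of that handoff can be sketched in plain Python: a "NiFi"-like producer pushes records onto a queue and a "Spark Streaming"-like consumer drains it in micro-batches. Everything here (names, batch size, the sentinel) is illustrative only, not NiFi or Spark API:

```python
import queue
import threading

# Toy stand-in for the NiFi -> Spark Streaming handoff: a producer pushes
# records onto a queue and a consumer drains it in micro-batches.
events = queue.Queue()

def nifi_producer():
    """Stand-in for NiFi dumping flowfiles onto the queue."""
    for i in range(10):
        events.put(f"record-{i}")
    events.put(None)  # sentinel: end of stream

def micro_batches(q, batch_size=4):
    """Stand-in for Spark Streaming consuming the queue in micro-batches."""
    batch = []
    while True:
        item = q.get()
        if item is None:
            break
        batch.append(item)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # flush the final, partial batch

threading.Thread(target=nifi_producer).start()
batches = list(micro_batches(events))
print(batches)  # 10 records in batches of 4: two full batches plus a partial one
```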
Hi @Kirk Haslbeck, I have tried some of the demo packages you mentioned below but keep having issues with connecting to the NiFi servlet, etc. Are the demos compatible with HDP 2.5? I also get the following error:
"An internal system exception occurred: Could not find service for component, componentName=NIFI_MASTER, clusterName=Sandbox, stackInfo=HDP-2.5"
@Pierre Villard I want to send a Twitter username from a smartphone to HDP. When ListenHTTP gets the data, it will send it to GetTwitter to fetch tweets for that username. After that, I will use Spark to run a machine learning algorithm to predict the user's personality.
What you are trying to build is what we call the Connected Data Platform at Hortonworks. You need to understand that you have two types of workloads/requirements, and you need to use HDF and HDP jointly.
ML model construction: the first step toward your goal is to build your machine learning model. This requires processing a lot of historical data (data at rest) to detect patterns related to what you are trying to predict. This phase is called the "training phase". The best tool to do this is HDP, and more specifically Spark.
Applying the ML model: once step 1 is completed, you will have a model that you can apply to new data to predict something. In my understanding, you want to apply it to real-time data coming from Twitter (data in motion). To get the data in real time and transform it into what the ML model needs, you can use NiFi. Next, NiFi sends the data to Storm or Spark Streaming, which applies the model and produces the prediction.
So you will have to use HDP to construct the model, HDF to get and transform the data, and finally a combination of HDF and HDP to apply the model and make the prediction.
To build a web service with NiFi you need to use several processors: one to listen for incoming requests, one or several processors to implement your logic (transformation, extraction, etc.), and one to publish the result. You can check this page, which contains several data flow examples. The "Hello_NiFi_Web_Service.xml" template gives an example of how to do it.
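The request → logic → response shape of such a flow can be shown with a stand-alone Python sketch (stdlib only). The endpoint path, JSON fields, and the scoring logic are placeholders, not anything NiFi-specific:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-alone sketch of the web-service shape a NiFi flow implements:
# listen for a request -> apply some logic -> publish the result.

class ScoreHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        username = json.loads(body)["username"]        # the incoming request
        result = {"username": username, "score": 0.5}  # placeholder logic
        payload = json.dumps(result).encode()
        self.send_response(200)                        # publish the result
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):                      # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), ScoreHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# A client (e.g. the smartphone) posts a username and reads back the score.
req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_port}/score",
    data=json.dumps({"username": "alice"}).encode(),
    headers={"Content-Type": "application/json"})
with urllib.request.urlopen(req) as resp:
    result_json = json.loads(resp.read())
server.shutdown()
print(result_json)  # {'username': 'alice', 'score': 0.5}
```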
I have tried building a web service with NiFi and am able to receive the incoming requests and pass them to Spark/Storm. Assuming that I compute the prediction inside Spark, I would like to know how to send the score/result back to NiFi as a response.
If that is currently not possible, what are the chances of creating a custom processor in R to predict the scores and pass them on as the response?
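One pattern for getting the score back is to have the Spark side POST the prediction to an HTTP listener in NiFi (e.g. a ListenHTTP port), echoing a correlation id so the flow can match the response to the original request. A minimal sketch of the POST-back side, in stdlib Python; the URL, path, and payload fields are assumptions, not a documented NiFi contract:

```python
import json
import urllib.request

# Hypothetical POST-back from the Spark side: after computing a prediction,
# send it to an HTTP listener in NiFi, echoing a correlation id that NiFi
# attached to the original request so the flow can match response to request.

def post_score_to_nifi(nifi_url, request_id, score):
    payload = json.dumps({"requestId": request_id, "score": score}).encode()
    req = urllib.request.Request(
        nifi_url, data=payload,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status  # 200 means the listener accepted the payload

# e.g. post_score_to_nifi("http://nifi-host:9999/contentListener", "req-42", 0.87)
```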