HDF, powered by Apache NiFi, Kafka, and Storm, is an integrated system for real-time dataflow management and streaming analytics, on premises or in the cloud. SAS Event Stream Processing is a real-time, low-latency, high-throughput event processing solution that can deploy SAS machine learning models. By integrating these technologies, organizations now have the option of deploying their SAS models in real time within the Hortonworks platform. This offers flexible deployment options for your streaming analytics projects while providing powerful analytics from SAS.
How does this integration work?
There are two new processors that can be added to NiFi:
ListenESP: This processor initiates a listener within NiFi that receives events from the SAS Event Stream Processing data stream.
PutESP: This processor sends events from NiFi to the SAS Event Stream Processing data stream.
NOTE: This integration requires a purchased, validly licensed installation of SAS Event Stream Processing.
Once the .nar file containing these processors has been added to NiFi, both processors become available on the NiFi canvas. Data events are exchanged using an Avro schema. Below is a basic example of a NiFi dataflow using both a ListenESP and a PutESP processor (shown in the figure below).
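Because events cross the NiFi/ESP boundary as Avro records, both sides must agree on a schema. The sketch below shows what such a schema might look like; the record and field names are hypothetical examples, not part of the integration itself.

```python
import json

# A hypothetical Avro schema for events exchanged between NiFi and
# SAS Event Stream Processing. The field names are illustrative only;
# your schema must match the columns of the target ESP source window.
sensor_event_schema = {
    "type": "record",
    "name": "SensorEvent",
    "fields": [
        {"name": "sensor_id", "type": "string"},
        {"name": "timestamp", "type": "long"},
        {"name": "temperature", "type": "double"},
    ],
}

# Serialize to the JSON text form that Avro tooling consumes.
schema_json = json.dumps(sensor_event_schema, indent=2)
print(schema_json)
```

In practice the schema would be registered with whatever Avro reader/writer services your NiFi flow uses, so that records injected into ESP line up field-for-field with the source window definition.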
Within the PutESP processor, you'll notice a few parameters (shown in the figure below):
Pub/Sub Host: Hostname or IP address of the server running SAS Event Stream Processing.
Pub/Sub Port: Pubsub port of the SAS Event Stream Processing engine.
Project: SAS Event Stream Processing project name.
Continuous Query: Name of the continuous query within the SAS Event Stream Processing project.
Source Window: Source window within SAS Event Stream Processing where events from NiFi can be injected.
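Taken together, these parameters identify one source window inside one ESP engine. A common way to express that address is ESP's dfESP pub/sub URL convention; the helper below sketches how the processor properties map onto such a URL (the host, port, and project/query/window names are placeholders, not values from this article).

```python
def esp_source_window_url(host: str, port: int, project: str,
                          contquery: str, window: str) -> str:
    """Assemble an ESP pub/sub style URL from PutESP-like parameters.

    The dfESP://host:port/project/contquery/window form mirrors how the
    processor properties above identify a single source window.
    """
    return f"dfESP://{host}:{port}/{project}/{contquery}/{window}"


# Placeholder values for illustration only:
url = esp_source_window_url("esp-server.example.com", 31416,
                            "nifi_project", "cq1", "src_win")
print(url)
```

Events published by PutESP are injected into exactly the window this address names, so a typo in any one of the five properties means the events never arrive.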
The ListenESP processor has similar parameters (shown in the figure below).