Moving on to a more advanced application of our basic monitoring flow.

Picking up where we left off in the first part of this series: Monitor Temperature & Humidity Sensors with Apache NiFi

Now that we can collect the sensor data with NiFi, we want to be able to store and process it on our HDP cluster; the resources at the sensor level are not enough for deep analytics or long-term storage. For this reason we can leverage NiFi's Site-to-Site protocol to send the sensor events directly to the NiFi instance on our HDF cluster for processing.

Start off by adding a Remote Process Group to your NiFi flow on the sensor side and feed the output of the ExecuteProcess processor into it. Supply the URL of your HDF NiFi instance (the same URL you would type into a browser to reach the design GUI).
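If the Remote Process Group cannot connect, the usual cause is that Site-to-Site has not been enabled on the receiving instance. As a sketch, these are the relevant settings in the HDF instance's nifi.properties; the hostname and port below are placeholders, not values from this article:

```properties
# Enable raw-socket Site-to-Site on the receiving (HDF) NiFi instance
nifi.remote.input.host=hdf-node.example.com
nifi.remote.input.socket.port=10000
nifi.remote.input.secure=false
# HTTP-based Site-to-Site is also available on newer NiFi releases
nifi.remote.input.http.enabled=true
```

A restart of NiFi is required after changing these properties.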

[Screenshot: 62521-screen-shot-2018-02-23-at-93005-am.png — Remote Process Group pointing at the HDF NiFi URL]

Now move to the remote instance of NiFi on your HDF cluster and create a new input port, which takes the place of the ExecuteProcess processor in this flow. When you start the processors on both NiFi instances, the "edge" NiFi instance gathering data directly from the sensor will send the data straight to your new input port.

[Screenshot: 62523-screen-shot-2018-02-23-at-93713-am.png — input port on the HDF NiFi instance receiving data from the edge instance]

I will be taking this data and pushing it directly into Solr for indexing and use with a Banana dashboard.

You may want to do other things with your data, and later articles in this series will cover other applications, for example pushing to HBase or Kafka and streaming into Druid. You can also write to HDFS or Hive.

For now Solr will be what we use.

Create a collection in Solr and provide its name to the PutSolrContentStream processor. This will begin populating your Solr collection with sensor events. To make this data more useful, we also need to capture the timestamp at which each event is collected. I have done this by modifying the Python script to include an extra field in the JSON; you may decide to leverage NiFi for this instead. All code and templates can be obtained by cloning the accompanying git repo to your machine: arduino-nifi-dht
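Under the hood, PutSolrContentStream sends each flowfile's JSON content to the collection's update handler. As a rough illustration of what that amounts to (the collection name `sensor_events` and the Solr URL are assumptions for this sketch, not values from this article), the same indexing step could be done by hand with the standard library:

```python
import json
import urllib.request

# Assumed endpoint: adjust host and collection name for your cluster
SOLR_UPDATE = "http://localhost:8983/solr/sensor_events/update/json/docs?commit=true"

def to_solr_doc(event_json: str) -> bytes:
    """Validate one sensor event and serialize it for Solr's JSON doc endpoint."""
    doc = json.loads(event_json)
    for field in ("temperature", "humidity", "time"):
        if field not in doc:
            raise ValueError(f"missing field: {field}")
    return json.dumps(doc).encode("utf-8")

def index_event(event_json: str) -> None:
    """POST a single event document to Solr, as PutSolrContentStream would."""
    req = urllib.request.Request(
        SOLR_UPDATE,
        data=to_solr_doc(event_json),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # raises on HTTP errors
```

This is only a sketch of the mechanism; in the actual flow NiFi handles batching, retries, and failure routing for you.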

# -*- coding: utf-8 -*-
"""
Created on Thu Feb 22 15:54:50 2018


@author: vvagias
"""
import serial
import json
from time import gmtime, strftime

# Open the Arduino's serial port (adjust the device path for your machine)
ser = serial.Serial('/dev/cu.usbmodem1411', 9600)

# Each line from the sketch arrives as "humidity,temperature"
a = ser.readline().decode('utf8').strip().split(',')

js = {
    "temperature": float(a[1]),
    "humidity": float(a[0]),
    "time": strftime("%Y-%m-%d %H:%M:%S", gmtime())
}

# Emit the event as one JSON line, which ExecuteProcess picks up
print(json.dumps(js))
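For reference, the parsing step in the script above can be isolated into a small function, which makes the field order (my Arduino sketch emits humidity first, then temperature) easy to verify without a serial port attached:

```python
import json
from time import gmtime, strftime

def parse_reading(line: str) -> dict:
    """Turn one CSV line from the Arduino ("humidity,temperature")
    into the JSON event, stamping it with the collection time."""
    humidity, temperature = line.strip().split(",")
    return {
        "temperature": float(temperature),
        "humidity": float(humidity),
        "time": strftime("%Y-%m-%d %H:%M:%S", gmtime()),
    }

# Example with a fabricated reading, no hardware required
print(json.dumps(parse_reading("55.10,72.30\r\n")))
```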


Now we have everything we need to put together a solid dashboard for monitoring these sensor events in real time.

Move to your Banana instance and either create a new time-series dashboard and have fun with whatever you desire, or follow along a bit further and use the template included in the banana directory of the git repo.

Upload the template to your instance and edit the dashboard's Solr section to point to the collection you created in the earlier step.

Click save and you should have a dashboard that looks like this:

[Screenshot: 62524-screen-shot-2018-02-23-at-95315-am.png — Banana time-series dashboard showing the live sensor events]

Well... if that doesn't put a smile on your face, then you are definitely reading the wrong article. You can modify the dashboard to show whatever you are interested in, then click save at the top right and set it as the default so it is there whenever you reload the page.

I hope you enjoyed the article and gained some value from what we did.

I also hope you will upvote the article if you found it useful and check out the other articles that follow in the series!

Last updated 08-17-2019 by vnv.