12-27-2017
10:53 PM
2 Kudos
There is an open source model server for Apache MXNet that I recently tried. It's very easy to install and use. You must have Apache MXNet and Python installed.
Installation and Setup
Installing the Model Server is simple. I am using pip3 to make sure I install to Python 3, as I also have Python 2.7 installed on my laptop.
pip3 install mxnet --pre --user
pip3 install mxnet-model-server
pip3 install imdbpy
pip3 install dataset
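A quick sanity check, assuming both packages installed into the same Python 3 environment used above, is to import them and print the MXNet version:
# Minimal check that MXNet and the model server package (mms) are importable
import mxnet
import mms
print("MXNet version:", mxnet.__version__)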
http://127.0.0.1:9999/api-description
{
"description": {
"host": "127.0.0.1:9999",
"info": {
"title": "Model Serving Apis",
"version": "1.0.0"
},
"paths": {
"/api-description": {
"get": {
"operationId": "api-description",
"produces": [
"application/json"
],
"responses": {
"200": {
"description": "OK",
"schema": {
"properties": {
"description": {
"type": "string"
}
},
"type": "object"
}
}
}
}
},
"/ping": {
"get": {
"operationId": "ping",
"produces": [
"application/json"
],
"responses": {
"200": {
"description": "OK",
"schema": {
"properties": {
"health": {
"type": "string"
}
},
"type": "object"
}
}
}
}
},
"/squeezenet/predict": {
"post": {
"consumes": [
"multipart/form-data"
],
"operationId": "squeezenet_predict",
"parameters": [
{
"description": "data should be image which will be resized to: [3, 224, 224]",
"in": "formData",
"name": "data",
"required": "true",
"type": "file"
}
],
"produces": [
"application/json"
],
"responses": {
"200": {
"description": "OK",
"schema": {
"properties": {
"prediction": {
"type": "string"
}
},
"type": "object"
}
}
}
}
}
},
"schemes": [
"http"
],
"swagger": "2.0"
}
}
http://127.0.0.1:9999/ping
{
"health": "healthy!"
}
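Since /ping is just a REST endpoint, a small Python script can wait for the server to come up before sending predictions. This is a minimal sketch of my own, assuming the requests library and the local port 9999 used above:
# Poll the model server health endpoint until it reports healthy
import time
import requests

def wait_for_server(url="http://127.0.0.1:9999/ping", attempts=10):
    for _ in range(attempts):
        try:
            if requests.get(url, timeout=2).json().get("health") == "healthy!":
                return True
        except requests.RequestException:
            pass
        time.sleep(3)
    return False

print("server healthy:", wait_for_server())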
Because each server can specify a port, you can have many running at once. I am running two: one for SSD and one for SqueezeNet. In the MXNet Model Server GitHub repository you will find a Model Zoo containing many image processing models and examples.
mxnet-model-server --models squeezenet=https://s3.amazonaws.com/model-server/models/squeezenet_v1.1/squeezenet_v1.1.model --service mms/model_service/mxnet_vision_service.py --port 9999
/usr/local/lib/python3.6/site-packages/mms/service_manager.py:14: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
import imp
[INFO 2017-12-27 08:50:23,195 PID:50443 /usr/local/lib/python3.6/site-packages/mms/mxnet_model_server.py:__init__:87] Initialized model serving.
Downloading squeezenet_v1.1.model from https://s3.amazonaws.com/model-server/models/squeezenet_v1.1/squeezenet_v1.1.model.
[08:50:26] src/nnvm/legacy_json_util.cc:190: Loading symbol saved by previous version v0.8.0. Attempting to upgrade...
[08:50:26] src/nnvm/legacy_json_util.cc:198: Symbol successfully upgraded!
[INFO 2017-12-27 08:50:26,701 PID:50443 /usr/local/lib/python3.6/site-packages/mms/serving_frontend.py:add_endpoint:182] Adding endpoint: squeezenet_predict to Flask
[INFO 2017-12-27 08:50:26,701 PID:50443 /usr/local/lib/python3.6/site-packages/mms/serving_frontend.py:add_endpoint:182] Adding endpoint: ping to Flask
[INFO 2017-12-27 08:50:26,702 PID:50443 /usr/local/lib/python3.6/site-packages/mms/serving_frontend.py:add_endpoint:182] Adding endpoint: api-description to Flask
[INFO 2017-12-27 08:50:26,703 PID:50443 /usr/local/lib/python3.6/site-packages/mms/metric.py:start_recording:118] Metric errors for last 30 seconds is 0.000000
[INFO 2017-12-27 08:50:26,703 PID:50443 /usr/local/lib/python3.6/site-packages/mms/metric.py:start_recording:118] Metric requests for last 30 seconds is 0.000000
[INFO 2017-12-27 08:50:26,703 PID:50443 /usr/local/lib/python3.6/site-packages/mms/metric.py:start_recording:118] Metric cpu for last 30 seconds is 0.335000
[INFO 2017-12-27 08:50:26,704 PID:50443 /usr/local/lib/python3.6/site-packages/mms/metric.py:start_recording:118] Metric memory for last 30 seconds is 0.005696
[INFO 2017-12-27 08:50:26,704 PID:50443 /usr/local/lib/python3.6/site-packages/mms/metric.py:start_recording:118] Metric disk for last 30 seconds is 0.656000
[INFO 2017-12-27 08:50:26,704 PID:50443 /usr/local/lib/python3.6/site-packages/mms/metric.py:start_recording:118] Metric overall_latency for last 30 seconds is 0.000000
[INFO 2017-12-27 08:50:26,705 PID:50443 /usr/local/lib/python3.6/site-packages/mms/metric.py:start_recording:118] Metric inference_latency for last 30 seconds is 0.000000
[INFO 2017-12-27 08:50:26,705 PID:50443 /usr/local/lib/python3.6/site-packages/mms/metric.py:start_recording:118] Metric preprocess_latency for last 30 seconds is 0.000000
[INFO 2017-12-27 08:50:26,720 PID:50443 /usr/local/lib/python3.6/site-packages/mms/mxnet_model_server.py:start_model_serving:101] Service started successfully.
[INFO 2017-12-27 08:50:26,720 PID:50443 /usr/local/lib/python3.6/site-packages/mms/mxnet_model_server.py:start_model_serving:102] Service description endpoint: 127.0.0.1:9999/api-description
[INFO 2017-12-27 08:50:26,720 PID:50443 /usr/local/lib/python3.6/site-packages/mms/mxnet_model_server.py:start_model_serving:103] Service health endpoint: 127.0.0.1:9999/ping
[INFO 2017-12-27 08:50:26,730 PID:50443 /usr/local/lib/python3.6/site-packages/werkzeug/_internal.py:_log:87] * Running on http://127.0.0.1:9999/ (Press CTRL+C to quit)
For the SSD example, I fork the AWS GitHub repository (https://github.com/awslabs/mxnet-model-server.git), change to the examples/ssd directory, and follow the setup steps to prepare the model.
mxnet-model-server --models SSD=resnet50_ssd_model.model --service ssd_service.py --port 9998
/usr/local/lib/python3.6/site-packages/mms/service_manager.py:14: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
import imp
[INFO 2017-12-27 09:02:22,800 PID:55345 /usr/local/lib/python3.6/site-packages/mms/mxnet_model_server.py:__init__:87] Initialized model serving.
[INFO 2017-12-27 09:02:24,510 PID:55345 /usr/local/lib/python3.6/site-packages/mms/serving_frontend.py:add_endpoint:182] Adding endpoint: SSD_predict to Flask
[INFO 2017-12-27 09:02:24,510 PID:55345 /usr/local/lib/python3.6/site-packages/mms/serving_frontend.py:add_endpoint:182] Adding endpoint: ping to Flask
[INFO 2017-12-27 09:02:24,511 PID:55345 /usr/local/lib/python3.6/site-packages/mms/serving_frontend.py:add_endpoint:182] Adding endpoint: api-description to Flask
[INFO 2017-12-27 09:02:24,511 PID:55345 /usr/local/lib/python3.6/site-packages/mms/metric.py:start_recording:118] Metric errors for last 30 seconds is 0.000000
[INFO 2017-12-27 09:02:24,512 PID:55345 /usr/local/lib/python3.6/site-packages/mms/metric.py:start_recording:118] Metric requests for last 30 seconds is 0.000000
[INFO 2017-12-27 09:02:24,512 PID:55345 /usr/local/lib/python3.6/site-packages/mms/metric.py:start_recording:118] Metric cpu for last 30 seconds is 0.290000
[INFO 2017-12-27 09:02:24,513 PID:55345 /usr/local/lib/python3.6/site-packages/mms/metric.py:start_recording:118] Metric memory for last 30 seconds is 0.014777
[INFO 2017-12-27 09:02:24,513 PID:55345 /usr/local/lib/python3.6/site-packages/mms/metric.py:start_recording:118] Metric disk for last 30 seconds is 0.656000
[INFO 2017-12-27 09:02:24,513 PID:55345 /usr/local/lib/python3.6/site-packages/mms/metric.py:start_recording:118] Metric overall_latency for last 30 seconds is 0.000000
[INFO 2017-12-27 09:02:24,514 PID:55345 /usr/local/lib/python3.6/site-packages/mms/metric.py:start_recording:118] Metric inference_latency for last 30 seconds is 0.000000
[INFO 2017-12-27 09:02:24,514 PID:55345 /usr/local/lib/python3.6/site-packages/mms/metric.py:start_recording:118] Metric preprocess_latency for last 30 seconds is 0.000000
[INFO 2017-12-27 09:02:24,514 PID:55345 /usr/local/lib/python3.6/site-packages/mms/mxnet_model_server.py:start_model_serving:101] Service started successfully.
[INFO 2017-12-27 09:02:24,514 PID:55345 /usr/local/lib/python3.6/site-packages/mms/mxnet_model_server.py:start_model_serving:102] Service description endpoint: 127.0.0.1:9998/api-description
[INFO 2017-12-27 09:02:24,514 PID:55345 /usr/local/lib/python3.6/site-packages/mms/mxnet_model_server.py:start_model_serving:103] Service health endpoint: 127.0.0.1:9998/ping
[INFO 2017-12-27 09:02:24,524 PID:55345 /usr/local/lib/python3.6/site-packages/werkzeug/_internal.py:_log:87] * Running on http://127.0.0.1:9998/ (Press CTRL+C to quit)
http://127.0.0.1:9998/api-description
{
"description": {
"host": "127.0.0.1:9998",
"info": {
"title": "Model Serving Apis",
"version": "1.0.0"
},
"paths": {
"/SSD/predict": {
"post": {
"consumes": [
"multipart/form-data"
],
"operationId": "SSD_predict",
"parameters": [
{
"description": "data should be image which will be resized to: [3, 512, 512]",
"in": "formData",
"name": "data",
"required": "true",
"type": "file"
}
],
"produces": [
"application/json"
],
"responses": {
"200": {
"description": "OK",
"schema": {
"properties": {
"prediction": {
"type": "string"
}
},
"type": "object"
}
}
}
}
},
"/api-description": {
"get": {
"operationId": "api-description",
"produces": [
"application/json"
],
"responses": {
"200": {
"description": "OK",
"schema": {
"properties": {
"description": {
"type": "string"
}
},
"type": "object"
}
}
}
}
},
"/ping": {
"get": {
"operationId": "ping",
"produces": [
"application/json"
],
"responses": {
"200": {
"description": "OK",
"schema": {
"properties": {
"health": {
"type": "string"
}
},
"type": "object"
}
}
}
}
}
},
"schemes": [
"http"
],
"swagger": "2.0"
}
}
http://127.0.0.1:9998/ping
{
"health": "healthy!"
}
With this call to SqueezeNet we get back the top classes and their probabilities (0.50 -> 50%).
curl -X POST http://127.0.0.1:9999/squeezenet/predict -F "data=@TimSpann2.jpg"
{
"prediction": [
[
{
"class": "n02877765 bottlecap",
"probability": 0.5077430009841919
},
{
"class": "n03196217 digital clock",
"probability": 0.35705313086509705
},
{
"class": "n03706229 magnetic compass",
"probability": 0.02305465377867222
},
{
"class": "n02708093 analog clock",
"probability": 0.018635360524058342
},
{
"class": "n04328186 stopwatch, stop watch",
"probability": 0.015588048845529556
}
]
]
}
With this test call to SSD, you will see it identifies a person (me) and provides coordinates of a box around me.
curl -X POST http://127.0.0.1:9998/SSD/predict -F "data=@TimSpann2.jpg"
{
"prediction": [
[
"person",
405,
139,
614,
467
],
[
"boat",
26,
0,
459,
481
]
]
}
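The numbers after each class are the pixel coordinates of the bounding box. A minimal sketch of my own, assuming Pillow is installed and the order is (xmin, ymin, xmax, ymax), that draws the returned boxes on the posted image:
# Draw the SSD bounding boxes returned above onto the original image
from PIL import Image, ImageDraw

prediction = [["person", 405, 139, 614, 467], ["boat", 26, 0, 459, 481]]
img = Image.open("TimSpann2.jpg")
draw = ImageDraw.Draw(img)
for label, xmin, ymin, xmax, ymax in prediction:
    draw.rectangle([xmin, ymin, xmax, ymax], outline="red")
    draw.text((xmin, max(0, ymin - 12)), label, fill="red")
img.save("TimSpann2_boxed.jpg")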
/opt/demo/curl.sh
curl -X POST http://127.0.0.1:9998/SSD/predict -F "data=@$1"
/opt/demo/curl2.sh
curl -X POST http://127.0.0.1:9999/squeezenet/predict -F "data=@$1"
The Apache NiFi flow is simple: I call the REST URL and pass an image. This can be done with a Groovy script or by executing a curl shell script.
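If you prefer Python to curl, here is a minimal sketch of the same call (my own example, not the flow's actual script) that a processor such as ExecuteStreamCommand could invoke:
# POST an image to the SSD endpoint and print the JSON prediction
import sys
import requests

def predict(image_path, url="http://127.0.0.1:9998/SSD/predict"):
    with open(image_path, "rb") as f:
        return requests.post(url, files={"data": f}).json()

if __name__ == "__main__":
    print(predict(sys.argv[1]))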
Logs From Run
[INFO 2017-12-27 17:41:33,447 PID:90860 /usr/local/lib/python3.6/site-packages/werkzeug/_internal.py:_log:87] 127.0.0.1 - - [27/Dec/2017 17:41:33] "POST /SSD/predict HTTP/1.1" 400 -
[INFO 2017-12-27 17:41:36,289 PID:90860 /usr/local/lib/python3.6/site-packages/mms/serving_frontend.py:predict_callback:440] Request input: data should be image with jpeg format.
[INFO 2017-12-27 17:41:36,289 PID:90860 /usr/local/lib/python3.6/site-packages/mms/request_handler/flask_handler.py:get_file_data:133] Getting file data from request.
[INFO 2017-12-27 17:41:37,035 PID:90860 /usr/local/lib/python3.6/site-packages/mms/serving_frontend.py:predict_callback:475] Response is text.
[INFO 2017-12-27 17:41:37,035 PID:90860 /usr/local/lib/python3.6/site-packages/mms/request_handler/flask_handler.py:jsonify:156] Jsonifying the response: {'prediction': [('motorbike', 270, 877, 1944, 3214), ('car', 77, 763, 2113, 3193)]}
Apache NiFi Results of the Run
One of the Images Processed
Apache NiFi Flow Template: mxnetserver.xml
Resources
https://github.com/awslabs/mxnet-model-server
https://github.com/awslabs/mxnet-model-server/blob/master/docs/README.md
https://github.com/awslabs/mxnet-model-server/blob/master/docs/server.md
https://github.com/awslabs/mxnet-model-server/blob/master/examples/ssd/README.md
http://gluon.mxnet.io/
https://github.com/awslabs/mxnet-model-server/blob/master/docs/model_zoo.md
https://mxnet.incubator.apache.org/gluon/index.html
http://mxnet.incubator.apache.org/tutorials/index.html
https://github.com/apache/incubator-mxnet/tree/master/example
https://github.com/apache/incubator-mxnet/tree/master/example#deep-learning-examples
http://mxnet.incubator.apache.org/model_zoo/index.html
https://github.com/apache/incubator-mxnet/releases/tag/1.0.0
https://github.com/gluon-api/gluon-api
http://mxnet.incubator.apache.org/tutorials/r/classifyRealImageWithPretrainedModel.html
https://www.slideshare.net/JulienSIMON5/an-introduction-to-deep-learning-with-apache-mxnet
https://mxnet.incubator.apache.org/get_started/why_mxnet.html
https://www.slideshare.net/JulienSIMON5/deep-learning-for-developers-december-2017
https://aws.amazon.com/blogs/aws/aws-contributes-to-milestone-1-0-release-and-adds-model-serving-capability-for-apache-mxnet/
12-27-2017
07:11 PM
This is crazy easy. Use ListFile -> FetchFile
12-27-2017
01:47 AM
1 Kudo
1. Change the amount and delay of the merge.
2. You can add an EnforceOrder processor (only one primary node).
3. Set all connections to use the FirstInFirstOutPrioritizer.
12-26-2017
10:05 PM
3 Kudos
I wanted to offer alternatives to running Deep Learning and Machine Learning locally and take advantage of some free hours of cloud time.
It seems that it is trivially easy to integrate calling IBM Watson APIs for all your microservice needs. Apache NiFi makes it super easy and fun.
Using the out-of-the-box InvokeHTTP processor, we can call the POST and GET REST APIs of the IBM Watson Platform for Natural Language, Visual Recognition, Personality Analysis, Language Translation and others. These use SSL, so you have to set up a simple StandardSSLContextService in Apache NiFi. Once you know which JVM you are running Apache NiFi with, it's trivial to point that service at the JVM's truststore.
By default the truststore password is the very secure changeit.
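To fill in the StandardSSLContextService properties, you first need the path to that truststore under the JVM running NiFi. A minimal locator sketch of my own, assuming a typical Oracle/OpenJDK layout (the /usr/libexec/java_home helper is macOS-specific):
# Locate the JVM cacerts truststore for the StandardSSLContextService
import os
import subprocess

java_home = os.environ.get("JAVA_HOME") or subprocess.check_output(
    ["/usr/libexec/java_home"]).decode().strip()
for candidate in ("jre/lib/security/cacerts", "lib/security/cacerts"):
    path = os.path.join(java_home, candidate)
    if os.path.exists(path):
        print("Truststore:", path, "(default password: changeit)")
        break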
A GET REST Call to NLP
A POST to a Watson REST URL
Some URLs and Calls
curl -X POST --form "images_file=@mypic.jpg" "https://gateway-a.watsonplatform.net/visual-recognition/api/v3/classify?api_key={api-key}&version=2016-05-20"
curl -X POST --form "images_file=@myotherpic.jpg" "https://gateway-a.watsonplatform.net/visual-recognition/api/v3/detect_faces?api_key={api-key}&version=2016-05-20"
curl --user "{username}":"{password}" "https://gateway.watsonplatform.net/natural-language-understanding/api/v1/analyze?version=2017-02-27&text=This+is+a+test&features=sentiment,keywords"
curl --user "{username}":"{password}" "https://gateway.watsonplatform.net/natural-language-understanding/api/v1/analyze?version=2017-02-27&text=Test&features=sentiment,keywords&keywords.sentiment=true"
curl -X POST --user "{username}":"{password}" --header "Content-Type: application/json" --data-binary @{path_to_file}tone.json "https://gateway.watsonplatform.net/tone-analyzer/api/v3/tone?version=2017-09-21"
curl -X POST --user "{username}":"{password}" --header "Content-Type: application/json" --data-binary @{path_to_file}tone.json "https://gateway.watsonplatform.net/tone-analyzer/api/v3/tone?version=2017-09-21&sentences=false"
curl -X GET --user "{username}":"{password}" "https://gateway.watsonplatform.net/tone-analyzer/api/v3/tone?version=2017-09-21&text=Team%2C%20I%20know%20that%20times%20are%20tough%21%20Product%20sales%20have%20been%20disappointing%20for%20the%20past%20three%20quarters.%20We%20have%20a%20competitive%20product%2C%20but%20we%20need%20to%20do%20a%20better%20job%20of%20selling%20it%21"
curl -X POST --user "{username}":"{password}" --header "Content-Type: application/json" --data-binary @{path_to_file}tone-chat.json "https://gateway.watsonplatform.net/tone-analyzer/api/v3/tone_chat?version=2017-09-21"
Personality Insights
curl -X POST --user {username}:{password} --header "Content-Type: text/plain;charset=utf-8" --data-binary "@{path_to_file}profile.txt" "https://gateway.watsonplatform.net/personality-insights/api/v3/profile?version=2016-10-20"
Conversation
{
"url": "https://gateway.watsonplatform.net/conversation/api",
"username": "user",
"password": "pass"
}
Discovery
{
"url": "https://gateway.watsonplatform.net/discovery/api",
"username": "u",
"password": "p"
}
curl -X POST -u "{username}":"{password}" -H "Content-Type: application/json" -d '{ "name":"my-first-environment", "description":"exploring environments"}' "api/v1/environments?version=2017-09-01"
Language Translator
{
"url": "https://gateway.watsonplatform.net/language-translator/api",
"username": "u",
"password": "p"
}
curl -X POST --user "{username}":"{password}" --header "Content-Type: application/json" --header "Accept: application/json" --data '{"text":"Hello, world!","source":"en","target":"es"}' "https://gateway.watsonplatform.net/language-translator/api/v2/translate"
Natural Language
{
"url": "https://gateway.watsonplatform.net/natural-language-understanding/api",
"username": "u",
"password": "p"
}
curl --user "{username}":"{password}" "https://gateway.watsonplatform.net/natural-language-understanding/api/v1/analyze?version=2017-02-27&text=SomeText&features=sentiment,keywords"
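The same analyze call works from Python with the requests library; a minimal sketch of my own, with the credential placeholders filled in from your Bluemix service:
# Call Watson NLU for sentiment and keywords, mirroring the curl example above
import requests

URL = "https://gateway.watsonplatform.net/natural-language-understanding/api/v1/analyze"
params = {
    "version": "2017-02-27",
    "text": "This is a test",
    "features": "sentiment,keywords",
}
print(requests.get(URL, params=params, auth=("{username}", "{password}")).json())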
If you can call it with curl or wget, you can call it with Apache NiFi.
An Overview of Incorporating IBM Watson in Apache NiFi Flows
With Apache NiFi we can substitute any message you want NLP to analyze. See ${msg}, which is Expression Language.
As you can see, just plug your key in there!
IBM returns some nice, clean JSON.
We get our JSON probabilities and can use them as we see fit. As a next step, I would convert the JSON using a schema to Apache Avro, then to Apache ORC, and store it in Apache Hive LLAP for queries and analytics.
Resources:
https://console.bluemix.net/docs/services/visual-recognition/getting-started.html#getting-started-tutorial
https://console.bluemix.net/services/natural-language-understanding/
https://console.bluemix.net/services/personality_insights/
https://www.ibm.com/watson/developercloud/visual-recognition/api/v3/#introduction
pip install --upgrade watson-developer-cloud
https://github.com/watson-developer-cloud/
https://www.ibm.com/watson/webinars/
https://github.com/watson-developer-cloud/retrieve-and-rank-java
https://github.com/watson-developer-cloud/java-sdk
https://github.com/watson-developer-cloud/java-sdk/tree/develop/speech-to-text
https://github.com/watson-developer-cloud/cognitive-client-java
brew tap watson-developer-cloud/tools
https://github.com/watson-developer-cloud/java-sdk/tree/master/examples/src/main/java/com/ibm/watson/developer_cloud
https://console.bluemix.net/dashboard/apps
https://console.bluemix.net/catalog/services/tone-analyzer/
12-26-2017
07:28 PM
https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-mqtt-nar/1.4.0/org.apache.nifi.processors.mqtt.ConsumeMQTT/index.html
You might want to set it equal to %. I am not sure how having a client ID could cause an error. I have a few dozen MQTT clients and servers running with no issues, and you are grabbing all the topics. I have used cloud, local and server brokers. Can you use Mosquitto? That one is really solid. The client ID should not affect connection issues. I have run the MQTT broker on the same machine and on different machines, on Raspberry Pi, OSX, CentOS and Ubuntu, and it has worked fine every time. The client ID should not be important. What errors do you get?
You could fork the ConsumeMQTT code and make it a custom processor without a client name, but I don't think that's a good idea, and I am not sure the underlying Java library would support that. Can you consume MQTT messages in Python or Java (or something else) on the same machine with no client ID without issue? Apache NiFi uses the Paho client like most places.
import org.eclipse.paho.client.mqttv3.MqttCallback;
import org.eclipse.paho.client.mqttv3.MqttException;
import org.eclipse.paho.client.mqttv3.MqttMessage;
import org.eclipse.paho.client.mqttv3.IMqttDeliveryToken;
https://www.eclipse.org/paho/clients/java/
https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-mqtt-bundle/nifi-mqtt-processors/src/main/java/org/apache/nifi/processors/mqtt/ConsumeMQTT.java
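To rule the broker in or out, here is a minimal Python subscriber sketch of my own (assuming the paho-mqtt package and a local Mosquitto broker on the default port) that sets an explicit client ID and subscribes to all topics:
# Simple paho-mqtt subscriber to compare against NiFi ConsumeMQTT behavior
import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, rc):
    print("connected, rc =", rc)
    client.subscribe("#")  # all topics

def on_message(client, userdata, msg):
    print(msg.topic, msg.payload)

client = mqtt.Client(client_id="nifi-test-client")
client.on_connect = on_connect
client.on_message = on_message
client.connect("127.0.0.1", 1883, 60)
client.loop_forever()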
12-24-2017
03:17 PM
4 Kudos
Happy Holidays!
Apache NiFi makes it easy to build your own integration tests, so I am generating tests for turning my Christmas Tree Hat on and off. I am also testing taking a picture.
My use case is to send an HTTP message to trigger a Raspberry Pi to turn on a physical device like a camera or light. This is pretty cool, and it is secure with Apache MiniFi and Apache NiFi. A little Python script is all the code needed, and it's basic example code.
This code is a modified TensorFlow classify.py that adds turning on the Christmas Tree. We turn on the tree, take a picture with the PiCamera, and then run it through a TensorFlow classifier.
root@vid5:/opt/demo# cat classifytree.py
# Copyright 2015 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Simple image classification with Inception.
Run image classification with Inception trained on ImageNet 2012 Challenge data
set.
This program creates a graph from a saved GraphDef protocol buffer,
and runs inference on an input JPEG image. It outputs human readable
strings of the top 5 predictions along with their probabilities.
Change the --image_file argument to any jpg image to compute a
classification of that image.
Please see the tutorial and website for a detailed description of how
to use this script to perform image recognition.
https://tensorflow.org/tutorials/image_recognition/
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import argparse
import os.path
import re
import sys
import tarfile
import os
import datetime
import math
import random, string
import base64
import json
import time
import picamera
from time import sleep
from time import gmtime, strftime
import numpy as np
from six.moves import urllib
import tensorflow as tf
from gpiozero import LEDBoard
from gpiozero.tools import random_values
from signal import pause
tree = LEDBoard(*range(2,28),pwm=True)
for led in tree:
led.source_delay = 0.1
led.source = random_values()
tf.logging.set_verbosity(tf.logging.ERROR)
FLAGS = None
# pylint: disable=line-too-long
DATA_URL = 'http://download.tensorflow.org/models/image/imagenet/inception-2015-12-05.tgz'
# pylint: enable=line-too-long
# yyyy-mm-dd hh:mm:ss
currenttime= strftime("%Y-%m-%d %H:%M:%S",gmtime())
host = os.uname()[1]
def randomword(length):
return ''.join(random.choice(string.ascii_lowercase) for i in range(length))
class NodeLookup(object):
"""Converts integer node ID's to human readable labels."""
def __init__(self,
label_lookup_path=None,
uid_lookup_path=None):
if not label_lookup_path:
label_lookup_path = os.path.join(
FLAGS.model_dir, 'imagenet_2012_challenge_label_map_proto.pbtxt')
if not uid_lookup_path:
uid_lookup_path = os.path.join(
FLAGS.model_dir, 'imagenet_synset_to_human_label_map.txt')
self.node_lookup = self.load(label_lookup_path, uid_lookup_path)
def load(self, label_lookup_path, uid_lookup_path):
"""Loads a human readable English name for each softmax node.
Args:
label_lookup_path: string UID to integer node ID.
uid_lookup_path: string UID to human-readable string.
Returns:
dict from integer node ID to human-readable string.
"""
if not tf.gfile.Exists(uid_lookup_path):
tf.logging.fatal('File does not exist %s', uid_lookup_path)
if not tf.gfile.Exists(label_lookup_path):
tf.logging.fatal('File does not exist %s', label_lookup_path)
# Loads mapping from string UID to human-readable string
proto_as_ascii_lines = tf.gfile.GFile(uid_lookup_path).readlines()
uid_to_human = {}
p = re.compile(r'[n\d]*[ \S,]*')
for line in proto_as_ascii_lines:
parsed_items = p.findall(line)
uid = parsed_items[0]
human_string = parsed_items[2]
uid_to_human[uid] = human_string
# Loads mapping from string UID to integer node ID.
node_id_to_uid = {}
proto_as_ascii = tf.gfile.GFile(label_lookup_path).readlines()
for line in proto_as_ascii:
if line.startswith(' target_class:'):
target_class = int(line.split(': ')[1])
if line.startswith(' target_class_string:'):
target_class_string = line.split(': ')[1]
node_id_to_uid[target_class] = target_class_string[1:-2]
# Loads the final mapping of integer node ID to human-readable string
node_id_to_name = {}
for key, val in node_id_to_uid.items():
if val not in uid_to_human:
tf.logging.fatal('Failed to locate: %s', val)
name = uid_to_human[val]
node_id_to_name[key] = name
return node_id_to_name
def id_to_string(self, node_id):
if node_id not in self.node_lookup:
return ''
return self.node_lookup[node_id]
def create_graph():
"""Creates a graph from saved GraphDef file and returns a saver."""
# Creates graph from saved graph_def.pb.
with tf.gfile.FastGFile(os.path.join(
FLAGS.model_dir, 'classify_image_graph_def.pb'), 'rb') as f:
graph_def = tf.GraphDef()
graph_def.ParseFromString(f.read())
_ = tf.import_graph_def(graph_def, name='')
def run_inference_on_image(image):
"""Runs inference on an image.
Args:
image: Image file name.
Returns:
Nothing
"""
if not tf.gfile.Exists(image):
tf.logging.fatal('File does not exist %s', image)
image_data = tf.gfile.FastGFile(image, 'rb').read()
# Creates graph from saved GraphDef.
create_graph()
with tf.Session() as sess:
# Some useful tensors:
# 'softmax:0': A tensor containing the normalized prediction across
# 1000 labels.
# 'pool_3:0': A tensor containing the next-to-last layer containing 2048
# float description of the image.
# 'DecodeJpeg/contents:0': A tensor containing a string providing JPEG
# encoding of the image.
# Runs the softmax tensor by feeding the image_data as input to the graph.
softmax_tensor = sess.graph.get_tensor_by_name('softmax:0')
predictions = sess.run(softmax_tensor,
{'DecodeJpeg/contents:0': image_data})
predictions = np.squeeze(predictions)
# Creates node ID --> English string lookup.
node_lookup = NodeLookup()
top_k = predictions.argsort()[-FLAGS.num_top_predictions:][::-1]
row = []
for node_id in top_k:
human_string = node_lookup.id_to_string(node_id)
score = predictions[node_id]
row.append( { 'node_id': node_id, 'image': image, 'host': host, 'ts': currenttime, 'human_string': str(human_string), 'score': str(score)} )
json_string = json.dumps(row)
print( json_string )
def maybe_download_and_extract():
"""Download and extract model tar file."""
dest_directory = FLAGS.model_dir
if not os.path.exists(dest_directory):
os.makedirs(dest_directory)
filename = DATA_URL.split('/')[-1]
filepath = os.path.join(dest_directory, filename)
if not os.path.exists(filepath):
def _progress(count, block_size, total_size):
sys.stdout.write('\r>> Downloading %s %.1f%%' % (
filename, float(count * block_size) / float(total_size) * 100.0))
sys.stdout.flush()
filepath, _ = urllib.request.urlretrieve(DATA_URL, filepath, _progress)
print()
statinfo = os.stat(filepath)
print('Successfully downloaded', filename, statinfo.st_size, 'bytes.')
tarfile.open(filepath, 'r:gz').extractall(dest_directory)
def main(_):
maybe_download_and_extract()
# Create unique image name
img_name = '/opt/demo/images/pi_image_{0}_{1}.jpg'.format(randomword(3),strftime("%Y%m%d%H%M%S",gmtime()))
# Capture Image from Pi Camera
try:
camera = picamera.PiCamera()
camera.resolution = (1024,768)
camera.annotate_text = " Stored with Apache NiFi "
camera.capture(img_name, resize=(600,400))
pass
finally:
camera.close()
# image = (FLAGS.image_file if FLAGS.image_file else
# os.path.join(FLAGS.model_dir, 'cropped_panda.jpg'))
run_inference_on_image(img_name)
if __name__ == '__main__':
parser = argparse.ArgumentParser()
# classify_image_graph_def.pb:
# Binary representation of the GraphDef protocol buffer.
# imagenet_synset_to_human_label_map.txt:
# Map from synset ID to a human readable string.
# imagenet_2012_challenge_label_map_proto.pbtxt:
# Text representation of a protocol buffer mapping a label to synset ID.
parser.add_argument(
'--model_dir',
type=str,
default='/tmp/imagenet',
help=""" Path to classify_image_graph_def.pb,
imagenet_synset_to_human_label_map.txt, and
imagenet_2012_challenge_label_map_proto.pbtxt. """
)
parser.add_argument(
'--image_file',
type=str,
default='',
help='Absolute path to image file.'
)
parser.add_argument(
'--num_top_predictions',
type=int,
default=5,
help='Display this many predictions.'
)
FLAGS, unparsed = parser.parse_known_args()
tf.app.run(main=main, argv=[sys.argv[0]] + unparsed)
This is the information for getting your own Christmas Tree Hat for your RPI.
https://thepihut.com/products/3d-xmas-tree-for-raspberry-pi
https://thepihut.com/blogs/raspberry-pi-tutorials/3d-xmas-tree-for-raspberry-pi-assembly-instructions
sudo apt-get install python-gpiozero python3-gpiozero
Results of Running
[{"image": "/opt/demo/images/pi_image_bey_20171218140347.jpg", "ts": "2017-12-18 14:03:32", "host": "vid5", "score": "0.175653", "human_string": "pay-phone, pay-station", "node_id": 843}, {"image": "/opt/demo/images/pi_image_bey_20171218140347.jpg", "ts": "2017-12-18 14:03:32", "host": "vid5", "score": "0.0890657", "human_string": "cellular telephone, cellular phone, cellphone, cell, mobile phone", "node_id": 914}, {"image": "/opt/demo/images/pi_image_bey_20171218140347.jpg", "ts": "2017-12-18 14:03:32", "host": "vid5", "score": "0.0631831", "human_string": "vending machine", "node_id": 558}, {"image": "/opt/demo/images/pi_image_bey_20171218140347.jpg", "ts": "2017-12-18 14:03:32", "host": "vid5", "score": "0.0541551", "human_string": "abacus", "node_id": 547}, {"image": "/opt/demo/images/pi_image_bey_20171218140347.jpg", "ts": "2017-12-18 14:03:32", "host": "vid5", "score": "0.0417486", "human_string": "rotisserie", "node_id": 663}]
To Remotely Activate the Tree
curl -X POST http://192.168.1.167:8033/contentListener --data-ascii "tree-on" -v
It's so easy to enable Apache MiniFi to be controlled by any remote HTTP request.
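On the Pi side, the flow just has to run a small script with the received payload. A minimal sketch (the script name and argument handling are my own, not the exact flow) reusing the gpiozero LEDBoard setup from classifytree.py:
# Turn the Christmas Tree Hat on (twinkling) or off based on a command argument
import sys
from signal import pause
from gpiozero import LEDBoard
from gpiozero.tools import random_values

tree = LEDBoard(*range(2, 28), pwm=True)

if len(sys.argv) > 1 and sys.argv[1] == "tree-on":
    for led in tree:
        led.source_delay = 0.1
        led.source = random_values()
    pause()  # keep the process alive while the tree twinkles
else:
    tree.off()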
Other Apache MiniFi Requests
root@vid5:/opt/demo/minifi-0.2.0/logs# curl -v http://HW13125.local:8080/nifi-api/system-diagnostics
* Hostname was NOT found in DNS cache
* Trying 192.168.1.193...
* Connected to HW13125.local (192.168.1.193) port 8080 (#0)
> GET /nifi-api/system-diagnostics HTTP/1.1
> User-Agent: curl/7.38.0
> Host: HW13125.local:8080
> Accept: */*
>
< HTTP/1.1 200 OK
< Date: Mon, 18 Dec 2017 13:42:07 GMT
< X-Frame-Options: SAMEORIGIN
< Cache-Control: private, no-cache, no-store, no-transform
< Content-Type: application/json
< Vary: Accept-Encoding
< Vary: User-Agent
< Content-Length: 1852
* Server Jetty(9.4.3.v20170317) is not blacklisted
< Server: Jetty(9.4.3.v20170317)
<
{"systemDiagnostics":{"aggregateSnapshot":{"totalNonHeap":"390.23 MB","totalNonHeapBytes":409190400,"usedNonHeap":"370.09 MB","usedNonHeapBytes":388065576,"freeNonHeap":"20.15 MB","freeNonHeapBytes":21124824,"maxNonHeap":"-1 bytes","maxNonHeapBytes":-1,"totalHeap":"2 GB","totalHeapBytes":2147483648,"usedHeap":"1.77 GB","usedHeapBytes":1904638968,"freeHeap":"231.59 MB","freeHeapBytes":242844680,"maxHeap":"2 GB","maxHeapBytes":2147483648,"heapUtilization":"89.0%","availableProcessors":8,"processorLoadAverage":2.794921875,"totalThreads":105,"daemonThreads":41,"uptime":"43:00:01.210","flowFileRepositoryStorageUsage":{"freeSpace":"55.76 GB","totalSpace":"931.19 GB","usedSpace":"875.43 GB","freeSpaceBytes":59870846976,"totalSpaceBytes":999860912128,"usedSpaceBytes":939990065152,"utilization":"94.0%"},"contentRepositoryStorageUsage":[{"identifier":"default","freeSpace":"55.76 GB","totalSpace":"931.19 GB","usedSpace":"875.43 GB","freeSpaceBytes":59870846976,"totalSpaceBytes":999860912128,"usedSpaceBytes":939990065152* Connection #0 to host HW13125.local left intact
,"utilization":"94.0%"}],"provenanceRepositoryStorageUsage":[{"identifier":"default","freeSpace":"55.76 GB","totalSpace":"931.19 GB","usedSpace":"875.43 GB","freeSpaceBytes":59870846976,"totalSpaceBytes":999860912128,"usedSpaceBytes":939990065152,"utilization":"94.0%"}],"garbageCollection":[{"name":"G1 Young Generation","collectionCount":742,"collectionTime":"00:00:17.754","collectionMillis":17754},{"name":"G1 Old Generation","collectionCount":0,"collectionTime":"00:00:00.000","collectionMillis":0}],"statsLastRefreshed":"08:42:07 EST","versionInfo":{"niFiVersion":"1.5.0-SNAPSHOT","javaVendor":"Oracle Corporation","javaVersion":"1.8.0_121","osName":"Mac OS X","osVersion":"10.13.2","osArchitecture":"x86_64","buildTag":"HEAD","buildRevision":"a774f1d","buildBranch":"master","buildTimestamp":"12/07/2017 13:37:07 EST"}}}}root@vid5:/opt/d
Resources
https://community.hortonworks.com/articles/118132/minifi-capturing-converting-tensorflow-inception-t.html
https://github.com/tspannhw/rpi-minifi-movidius-sensehat
https://github.com/tspannhw/rpi-sensehat-minifi-python
I will keep my eyes out for Raspberry Pi add-ons for other holidays.
For the second Christmas tree, it's a Sense Hat!
1. Set up Raspbian Stretch
https://www.raspberrypi.org/documentation/configuration/wireless/wireless-cli.md
2. Sense-Hat
sudo apt-get install sense-hat
sudo apt-get install octave -y
pip install --upgrade sense-hat
pip install --upgrade pillow
pip install rtimulib
pip install psutil
sudo apt-get install oracle-java8-jdk
sudo apt install gstreamer-1.0
sudo apt install python3-gst-1.0
sudo apt-get install gir1.2-gstreamer-1.0
sudo apt-get install gir1.2-gst-plugins-base-1.0
For the Sense Hat
Just run this: https://github.com/PixelNoob/sensehat/blob/master/xmas_tree.py
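If you want to do more than the xmas_tree.py demo, here is a minimal Sense Hat sketch of my own, assuming the sense-hat package installed above, that scrolls a message and reads a couple of sensors:
# Scroll a holiday message and print temperature/humidity from the Sense Hat
from sense_hat import SenseHat

sense = SenseHat()
sense.show_message("Happy Holidays!", text_colour=[0, 255, 0])
print("temperature C:", round(sense.get_temperature(), 1))
print("humidity %:", round(sense.get_humidity(), 1))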
12-23-2017
06:48 PM
5 Kudos
2017 in Review
First off, this was an amazing year for Big Data, IoT, Streaming, Machine Learning and Deep Learning: so many cool events, updates, new products, new projects, new libraries and community growth. I've seen a lot of people adopt and grow Big Data and streaming projects from nothing. Using the power of Open Source and the tools made available by Apache, companies are growing with the help of trusted partners and a community of engineers and users.
We had three awesome DataWorks Summits (formerly Hadoop Summit, but now covering a lot more, from IoT to AI and Streaming). I attended Munich and spoke at Sydney. I missed California, but all the videos and slides were online and I loved those. I spoke at Oracle Code in NYC, which was a fun little event. I was surprised to learn that many people had never heard of Apache NiFi or how easily you could use it to build real-time dataflows including Deep Learning and Big Data. I got to talk to a lot of interesting people while working the Hortonworks booth at Strata NYC. Such a huge event; fidget spinners and streaming were the main takeaways there.
We had a lot of awesome meetups in Princeton and in the NYC and Philadelphia areas. The Princeton Future of Data Group grew to over 750 members! A great community of data scientists, engineers, students, analysts, techies and business thought leaders. I am really proud to be a part of this amazing group.
Meetups
I got to speak at most of the meetups except when we had special guests. I had some great NY/NJ/Philly teammates co-running the meetup: @milind pandit @Greg Keys. Greg and I also created a North Jersey meetup.
November 14th - Enterprise Data at Scale. I spoke on IBM DSX, Apache NiFi, Apache Spark, Python, Jupyter and Data Science. Fortunately, we had two excellent IBM resources assisting me.
October 5th - Deep Learning with DeepLearning4J (DL4J). A great talk by my friend from SkyMind. It's nice to see their project get accepted to Eclipse.
August 8th - Deep Dive into HDF 3.0 @ Honeywell
June 20th - Latest Innovation - Schema Registry and More @ TRAC Intermodal
May 16th - Hadoop Tools Overview
March 28th - Apache NiFi: Ingesting Enterprise Data at Scale
Libraries, SDKs, Tools, Frameworks
TensorFlow
Apache MXNet
NLTK
Apache OpenNLP
Apache Tika
Apache NiFi Custom Processors
OpenCV
Apache NiFi 1.4
Apache Zeppelin
IBM DSX
Apache Spark 2.x
Apache Hive LLAP
Apache HBase with Apache Phoenix
Apache ORC
Apache Hadoop
Hortonworks Schema Registry
Hortonworks Streaming Analytics Manager
Druid
Apache SuperSet (now in Apache)
PyTorch
Apache Storm (big updates)
Devices
Raspberry Pi Zero Wireless
Raspberry Pi 3B+
Movidius
Nvidia Jetson TX1
Matrix Creator
Google AIY Voice Kit
Kudrone
Christmas Tree Hat
Sense Hat
Many Cameras and Video Cameras
NanoPi Duo
Tinkerboard
There was a lot of big news this year (https://hortonworks.com/blog/top-hortonworks-blogs-2017/). Apache Hive LLAP became a real production thing and brought Apache Hadoop into the world of EDW, completely Open Source. On the Apache Spark front, we passed version 2.0, and Livy became a production standby and became Apache Livy. The JanusGraph database appeared and is quickly becoming the standard for graphs. Apache Calcite went into so many projects that SQL queries are everywhere, including in Apache NiFi. A huge number of interesting software projects arose, including Hortonworks Data Plane, Hortonworks Schema Registry and Hortonworks Streaming Analytics Manager. This was an awesome year for software.
Presentations From Talks Available
https://www.slideshare.net/bunkertor/enterprise-data-science-at-scale-princeton-nj-14nov2017 https://www.slideshare.net/bunkertor/realtime-ingesting-and-transforming-sensor-data-social-data-w-nifi-tensorflow https://www.slideshare.net/bunkertor/introduction-to-hdf-30 https://www.slideshare.net/bunkertor/introduction-to-hadoop-76031567 https://www.slideshare.net/bunkertor/apache-nifi-ingesting-enterprise-data-at-scale https://www.slideshare.net/bunkertor/ingesting-drone-data-into-big-data-platforms My HCC Articles of 2017
https://community.hortonworks.com/articles/80412/working-with-airbnbs-superset.html https://community.hortonworks.com/articles/116803/building-a-custom-processor-in-apache-nifi-12-for.html https://community.hortonworks.com/articles/79842/ingesting-osquery-into-apache-phoenix-using-apache.html https://community.hortonworks.com/articles/97062/query-hive-using-python.html https://community.hortonworks.com/articles/79008/using-the-hadoop-attack-library-to-check-your-hado.html https://community.hortonworks.com/articles/81222/adding-stanford-corenlp-to-big-data-pipelines-apac.html https://community.hortonworks.com/articles/81270/adding-stanford-corenlp-to-big-data-pipelines-apac-1.html https://community.hortonworks.com/articles/88404/adding-and-using-hplsql-and-hivemall-with-hive-mac.html https://community.hortonworks.com/articles/149891/handling-hl7-records-and-storing-in-apache-hive-fo.html https://community.hortonworks.com/articles/87632/ingesting-sql-server-tables-into-hive-via-apache-n.html https://community.hortonworks.com/articles/73828/submitting-spark-jobs-from-apache-nifi-using-livy.html https://community.hortonworks.com/articles/76240/using-opennlp-for-identifying-names-from-text.html https://community.hortonworks.com/articles/136024/integrating-nvidia-jetson-tx1-running-tensorrt-int.html https://community.hortonworks.com/articles/136026/integrating-nvidia-jetson-tx1-running-tensorrt-int-1.html https://community.hortonworks.com/articles/136028/integrating-nvidia-jetson-tx1-running-tensorrt-int-2.html https://community.hortonworks.com/articles/136039/integrating-nvidia-jetson-tx1-running-tensorrt-int-3.html https://community.hortonworks.com/articles/150026/hl7-processing-part-3-apache-zeppelin-sql-bi-and-a.html https://community.hortonworks.com/articles/104226/simple-backups-of-hadoop-with-apache-nifi-12.html https://community.hortonworks.com/articles/77609/securing-your-clusters-in-the-public-cloud.html https://community.hortonworks.com/articles/92495/monitor-apache-nifi-with-apache-nifi.html https://community.hortonworks.com/articles/77621/creating-an-email-bot-in-apache-nifi.html https://community.hortonworks.com/articles/80418/open-nlp-example-apache-nifi-processor.html https://community.hortonworks.com/articles/76924/data-processing-pipeline-parsing-pdfs-and-identify.html https://community.hortonworks.com/articles/86801/working-with-s3-compatible-data-stores-via-apache.html https://community.hortonworks.com/articles/101904/part-2-iot-augmenting-gps-data-with-weather.html https://community.hortonworks.com/articles/118148/creating-wordclouds-from-dataflows-with-apache-nif.html https://community.hortonworks.com/articles/121916/controlling-big-data-flows-with-gestures-minifi-ni.html https://community.hortonworks.com/articles/76935/using-sentiment-analysis-and-nlp-tools-with-hdp-25.html https://community.hortonworks.com/articles/87397/steganography-with-apache-nifi-1.html https://community.hortonworks.com/articles/83100/deep-learning-iot-workflows-with-raspberry-pi-mqtt.html https://community.hortonworks.com/articles/154957/converting-json-to-sql-ddl.html https://community.hortonworks.com/articles/81694/extracttext-nifi-custom-processor-powered-by-apach.html https://community.hortonworks.com/articles/92345/store-a-flow-to-disk-and-then-reserialize-it-to-co.html https://community.hortonworks.com/articles/92496/qadcdc-our-how-to-ingest-some-database-tables-to-h.html https://community.hortonworks.com/articles/73811/trigger-sonicpi-music-via-apache-nifi.html 
https://community.hortonworks.com/articles/99861/ingesting-ibeacon-data-via-ble-to-mqtt-wifi-gatewa.html https://community.hortonworks.com/articles/101679/iot-ingesting-gps-data-from-raspberry-pi-zero-wire.html https://community.hortonworks.com/articles/104255/ingesting-and-testing-jms-data-with-nifi.html https://community.hortonworks.com/articles/89455/ingesting-gps-data-from-onion-omega2-devices-with.html https://community.hortonworks.com/articles/89547/tracking-phone-location-for-android-and-iot-with-o.html https://community.hortonworks.com/articles/107379/minifi-for-image-capture-and-ingestion-from-raspbe.html https://community.hortonworks.com/articles/108947/minifi-for-ble-bluetooth-low-energy-beacon-data-in.html https://community.hortonworks.com/articles/108966/minifi-for-sensor-data-ingest-from-devices.html https://community.hortonworks.com/articles/110469/simple-backup-and-restore-of-hdfs-data-via-hdf-30.html https://community.hortonworks.com/articles/110475/ingesting-sensor-data-from-raspberry-pis-running-r.html https://community.hortonworks.com/articles/118132/minifi-capturing-converting-tensorflow-inception-t.html https://community.hortonworks.com/articles/122077/ingesting-csv-data-and-pushing-it-as-avro-to-kafka.html https://community.hortonworks.com/articles/130814/sensors-and-image-capture-and-deep-learning-analys.html https://community.hortonworks.com/articles/86570/hosting-and-ingesting-data-from-web-pages-desktop.html https://community.hortonworks.com/articles/142686/real-time-ingesting-and-transforming-sensor-and-so.html https://community.hortonworks.com/articles/77988/ingest-remote-camera-images-from-raspberry-pi-via.html https://community.hortonworks.com/articles/108718/ingesting-rdbms-data-as-new-tables-arrive-automagi.html https://community.hortonworks.com/articles/149982/hl7-ingest-part-4-streaming-analytics-manager-and.html https://community.hortonworks.com/articles/149910/handling-hl7-records-part-1-hl7-ingest.html https://community.hortonworks.com/articles/80339/iot-capturing-photos-and-analyzing-the-image-with.html https://community.hortonworks.com/articles/77403/basic-image-processing-and-linux-utilities-as-part.html https://community.hortonworks.com/articles/103863/using-an-asus-tinkerboard-with-tensorflow-and-pyth.html https://community.hortonworks.com/articles/146704/edge-analytics-with-nvidia-jetson-tx1-running-apac.html https://community.hortonworks.com/articles/148730/integrating-apache-spark-2x-jobs-with-apache-nifi.html https://community.hortonworks.com/articles/154760/generating-avro-schemas-and-ensuring-field-names-m.html https://community.hortonworks.com/articles/155326/monitoring-energy-usage-utilizing-apache-nifi-pyth.html My Articles on DZone
https://dzone.com/articles/generating-avro-schemas-and-ensuring-field-names-m https://dzone.com/articles/favorite-tech-of-the-year-early-edition https://dzone.com/articles/integrating-apache-spark-2x-jobs-with-apache-nifi https://dzone.com/articles/using-jolt-in-big-data-streams-to-remove-nulls https://dzone.com/articles/processing-hl7-records https://dzone.com/articles/big-data-is-growing https://dzone.com/articles/ingesting-rdbms-data-as-new-tables-arrive-automagi https://dzone.com/articles/using-websockets-with-apache-nifi https://dzone.com/articles/using-the-new-flick-hat-for-raspberry-pi https://dzone.com/articles/real-time-ingest-and-ai https://dzone.com/articles/tensorflow-for-real-world-applications https://dzone.com/articles/integrating-nvidia-jetson-tx1-running-tensorrt-int https://dzone.com/articles/real-time-tensorflow-camera-analysis-with-sensors https://dzone.com/articles/tensorflow-and-nifi-big-data-ai-sandwich https://dzone.com/articles/minifi-capturing-converting-tensorflow-inception-t https://dzone.com/articles/creating-wordclouds-from-dataflows-with-apache-nif https://dzone.com/articles/building-a-custom-processor-in-apache-nifi-12-for https://dzone.com/articles/data-engineer-as-dj https://dzone.com/articles/how-to-automatically-migrate-all-tables-from-a-dat https://dzone.com/articles/dataworks-summit-2017-sj-updates https://dzone.com/articles/hdf-30-for-utilities https://dzone.com/articles/hdp-26-what-why-how-and-now https://dzone.com/articles/using-apache-minifi-on-edge-devices-part-1 https://dzone.com/articles/creating-an-email-bot-in-apache-nifi https://dzone.com/articles/this-week-in-hadoop-and-more-deep-deep-learning-an https://dzone.com/articles/using-python-for-big-data-workloads-part-2 https://dzone.com/articles/using-tinkerboard-with-tensorflow-and-python https://dzone.com/articles/using-python-for-big-data-workloads-part-1 https://dzone.com/articles/part-2-iot-augmenting-gps-data-with-weather https://dzone.com/articles/this-week-in-hadoop-and-more-apache-calcite-kylin https://dzone.com/articles/iot-ingesting-gps-data-from-raspberry-pi-zero-wire https://dzone.com/articles/a-new-era-of-open-source-streaming https://dzone.com/articles/day-1-dataworks-summit-munich-report https://dzone.com/articles/this-week-in-hadoop-and-more-dl-conferences-course https://dzone.com/articles/advanced-apache-nifi-flow-techniques https://dzone.com/articles/a-big-data-reference-architecture-for-iot https://dzone.com/articles/ingesting-gps-data-from-onion-omega2-devices-with-apache-nifi https://dzone.com/articles/sentiment-shoot-out https://dzone.com/articles/best-of-dataworks-summit-2017-munich https://dzone.com/articles/deep-learning-on-big-data-platforms https://dzone.com/articles/tensorflow-on-the-edge-part-2-of-5 https://dzone.com/articles/this-week-in-hadoop-and-more-nifi-drones-dataworks https://dzone.com/articles/oracle-code-new-york-report https://dzone.com/articles/deep-learning-for-data-engineers-part-1 https://dzone.com/articles/this-week-in-hadoop-and-more-keras-deep-learning-a https://dzone.com/articles/happy-pi-day-2017 https://dzone.com/articles/deep-learning-and-machine-learning-guide-part-iii https://dzone.com/articles/this-week-in-hadoop-and-more-deep-and-machine-lear https://dzone.com/articles/backup-restore-dr https://dzone.com/articles/big-data-performance-part-1 https://dzone.com/articles/nifi-spark-hbase-kafka-machine-learning-and-deep-l https://dzone.com/articles/hadoop-101-hbase-client-access 
https://dzone.com/articles/deep-learning-and-machine-learning-guide-part-ii https://dzone.com/articles/this-week-in-hadoop-and-more-cloud-visualization-d https://dzone.com/articles/big-data-ml-dl-command-line-tools https://dzone.com/articles/machine-learning-resources https://dzone.com/articles/tensorflow-on-the-edge https://dzone.com/articles/deep-learning-and-machine-learning-killer-tools-li https://dzone.com/articles/cool-projects-big-data-machine-learning-apache-nifi https://dzone.com/articles/protect-your-cloud-big-data-assets https://dzone.com/articles/edge-testing-your-hadoop-environment https://dzone.com/articles/this-week-in-hadoop-and-more-6 https://dzone.com/articles/picamera-ingest-real-time https://dzone.com/articles/this-week-in-hadoop-and-more-nlp-and-dl https://dzone.com/articles/quick-tips-apache-phoenixhbase https://dzone.com/articles/the-physics-of-big-data My RefCard
https://dzone.com/refcardz/introduction-to-tensorflow
My Guide
https://dzone.com/guides/artificial-intelligence-machine-learning-and-predi
My Github Source Code
I have some example Apache NiFi custom processors developed in JDK 8, including ones for TensorFlow, OpenNLP, DL4J, Apache Tika, Stanford CoreNLP and more. I also published all the Python scripts, documentation, shell scripts, SQL, Apache NiFi templates and Apache Zeppelin notebooks as Apache-licensed open source on GitHub.
https://github.com/tspannhw/nifi-tensorflow-processor https://github.com/tspannhw/nifi-nlp-processor https://github.com/tspannhw/nifi-attributecleaner-processor https://github.com/tspannhw/apachelivy-nifi-spark2-integration https://github.com/tspannhw/nvidiajetsontx1-mxnet https://github.com/tspannhw/nifi-dl4j-processor
https://github.com/tspannhw/dws2017sydney https://github.com/tspannhw/rpi-flickhat-minifi https://github.com/tspannhw/rpi-rainbowhat https://github.com/tspannhw/rpi-sensehat-minifi-python https://github.com/tspannhw/rpizw-nifi-mqtt-gps https://github.com/tspannhw/EnterpriseNIFI https://github.com/tspannhw/IngestingDroneData https://github.com/tspannhw/spy https://github.com/tspannhw/webdataingest https://github.com/tspannhw/mxnet_rpi https://github.com/tspannhw/nifi-extracttext-processor https://github.com/tspannhw/nifi-corenlp-processor https://github.com/tspannhw/nlp-utilities https://github.com/tspannhw/rpi-sensehat-mqtt-nifi https://github.com/tspannhw/rpi-picamera-mqtt-nifi https://github.com/tspannhw/iot-scripts https://github.com/tspannhw/phoenix https://github.com/tspannhw/hive
Next year will be amazing: more libraries, more use cases for Deep Learning, enhancements to all the great projects and tools out there. Another Google AIY Kit, more DataWorks Summits, Hadoop 3, HDF 4, HDP 3, so many things to look forward to. See you at meetups, summits and online next year.
12-22-2017
08:49 PM
5 Kudos
In the Holidays, it's nice to know how much energy you are using. So, as one small step, I bought a low-end, inexpensive TPLink energy monitoring plug for one device. I have been monitoring phone charging and my Apple monitor. Let's read the data and do some queries in Apache Hive and Apache Spark 2 SQL.
Processing Live Energy Feeds in the Cloud
Monitor Energy From a Local OSX Machine
If your local instance does not have access to Apache Hive, you will need to send the data via Site-to-Site to a remote Apache NiFi / HDF server or cluster that can.
For Apache Hive Usage, Please Convert to Apache ORC Files
To Create Your New Table, Grab the hive.ddl
Inside of Apache Zeppelin, we can create our table based on the above DDL. We could have also let Apache NiFi create the table for us; I like to keep my DDL with my notebook, just a personal choice. We can then query our table in Apache Zeppelin utilizing Apache Spark 2 SQL and Apache Hive QL.
Overview
Step 1: Purchase an inexpensive energy monitoring plug
Step 2: Connect it to a phone app via WiFi
Step 3: Once configured, you can now access it via Python
Step 4: Install the HS100 Python library in Python 3.x
Step 5: Fork my GitHub repository and use my shell script and Python script
Step 6: Add the local Apache NiFi flow which will call that script
Step 7: Add a remote Apache NiFi flow for processing into Apache Hadoop
Step 8: Create your table
Step 9: Query with Apache Hive and Apache Spark SQL via Apache Zeppelin or another UI
Step 10: Turn that extra stuff off and save money!
The Open Source Code and Artefacts
Shell Script (smartreader.sh)
python3 meterreader.py
Python Code (meterreader.py)
from pyHS100 import SmartPlug, SmartBulb
#from pprint import pformat as pf
import json
import datetime
plug = SmartPlug("192.168.1.200")
row = { }
emeterdaily = plug.get_emeter_daily(year=2017, month=12)
for k, v in emeterdaily.items():
row["hour%s" % k] = v
hwinfo = plug.hw_info
for k, v in hwinfo.items():
row["%s" % k] = v
sysinfo = plug.get_sysinfo()
for k, v in sysinfo.items():
row["%s" % k] = v
timezone = plug.timezone
for k, v in timezone.items():
row["%s" % k] = v
emetermonthly = plug.get_emeter_monthly(year=2017)
for k, v in emetermonthly.items():
row["day%s" % k] = v
realtime = plug.get_emeter_realtime()
for k, v in realtime.items():
row["%s" % k] = v
row['alias'] = plug.alias
row['time'] = plug.time.strftime('%m/%d/%Y %H:%M:%S')
row['ledon'] = plug.led
row['systemtime'] = datetime.datetime.now().strftime('%m/%d/%Y %H:%M:%S')
json_string = json.dumps(row)
print(json_string)
The code is basically a small tweak on the example code provided with pyHS100. This code allows you to access the HS110 that I have. My PC and my smart meter are on the same WiFi, which can't be 5G.
Example Data
{"hour19": 0.036, "hour20": 0.021, "hour21": 0.017, "sw_ver": "1.1.1 Build 160725 Rel.164033", "hw_ver": "1.0", "mac": "50:C7:BF:B1:95:D5", "type": "IOT.SMARTPLUGSWITCH", "hwId": "60FF6B258734EA6880E186F8C96DDC61", "fwId": "060BFEA28A8CD1E67146EB5B2B599CC8", "oemId": "FFF22CFF774A0B89F7624BFC6F50D5DE", "dev_name": "Wi-Fi Smart Plug With Energy Monitoring", "model": "HS110(US)", "deviceId": "8006ECB1D454C4428953CB2B34D9292D18A6DB0E", "alias": "Tim Spann's MiniFi Controller SmartPlug - Desk1", "icon_hash": "", "relay_state": 1, "on_time": 161599, "active_mode": "schedule", "feature": "TIM:ENE", "updating": 0, "rssi": -32, "led_off": 0, "latitude": 40.268216, "longitude": -74.529088, "index": 18, "zone_str": "(UTC-05:00) Eastern Daylight Time (US & Canada)", "tz_str": "EST5EDT,M3.2.0,M11.1.0", "dst_offset": 60, "day12": 0.074, "current": 0.04011, "voltage": 122.460974, "power": 1.8772, "total": 0.074, "time": "12/21/2017 13:21:52", "ledon": true, "systemtime": "12/21/2017 13:21:53"}
As you can see, we only get the hours and days where we had usage. Since this is new, I don't have them all. I created my schema to handle all the days of a month and all the hours of a day, so we are going to have a sparse table. If I were monitoring millions of devices, I would put this in Apache HBase; I may do that later.
Let's create an HDFS directory for loading Apache ORC files:
hdfs dfs -mkdir -p /smartPlug
hdfs dfs -chmod -R 777 /smartPlug
Table DDL
CREATE EXTERNAL TABLE IF NOT EXISTS smartPlug (hour19 DOUBLE, hour20 DOUBLE, hour21 DOUBLE, hour22 DOUBLE, hour23 DOUBLE, hour18 DOUBLE, hour17 DOUBLE, hour16 DOUBLE, hour15 DOUBLE, hour14 DOUBLE, hour13 DOUBLE, hour12 DOUBLE, hour11 DOUBLE, hour10 DOUBLE, hour9 DOUBLE, hour8 DOUBLE, hour7 DOUBLE, hour6 DOUBLE, hour5 DOUBLE, hour4 DOUBLE, hour3 DOUBLE, hour2 DOUBLE, hour1 DOUBLE, hour0 DOUBLE, sw_ver STRING, hw_ver STRING, mac STRING, type STRING, hwId STRING, fwId STRING, oemId STRING, dev_name STRING, model STRING, deviceId STRING, alias STRING, icon_hash STRING, relay_state INT, on_time INT, feature STRING, updating INT, rssi INT, led_off INT, latitude DOUBLE, longitude DOUBLE, index INT, zone_str STRING, tz_str STRING, dst_offset INT, day31 DOUBLE, day30 DOUBLE, day29 DOUBLE, day28 DOUBLE, day27 DOUBLE, day26 DOUBLE, day25 DOUBLE, day24 DOUBLE, day23 DOUBLE, day22 DOUBLE, day21 DOUBLE, day20 DOUBLE, day19 DOUBLE, day18 DOUBLE, day17 DOUBLE, day16 DOUBLE, day15 DOUBLE, day14 DOUBLE, day13 DOUBLE, day12 DOUBLE, day11 DOUBLE, day10 DOUBLE, day9 DOUBLE, day8 DOUBLE, day7 DOUBLE, day6 DOUBLE, day5 DOUBLE, day4 DOUBLE, day3 DOUBLE, day2 DOUBLE, day1 DOUBLE, current DOUBLE, voltage DOUBLE, power DOUBLE, total DOUBLE, time STRING, ledon BOOLEAN, systemtime STRING) STORED AS ORC
LOCATION '/smartPlug'
A Simple Query on Some of the Variables
select `current`, voltage, power, total, time, systemtime, on_time, rssi, latitude, longitude
from smartPlug
Note that current is a reserved word in SQL, so we backtick it.
An Apache Calcite Query Inside Apache NiFi
SELECT * FROM FLOWFILE WHERE "current" > 0
With the Python API I can turn the plug off, so don't monitor it then. In an updated article I will add a few smart plugs and turn them on and off based on things occurring, perhaps turning off a light when no motion is detected. We can do anything with Apache NiFi, Apache MiniFi and Python. The API also allows for turning the green LED light on the plug on and off. The screen prints above are from the iOS version of the TPLink KASA app, which lets you configure and monitor your plug. For many people that's good enough, but not for me.
Resources
smartplugprocessing.xml
monitorpowerlocal.xml
https://github.com/GadgetReactor/pyHS100
https://pypi.python.org/pypi/pyHS100
pip3 install pyhs100
https://github.com/tspannhw/nifi-smartplug/tree/master
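Beyond Zeppelin, the same kind of query can run from a standalone Spark 2 job; here is a minimal PySpark sketch of my own, assuming Hive support is enabled as in the notebook:
# Query the smartPlug ORC table with Spark 2 SQL
from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .appName("smartplug-energy") \
    .enableHiveSupport() \
    .getOrCreate()

df = spark.sql("select `current`, voltage, power, total, time, systemtime "
               "from smartPlug order by systemtime desc")
df.show(10, truncate=False)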
12-21-2017
09:48 PM
That did not work; it still gives an error. Using ExecuteSQL worked.
12-15-2017
02:12 PM
Please post more logs. Inside the DB controller service you have a long JDBC string like jdbc://db2://myserver:800?allowNextONExhaustedResultSet=1. Post your full log and your NiFi XML file.