01-02-2018
05:19 PM
1 Kudo
Initial Login to ODroid XU4, GPS Command Line, Check WiFi Signal Strength, Install Ubuntu Extras (screenshots). Above is our device with GPS.
The ODroid XU4 is a powerful, inexpensive little IoT device. It is similar to a Raspberry Pi and compatible with many of the same libraries. I chose to install Armbian Ubuntu, which is a good OS, and it is very easy to add GPS libraries, Apache MiniFi, JDK 8, and Python for processing.
Apache NiFi processing flow, Apache MiniFi flow on device, Apache NiFi processing (flow screenshots).
See: https://community.hortonworks.com/content/kbentry/155604/iot-ingesting-camera-data-from-nanopi-duo-devices.html
Source Code
https://github.com/tspannhw/odroidxu4-gps-python-minifi
Software
sudo apt-get update
sudo apt-get upgrade
sudo apt-get dist-upgrade
armbian-config
sudo add-apt-repository ppa:webupd8team/java
sudo apt-get update
sudo apt-get install oracle-java8-installer -y
sudo apt-get install oracle-java8-set-default -y
blkid
swapon /dev/sda1
sudo apt-get install gpsd gpsd-clients
sudo dpkg-reconfigure gpsd
sudo gpsmon /dev/ttyACM0
sudo apt-get install python-gps
cgps -s
Resources
http://www.hardkernel.com/main/products/prdt_info.php?g_code=G148048570542
https://wiki.odroid.com/odroid-xu4/gettingstart/linux/install_linux
https://magazine.odroid.com
https://magazine.odroid.com/wp-content/uploads/odroid-xu4-user-manual.pdf#page=6
https://magazine.odroid.com/article/running-yolo-odroid-yolodroid/
http://machinethink.net/blog/object-detection-with-yolo/
https://www.armbian.com/odroid-xu4/
https://docs.armbian.com/
https://docs.armbian.com/User-Guide_Getting-Started/
https://forum.armbian.com/topic/1768-booting-into-command-line-on-opi-pc/
https://docs.armbian.com/User-Guide_Armbian-Config/
https://magazine.odroid.com/wp-content/uploads/ODROID-Magazine-201712.pdf
https://wiki.odroid.com/odroid-xu4/odroid-xu4
https://www.packtpub.com/books/content/working-webcam-and-pi-camera
Hardware
Armbian Ubuntu 16.04.3 LTS 4.9.61-odroidxu4
Two quad-core CPUs and a GPU
2GB LPDDR3
USB 3
Power Supply
Powered USB Hub
Active Cooling Fan
Stratus GPYes USB GPS Device
Example Run of GPS Device
root@odroidxu4:/opt/demo# ./gps2.sh
[{"utc": "2018-01-02T00:19:25.000Z", "epx": "25.829", "epv": "62.56", "diskfree": "54166.8 MB", "altitude": "41.844", "memory": 7.6, "eps": "1.43", "longitude": "-74.529222193", "ts": "2018-01-02 00:19:23", "ept": "0.005", "host": "odroidxu4", "track": "141.59", "mode": "3", "time": "2018-01-02T00:19:25.000Z", "latitude": "40.268098177", "climb": "-0.042", "ipaddress": "192.168.1.202", "cputemp": 53.0, "speed": "0.189"}]
Software and Device Setup
We need to set up a swap area to deal with not having enough RAM for builds.
root@odroidxu4:~# sudo blkid
/dev/mmcblk1p1: UUID="7a81b72b-eee2-428f-9706-2aa0bd7e766f" TYPE="ext4" PARTUUID="7cbb190c-01"
/dev/zram7: UUID="794ef788-2696-4f56-97da-8c9e179be92d" TYPE="swap"
/dev/zram0: UUID="99613ecb-d41b-4dbe-a85c-2134800fcd8e" TYPE="swap"
/dev/zram1: UUID="57aaeb7c-b970-456d-a415-3e9151dc5fbf" TYPE="swap"
/dev/zram2: UUID="61a42b3a-e605-49e4-844b-467212228878" TYPE="swap"
/dev/zram3: UUID="7f7539df-3500-489b-990b-929d09896a42" TYPE="swap"
/dev/zram4: UUID="49d3a36a-5150-4c31-8acc-c77c512f8a31" TYPE="swap"
/dev/zram5: UUID="9ae98fa0-507d-415a-95d3-e24e9a4a8b2f" TYPE="swap"
/dev/zram6: UUID="4644d45f-dd3e-4bab-9507-1e28fe4b9d40" TYPE="swap"
/dev/mmcblk1: PTUUID="7cbb190c" PTTYPE="dos"
/dev/sda1: UUID="3C09-B728" TYPE="vfat"
root@odroidxu4:~# sudo mkswap /dev/sda1
mkswap: /dev/sda1: warning: wiping old vfat signature.
Setting up swapspace version 1, size = 114.6 GiB (123010527232 bytes)
no label, UUID=24360a18-0939-47b1-b383-27982d8a2007
root@odroidxu4:~# sudo swapon /dev/sda1
root@odroidxu4:~# sudo swapon
NAME TYPE SIZE USED PRIO
/dev/zram0 partition 124.6M 0B 5
/dev/zram1 partition 124.6M 0B 5
/dev/zram2 partition 124.6M 0B 5
/dev/zram3 partition 124.6M 0B 5
/dev/zram4 partition 124.6M 0B 5
/dev/zram5 partition 124.6M 0B 5
/dev/zram6 partition 124.6M 0B 5
/dev/zram7 partition 124.6M 0B 5
/dev/sda1 partition 114.6G 0B -1
sudo apt-get install gpsd gpsd-clients
sudo gpsmon /dev/ttyACM0
sudo cgps -s
sudo apt-get install python-gps
sudo apt-get install gpsd gpsd-clients
sudo dpkg-reconfigure gpsd
sudo gpsmon /dev/ttyACM0
sudo pip install psutil
sudo pip install --upgrade pip
sudo pip install psutil
___ _ _ _ __ ___ _ _ _
/ _ \ __| |_ __ ___ (_) __| | \ \/ / | | | || |
| | | |/ _` | '__/ _ \| |/ _` | \ /| | | | || |_
| |_| | (_| | | | (_) | | (_| | / \| |_| |__ _|
\___/ \__,_|_| \___/|_|\__,_| /_/\_\___/ |_|
Welcome to ARMBIAN 5.36 user-built Ubuntu 16.04.3 LTS 4.9.61-odroidxu4
System load: 0.00 0.01 0.00 Up time: 10 min
Memory usage: 3 % of 1993MB IP: 192.168.1.202
CPU temp: 58°C
Usage of /: 6% of 57G
[ General system configuration (beta): armbian-config ]
Last login: Mon Jan 1 23:19:21 2018 from 192.168.1.193
pip install --upgrade pip setuptools wheel
pip install --upgrade psutil
01-02-2018
04:58 PM
2 Kudos
This is pretty close to a Raspberry Pi Zero. This inexpensive box is another useful IoT device for ingesting data very cheaply.
Machine Browsing
root@NanoPi-Duo:/home/pi# cpu_freq
INFO: HARDWARE=sun8i
CPU0 online=1 temp=48092 governor=ondemand cur_freq=312000
CPU1 online=1 temp=48092 governor=ondemand cur_freq=312000
CPU2 online=1 temp=48092 governor=ondemand cur_freq=312000
CPU3 online=1 temp=48092 governor=ondemand cur_freq=312000
Source Code for Python and Shell Script
https://github.com/tspannhw/nifi-nanopi-duo
Hardware
The FA-CAM202 is a 200M (2-megapixel) USB camera.
512MB RAM
Ubuntu 16.04.3 LTS 4.11.2
Similar ARM SBC to a Raspberry Pi
Software Setup
sudo apt-get install fswebcam -y
sudo apt-get install libv4l-dev -y
sudo apt-get install python-opencv -y
sudo npi-config
sudo apt-get update
sudo apt-get install libcv-dev libopencv-dev -y
pip install psutil
pip2 install psutil
curl http://192.168.1.193:8080/nifi-api/site-to-site/ -v
inferred.avro.schema
{ "type" : "record", "name" : "NANO", "fields" : [ { "name" : "diskfree", "type" : "string", "doc" : "Type inferred from '\"23329.1 MB\"'" }, { "name" : "cputemp", "type" : "double", "doc" : "Type inferred from '55.0'" }, { "name" : "host", "type" : "string", "doc" : "Type inferred from '\"NanoPi-Duo\"'" }, { "name" : "endtime", "type" : "string", "doc" : "Type inferred from '\"2018-01-04 20:23:56\"'" }, { "name" : "ipaddress", "type" : "string", "doc" : "Type inferred from '\"192.168.1.191\"'" }, { "name" : "h", "type" : "int", "doc" : "Type inferred from '342'" }, { "name" : "ts", "type" : "string", "doc" : "Type inferred from '\"2018-01-04 20:23:43\"'" }, { "name" : "filename", "type" : "string", "doc" : "Type inferred from '\"/opt/demo/images/2018-01-04_2023.jpg.faces.jpg\"'" }, { "name" : "w", "type" : "int", "doc" : "Type inferred from '342'" }, { "name" : "memory", "type" : "double", "doc" : "Type inferred from '60.1'" }, { "name" : "y", "type" : "int", "doc" : "Type inferred from '264'" }, { "name" : "x", "type" : "int", "doc" : "Type inferred from '877'" } ] }
hive.ddl
CREATE EXTERNAL TABLE IF NOT EXISTS nanopi (diskfree STRING, cputemp DOUBLE, host STRING, endtime STRING, ipaddress STRING, h INT, ts STRING, filename STRING, w INT, memory DOUBLE, y INT, x INT) STORED AS ORC
LOCATION '/nano'
Apache NiFi
The flow in Apache NiFi is pretty simple. We receive the flowfiles from the remote Apache MiniFi box.
1. RouteOnAttribute: Send images to the file system, continue processing the JSON
2. AttributeCleanerProcessor: Clean up the attributes (not really needed for this dataset)
3. UpdateAttribute: Set the schema name to reference the registry
4. SplitJSON: Split the JSON array into one record per flowfile
5. ConvertRecord: Convert the JSON tree to Avro with an embedded schema
6. ConvertAvroToORC: Build an Apache ORC file (we could add a MergeContent step before this)
7. PutHDFS: Store in the Hadoop file system forever
Apache MiniFi
We have a simple flow in Apache MiniFi (a face-detection sketch follows the resource list below).
1. ExecuteProcess: Run a shell script that grabs a timestamp-named image from the USB webcam, then calls a Python script that does OpenCV face detection and adds some local variables, plus the facial bounding boxes, to a JSON array.
2. GetFile: Retrieve all the images from the box and send them to Apache NiFi.
Example Data
[{"diskfree": "23445.5 MB", "cputemp": 56.7, "host": "NanoPi-Duo", "endtime": "2018-01-04 17:40:30", "ipaddress": "192.168.1.191", "h": 55, "ts": "2018-01-04 17:40:20", "filename": "/opt/demo/images/2018-01-04_1740.jpg.faces.jpg", "w": 55, "memory": 22.6, "y": 471, "x": 270}, {"diskfree": "23445.5 MB", "cputemp": 56.7, "host": "NanoPi-Duo", "endtime": "2018-01-04 17:40:30", "ipaddress": "192.168.1.191", "h": 67, "ts": "2018-01-04 17:40:20", "filename": "/opt/demo/images/2018-01-04_1740.jpg.faces.jpg", "w": 67, "memory": 22.6, "y": 625, "x": 464}]
Resources
http://wiki.friendlyarm.com/wiki/index.php/NanoPi_Duo
https://diyprojects.io/orange-pi-armbian-control-camera-python-opencv/#.Wku2kFQ-f1J
https://www.armbian.com/nanopi-duo/
https://github.com/opencv/opencv.git
https://github.com/informramiz/Face-Detection-OpenCV
https://realpython.com/blog/python/face-detection-in-python-using-a-webcam/
https://realpython.com/blog/python/face-recognition-with-python/
Face (detection example image)
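As a rough illustration of the face-detection step described above, here is a minimal Python sketch using OpenCV. It is an approximation of the approach, not the exact script from the nifi-nanopi-duo repo, and the Haar cascade path is an assumed location.
#!/usr/bin/python
# Hedged sketch: detect faces in the image fswebcam just captured and print
# a JSON array with one record per face (x, y, w, h plus device stats).
import json
import socket
import sys
import time
import cv2
import psutil

image_path = sys.argv[1]   # e.g. /opt/demo/images/2018-01-04_2023.jpg
cascade = cv2.CascadeClassifier('haarcascade_frontalface_default.xml')  # assumed cascade file location
img = cv2.imread(image_path)
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

common = {'host': socket.gethostname(),
          'ts': time.strftime('%Y-%m-%d %H:%M:%S'),
          'memory': psutil.virtual_memory().percent,
          'filename': image_path + '.faces.jpg'}

rows = []
for (x, y, w, h) in faces:
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)  # draw the face box
    row = dict(common)
    row.update({'x': int(x), 'y': int(y), 'w': int(w), 'h': int(h)})
    rows.append(row)

cv2.imwrite(common['filename'], img)   # save the annotated copy for GetFile to pick up
print(json.dumps(rows))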
01-02-2018
04:22 PM
Switch to HTTP and make sure everything has the right permissions. Check firewalls. I have sometimes had wrong input names in the config.yml, so check those as well.
01-01-2018
11:43 PM
Is anything else showing in Ambari? Is Spark2 running? Any other issues?
12-30-2017
12:05 AM
3 Kudos
Oracle -> GoldenGate -> Apache Kafka -> Apache NiFi / Hortonworks Schema Registry -> JDBC Database
Sometimes you need to process any number of table changes sent from tools via Apache Kafka. As long as they have proper header data and records in JSON, it's really easy in Apache NiFi.
Requirements:
Process Each Partition Separately
Process Records in Order as each message is an Insert, Update or Delete to an existing table in our receiving JDBC store.
Re-process if data lost
The main routing processor must run only on the primary node.
Enforcing Order
We use kafka.offset to order the records, which makes sense for Apache Kafka topics.
After the insert, update, and delete queries are built, let's confirm and enforce that strict ordering.
To further enforce in-order processing, we set each connection in the flow to use the FirstInFirstOutPrioritizer.
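Outside of NiFi, the same ordering idea can be sketched with the kafka-python client. This is an illustration of the concept only, not part of the actual flow, and the broker address and consumer group name are assumptions: each partition is consumed separately and its messages arrive in offset order, so applying them one at a time preserves the insert/update/delete sequence.
# Conceptual sketch with kafka-python (not part of the NiFi flow): consume one
# partition of the goldengate topic and apply each change record in offset order.
import json
from kafka import KafkaConsumer, TopicPartition

consumer = KafkaConsumer(bootstrap_servers='localhost:9092',      # assumed broker address
                         group_id='goldengate-sync',              # assumed group name
                         enable_auto_commit=False,
                         value_deserializer=lambda b: json.loads(b.decode('utf-8')))
consumer.assign([TopicPartition('goldengate', 0)])                # partition 0 only; run another consumer for partition 1

for message in consumer:   # records arrive in strictly increasing offset order per partition
    record = message.value
    print(message.partition, message.offset, record['op_type'], record['table'])
    # apply the insert/update/delete to the JDBC store here, then commit the offset
    consumer.commit()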
To Route, We Route Each Partition to A Different Processor Group (One Local, The Other Remote)
Let's Store Some Data in HDFS for each Table
Connect To Kafka and Grab From our Topic
Let's Connect to our JDBC Store
Let's do an Update (Table Name is Dynamic)
The Jolt Processor has an awesome tester for trying out Jolt
Make sure we connect our remote partitions
Routing From Routing Server (Primary Node)
For Processing Partition 0 (Run on the Routing Server)
We infer the schema with InferAvroSchema, so we don't need to know the embedded table layouts before a record arrives. In production it makes sense to know all of these in advance and to do integration tests and schema versioning. This is where the Hortonworks Schema Registry is awesome. We name the Avro record after the table dynamically, and we can get and store permanent schemas in the Hortonworks Schema Registry.
Process The Next Partition 1 .. (We can have one server or cluster per partition)
Process the Partition 1 Kafka Records from the Topic
This Flow Will Convert Our Embedded JSON Table Record into New SQL
Input: {"ID":2001,"GD":"F","DPTID":2,"FIRSTNAME":"Tim","LAST":"Spann"}
Output: INSERT INTO THETABLE (ID, GD, DPTID, FIRSTNAME, LAST) VALUES (?, ?, ?, ?, ?)
sql.args.5.value Spann
sql.table THETABLE
With all of the fields as parameters, we get a SQL-injection-safe, parameterized insert, update, or delete based on the control fields sent.
GoldenGate Messages
{"table": "SCHEMA1.TABLE7","op_type": "I","op_ts": "2017-11-01 04:31:56.000000","current_ts": "2017-11-01T04:32:04.754000","pos": "00000000310000020884","after": {"ID":1,"CODE": "B","NAME":"STUFF","DESCR" :"Department","ACTIVE":1}}
Using a simple EvaluateJsonPath we pull out these control fields, example: $.before.
The Table Name for ConvertJSONtoSQL is ${table:substringAfter('.')}, which removes the leading schema / tablespace name. From the drop-down in each of the three processors we pick UPDATE, INSERT, or DELETE based on the op_type.
We follow this with a PutSQL, which executes against our destination JDBC database sink.
After that I collect all the attributes, convert them to a JSON flowfile, and save that to HDFS for logging and reporting. This step could be skipped, use another format, or send the data elsewhere.
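To make the record handling concrete, here is a small Python sketch of the same logic the EvaluateJsonPath, Jolt, and ConvertJSONtoSQL steps perform: select the before or after image based on op_type, strip the schema prefix from the table name, and build a parameterized statement. It is only an illustration of the idea (the WHERE clause is simplified to the ID column), not the processors' actual implementation.
# Illustrative Python equivalent of the EvaluateJsonPath + Jolt + ConvertJSONtoSQL steps,
# not the actual NiFi implementation.
import json

message = json.loads('{"table": "SCHEMA1.TABLE7", "op_type": "I", '
                     '"after": {"ID": 1, "CODE": "B", "NAME": "STUFF", "DESCR": "Department", "ACTIVE": 1}}')

table = message['table'].split('.', 1)[-1]        # same as ${table:substringAfter('.')} -> TABLE7
op_type = message['op_type']                      # I, U or D
record = message['before'] if op_type == 'D' else message['after']   # what the Jolt filter selects

columns = list(record.keys())
if op_type == 'I':
    sql = 'INSERT INTO %s (%s) VALUES (%s)' % (table, ', '.join(columns), ', '.join('?' * len(columns)))
elif op_type == 'U':
    sql = 'UPDATE %s SET %s WHERE ID = ?' % (table, ', '.join(c + ' = ?' for c in columns))
else:
    sql = 'DELETE FROM %s WHERE ID = ?' % table   # simplified: the real flow uses the table's key columns

print(sql)   # the values travel separately as sql.args.N.value attributes for PutSQL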
Control Fields
pos: position
table: table to update in the data warehouse
current_ts: time stamp
op_ts: time stamp
op_type: operation type (I = insert, U = update, D = delete)
Important Apache NiFi System Fields
kafka.offset
kafka.partition
kafka.topic
We can route and process these for special handling.
To Create HDFS Directories for Changes
su hdfs
hdfs dfs -mkdir -p /new/T1
hdfs dfs -mkdir -p /new/T2
hdfs dfs -mkdir -p /poc/T3
hdfs dfs -chmod -R 777 /new
hdfs dfs -ls -R /new
To Create a Test Apache Kafka Topic
./bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 2 --topic goldengate
Creating a MySQL Database as the Recipient JDBC Server
wget https://dev.mysql.com/get/Downloads/Connector-J/mysql-connector-java-5.1.45.tar.gz
mysql
create database mydw;
CREATE USER 'nifi'@'%' IDENTIFIED BY 'MyPassWordIsSoAwesome!!!!';
GRANT ALL PRIVILEGES ON *.* TO 'nifi'@'%' WITH GRANT OPTION;
commit;
SHOW GRANTS FOR 'nifi'@'%';
#Create some tables in the database for your records.
create table ALOG (
AID VARCHAR(1),
TIMESEC INT,
SOMEVAL VARCHAR(255),
PRIMARY KEY (AID, TIMESEC)
);
Jolt Filter
Attribute: afterJolt
${op_type:equalsIgnoreCase("D"):ifElse("none", "after")}
Attribute: beforeJolt
${op_type:equalsIgnoreCase("D"):ifElse("before", "none")}
Jolt Script to Transform JSON
[ {
"operation": "shift",
"spec": {
"${beforeJolt}": {
"*": "&"
},
"${afterJolt}": {
"*": "&"
}
}
}, {
"operation": "shift",
"spec": {
"*": "&"
}
} ]
Primary Node Flow Template
primarynode.xml
Partition X Node Flow Template
remotenode.xml
References:
https://www.xenonstack.com/blog/data-ingestion-using-apache-nifi-for-building-data-lakes-twitter-data
https://community.hortonworks.com/questions/107536/nifi-uneven-distribution-consumekafka.html
https://community.hortonworks.com/articles/57262/integrating-apache-nifi-and-apache-kafka.html
https://community.hortonworks.com/articles/80284/hdf-2x-adding-a-new-nifi-node-to-an-existing-secur.html
https://dev.mysql.com/downloads/connector/j/
12-29-2017
07:23 PM
2 Kudos
Processing voice data using a Raspberry Pi 3 with the Google AIY Voice Kit is an easy way to control data flows in an organization. The AIY Voice Kit is a good way to prototype voice ingest. It is not perfect, but for low-end, inexpensive hardware it is a good solution for those who speak clearly and are willing to repeat a phrase a few times for activation. I was able to easily add a word for it to react to. I also have it waiting for someone to press the button to activate.
Setup and Run Steps
1. Install the box
2. Set up a GCP account
3. As the pi user:
~/bin/AIY-voice-kit-shell.sh
src/assistant_library_demo.py
4. Download and unzip Apache MiniFi
5. Custom action:
controlx 6. Run it
pi@raspberrypi:~/AIY-voice-kit-python/src $ ./controlx.py
7. Start MiniFi
8. Apache MiniFi will tail the command file and look for audio files
Code Added to Google Example
elif event.type == EventType.ON_RECOGNIZING_SPEECH_FINISHED and event.args:
    print('You said:', event.args['text'])
    text = event.args['text'].lower()
    f = open("/opt/demo/commands.txt", "a+")
    f.write(text)
    f.close()
    if 'process' in text:
        self._assistant.stop_conversation()
        self._say_ip()
    elif 'computer' in text:
        self._assistant.stop_conversation()
        self._say_ip()
    elif text == 'ip address':
        self._assistant.stop_conversation()
        self._say_ip()

def _say_ip(self):
    ip_address = subprocess.check_output("hostname -I | cut -d' ' -f1", shell=True)
    f = open("/opt/demo/commands.txt", "a+")
    f.write('My IP address is %s' % ip_address.decode('utf-8'))
    f.close()
    # aiy.audio.say('My IP address is %s' % ip_address.decode('utf-8'))
    # aiy.audio.say('Turning on LED')
    # voice_name = '/opt/demo/voices/voice_{0}.wav'.format(strftime("%Y%m%d%H%M%S", gmtime()))
    # aiy.audio.record_to_wave(voice_name, 5)
    aiy.voicehat.get_led().set_state(aiy.voicehat.LED.BLINK)
    aiy.audio.say('MiniFy')
    print('My IP address is %s' % ip_address.decode('utf-8'))
Apache NiFi Ingest Flow, Flow File for a Sound File, Flow File For Commands, Apache MiniFi Ingest, Apache NiFi Overview, Text Command Processing, AIYVOICE Schema in Hortonworks Schema Registry (screenshots)
Source Code: https://github.com/tspannhw/nifi-googleaiy
Results: An example of text deciphered from my speaking into the AIY Voice Kit:
what was the process that we could try that a different way right now when was text put it in the file My IP address is 192.168.1.199
The Recorded Audio Files Sent
../images/voice_20171229195641.wav
../images/voice_20171229202456.wav
../images/voice_20171229203028.wav
Inferred Schema
inferred.avro.schema
{ "type" : "record", "name" : "aiyvoice", "fields" : [ { "name" : "schema", "type" : "string", "doc" : "Type inferred from '\"aiyvoice\"'" }, { "name" : "path", "type" : "string", "doc" : "Type inferred from '\"./\"'" }, { "name" : "schemaname", "type" : "string", "doc" : "Type inferred from '\"aiyvoice\"'" }, { "name" : "commands0", "type" : "string", "doc" : "Type inferred from '\"process process process process process process process process processMy IP address is 192.168.1.199\n\"'" }, { "name" : "filename", "type" : "string", "doc" : "Type inferred from '\"commands.3088-3190.txt\"'" }, { "name" : "ssaddress", "type" : "string", "doc" : "Type inferred from '\"192.168.1.199:60896\"'" }, { "name" : "receivedFrom", "type" : "string", "doc" : "Type inferred from '\"AIY_GOOGLE_VOICE_PI\"'" }, { "name" : "sshost", "type" : "string", "doc" : "Type inferred from '\"192.168.1.199\"'" }, { "name" : "mimetype", "type" : "string", "doc" : "Type inferred from '\"text/plain\"'" }, { "name" : "tailfileoriginalpath", "type" : "string", "doc" : "Type inferred from '\"/opt/demo/commands.txt\"'" }, { "name" : "uuid", "type" : "string", "doc" : "Type inferred from '\"76232a9a-baf1-4f6b-9707-6d532653bbe8\"'" }, { "name" : "RouteOnAttributeRoute", "type" : "string", "doc" : "Type inferred from '\"commands\"'" } ] } Apache NiFi Created Apache Hive DDL hive.ddl CREATE EXTERNAL TABLE IF NOT EXISTS aiyvoice (path STRING, filename STRING, commands0 STRING, ssaddress STRING, receivedFrom STRING, sshost STRING, mimetype STRING, tailfileoriginalpath STRING, uuid STRING, RouteOnAttributeRoute STRING) STORED AS ORC Resources:
https://github.com/google/aiyprojects-raspbian/blob/master/src/action.py
https://github.com/google/aiyprojects-raspbian/blob/aiyprojects/HACKING.md
https://aiyprojects.withgoogle.com/voice/#project-overview
https://projects.raspberrypi.org/en/projects/google-voice-aiy
https://github.com/googlesamples/assistant-sdk-python
https://developers.google.com/assistant/sdk/guides/library/python/embed/run-sample
http://ktinkerer.co.uk/play-bbc-radio-raspberry-pi-aiy-project/
https://github.com/ktinkerer/aiyprojects-raspbian/tree/radio_player/src
https://github.com/google/aiyprojects-raspbian
https://github.com/n8kowald/raspi-voice-actions/blob/master/action.py
https://github.com/n8kowald/raspi-voice-actions
https://medium.com/@aallan/a-magic-mirror-powered-by-aiy-projects-and-the-raspberry-pi-e6a0fea3b4d6
12-29-2017
03:17 PM
What version of Hive are you using, 1.2 or 2.1? Are there read permissions on that HDFS location? S3?
See: https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-ManagedandExternalTables
By default the location is LOCATION '<hdfs_location>';
For S3 it would be, for example:
LOCATION 's3a://BUCKET_NAME/tpcds_bin_partitioned_orc_200.db/inventory';
You need to specify that it is stored in S3 and that you have permissions. By default it assumes the path is in HDFS.
https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.3/bk_data-access/content/moving_data_from_hdfs_to_hive_external_table_method.html
https://community.hortonworks.com/articles/25578/how-to-access-data-files-stored-in-aws-s3-buckets.html
https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.3/bk_cloud-data-access/content/s3-get-started.html
https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.3/bk_cloud-data-access/content/hive-exposing-data.html
12-29-2017
03:12 PM
It should be fixed in the current release of HDP 2.6. If not, please put in an official support request or JIRA ticket.
12-28-2017
06:51 PM
3 Kudos
Adding a Movidius NCS to a Raspberry Pi is a great way to augment your deep learning edge programming.
Hardware Setup
Raspberry Pi 3
Movidius NCS
PS 3 Eye Camera
128GB USB Stick
Software Setup
Standard NCS Setup
Python 3
sudo apt-get update
sudo apt-get install python-pip python-dev
sudo apt-get install python3-pip python3-dev
sudo apt-get install fswebcam
sudo apt-get install curl wget unzip
sudo apt-get install oracle-java8-jdk
sudo apt-get install sense-hat
sudo apt-get install octave -y
sudo apt install python3-gst-1.0
pip3 install psutil
Apache MiniFi https://www.apache.org/dyn/closer.lua?path=/nifi/minifi/0.3.0/minifi-0.3.0-bin.zip
git clone https://github.com/movidius/ncsdk.git
git clone https://github.com/movidius/ncappzoo.git
Run Example
~/workspace/ncappzoo/apps/image-classifier
Shell Script
/root/workspace/ncappzoo/apps/image-classifier/run.sh
#!/bin/bash
DATE=$(date +"%Y-%m-%d_%H%M")
fswebcam -r 1280x720 --no-banner /opt/demo/images/$DATE.jpg
python3 -W ignore image-classifier2.py /opt/demo/images/$DATE.jpg
Forked Python Code
I added some extra variables that I like to send from IoT devices (time, IP address, CPU temperature) for flows, and formatted the output as JSON.
#!/usr/bin/python3
# ****************************************************************************
# Copyright(c) 2017 Intel Corporation.
# License: MIT See LICENSE file in root directory.
# ****************************************************************************
# How to classify images using DNNs on Intel Neural Compute Stick (NCS)
import mvnc.mvncapi as mvnc
import skimage
from skimage import io, transform
import numpy
import os
import sys
import json
import time
import sys, socket
import subprocess
import datetime
from time import sleep
from time import gmtime, strftime
starttime= strftime("%Y-%m-%d %H:%M:%S",gmtime())
# User modifiable input parameters
NCAPPZOO_PATH = os.path.expanduser( '~/workspace/ncappzoo' )
GRAPH_PATH = NCAPPZOO_PATH + '/caffe/GoogLeNet/graph'
IMAGE_PATH = sys.argv[1]
LABELS_FILE_PATH = NCAPPZOO_PATH + '/data/ilsvrc12/synset_words.txt'
IMAGE_MEAN = [ 104.00698793, 116.66876762, 122.67891434]
IMAGE_STDDEV = 1
IMAGE_DIM = ( 224, 224 )
# ---- Step 1: Open the enumerated device and get a handle to it -------------
# Look for enumerated NCS device(s); quit program if none found.
devices = mvnc.EnumerateDevices()
if len( devices ) == 0:
print( 'No devices found' )
quit()
# Get a handle to the first enumerated device and open it
device = mvnc.Device( devices[0] )
device.OpenDevice()
# ---- Step 2: Load a graph file onto the NCS device -------------------------
# Read the graph file into a buffer
with open( GRAPH_PATH, mode='rb' ) as f:
blob = f.read()
# Load the graph buffer into the NCS
graph = device.AllocateGraph( blob )
# ---- Step 3: Offload image onto the NCS to run inference -------------------
# Read & resize image [Image size is defined during training]
img = print_img = skimage.io.imread( IMAGE_PATH )
img = skimage.transform.resize( img, IMAGE_DIM, preserve_range=True )
# Convert RGB to BGR [skimage reads image in RGB, but Caffe uses BGR]
img = img[:, :, ::-1]
# Mean subtraction & scaling [A common technique used to center the data]
img = img.astype( numpy.float32 )
img = ( img - IMAGE_MEAN ) * IMAGE_STDDEV
# Load the image as a half-precision floating point array
graph.LoadTensor( img.astype( numpy.float16 ), 'user object' )
# ---- Step 4: Read & print inference results from the NCS -------------------
# Get the results from NCS
output, userobj = graph.GetResult()
labels = numpy.loadtxt( LABELS_FILE_PATH, str, delimiter = '\t' )
order = output.argsort()[::-1][:6]
#### Initialization
external_IP_and_port = ('198.41.0.4', 53) # a.root-servers.net
socket_family = socket.AF_INET
host = os.uname()[1]
def getCPUtemperature():
res = os.popen('vcgencmd measure_temp').readline()
return(res.replace("temp=","").replace("'C\n",""))
def IP_address():
try:
s = socket.socket(socket_family, socket.SOCK_DGRAM)
s.connect(external_IP_and_port)
answer = s.getsockname()
s.close()
return answer[0] if answer else None
except socket.error:
return None
cpuTemp=int(float(getCPUtemperature()))
ipaddress = IP_address()
# yyyy-mm-dd hh:mm:ss
currenttime= strftime("%Y-%m-%d %H:%M:%S",gmtime())
row = { 'label1': labels[order[0]], 'label2': labels[order[1]], 'label3': labels[order[2]], 'label4': labels[order[3]], 'label5': labels[order[4]], 'currenttime': currenttime, 'host': host, 'cputemp': round(cpuTemp,2), 'ipaddress': ipaddress, 'starttime': starttime }
json_string = json.dumps(row)
print(json_string)
# ---- Step 5: Unload the graph and close the device -------------------------
graph.DeallocateGraph()
device.CloseDevice()
# ==== End of file ===========================================================
This same device also has a Sense-Hat, so we grab those sensor readings as well.
Then I combined the two sets of code to produce a row of Sense-Hat sensor data and NCS AI results.
Source Code
https://github.com/tspannhw/rpi-minifi-movidius-sensehat/tree/master
Example Output JSON
{"label1": "b'n03794056 mousetrap'", "humidity": 17.2, "roll": 94.0, "yaw": 225.0, "label4": "b'n03127925 crate'", "cputemp2": 49.39, "label3": "b'n02091134 whippet'", "memory": 16.2, "z": -0.0, "diskfree": "15866.3 MB", "cputemp": 50, "pitch": 3.0, "temp": 34.9, "y": 1.0, "currenttime": "2017-12-28 19:09:32", "starttime": "2017-12-28 19:09:29", "tempf": 74.82, "label5": "b'n04265275 space heater'", "host": "sensehatmovidius", "ipaddress": "192.168.1.156", "pressure": 1032.7, "label2": "b'n02091032 Italian greyhound'", "x": -0.0}
The current version of Apache MiniFi is 0.3.0.
Make sure you download the 0.3.0 version of the MiniFi Toolkit to convert the template created in your Apache NiFi screen into a proper config.yml. This is compatible with Apache NiFi 1.4.
MiNiFi (Java) Version 0.3.0 Release Date: 2017 December 22 (https://cwiki.apache.org/confluence/display/MINIFI/Release+Notes#ReleaseNotes-Version0.3.0)
To Build Your MiniFi YAML
buildconfig.sh YourConfigFileName.xml
minifi-toolkit-0.3.0/bin/config.sh transform $1 config.yml
scp config.yml pi@192.168.1.156:/opt/demo/minifi-0.3.0/conf/
MiNiFi Config Version: 3
Flow Controller:
name: Movidius MiniFi
comment: ''
Core Properties:
flow controller graceful shutdown period: 10 sec
flow service write delay interval: 500 ms
administrative yield duration: 30 sec
bored yield duration: 10 millis
max concurrent threads: 1
variable registry properties: ''
FlowFile Repository:
partitions: 256
checkpoint interval: 2 mins
always sync: false
Swap:
threshold: 20000
in period: 5 sec
in threads: 1
out period: 5 sec
out threads: 4
Content Repository:
content claim max appendable size: 10 MB
content claim max flow files: 100
always sync: false
Provenance Repository:
provenance rollover time: 1 min
implementation: org.apache.nifi.provenance.MiNiFiPersistentProvenanceRepository
Component Status Repository:
buffer size: 1440
snapshot frequency: 1 min
Security Properties:
keystore: ''
keystore type: ''
keystore password: ''
key password: ''
truststore: ''
truststore type: ''
truststore password: ''
ssl protocol: ''
Sensitive Props:
key:
algorithm: PBEWITHMD5AND256BITAES-CBC-OPENSSL
provider: BC
Processors: []
Controller Services: []
Process Groups:
- id: 9fdb6f99-2e9f-3d86-0000-000000000000
name: Movidius MiniFi
Processors:
- id: ceab4537-43cb-31c8-0000-000000000000
name: Capture Photo and NCS Image Classifier
class: org.apache.nifi.processors.standard.ExecuteProcess
max concurrent tasks: 1
scheduling strategy: TIMER_DRIVEN
scheduling period: 60 sec
penalization period: 30 sec
yield period: 1 sec
run duration nanos: 0
auto-terminated relationships list: []
Properties:
Argument Delimiter: ' '
Batch Duration:
Command: /root/workspace/ncappzoo/apps/image-classifier/run.sh
Command Arguments:
Redirect Error Stream: 'false'
Working Directory:
- id: 9fbeac0f-1c3f-3535-0000-000000000000
name: GetFile
class: org.apache.nifi.processors.standard.GetFile
max concurrent tasks: 1
scheduling strategy: TIMER_DRIVEN
scheduling period: 0 sec
penalization period: 30 sec
yield period: 1 sec
run duration nanos: 0
auto-terminated relationships list: []
Properties:
Batch Size: '10'
File Filter: '[^\.].*'
Ignore Hidden Files: 'false'
Input Directory: /opt/demo/images/
Keep Source File: 'false'
Maximum File Age:
Maximum File Size:
Minimum File Age: 5 sec
Minimum File Size: 10 B
Path Filter:
Polling Interval: 0 sec
Recurse Subdirectories: 'true'
Controller Services: []
Process Groups: []
Input Ports: []
Output Ports: []
Funnels: []
Connections:
- id: 1bd103a9-09dc-3121-0000-000000000000
name: Capture Photo and NCS Image Classifier/success/0d6447ee-be36-3ec4-b896-1d57e374580f
source id: ceab4537-43cb-31c8-0000-000000000000
source relationship names:
- success
destination id: 0d6447ee-be36-3ec4-b896-1d57e374580f
max work queue size: 10000
max work queue data size: 1 GB
flowfile expiration: 0 sec
queue prioritizer class: ''
- id: 3cdac0a3-aabf-3a15-0000-000000000000
name: GetFile/success/0d6447ee-be36-3ec4-b896-1d57e374580f
source id: 9fbeac0f-1c3f-3535-0000-000000000000
source relationship names:
- success
destination id: 0d6447ee-be36-3ec4-b896-1d57e374580f
max work queue size: 10000
max work queue data size: 1 GB
flowfile expiration: 0 sec
queue prioritizer class: ''
Remote Process Groups:
- id: c8b85dc1-0dfb-3f01-0000-000000000000
name: ''
url: http://hostname.local:8080/nifi
comment: ''
timeout: 60 sec
yield period: 10 sec
transport protocol: HTTP
proxy host: ''
proxy port: ''
proxy user: ''
proxy password: ''
local network interface: ''
Input Ports:
- id: befe3bd8-b4ac-326c-becf-fc448e4a8ff6
name: MiniFi From TX1 Jetson
comment: ''
max concurrent tasks: 1
use compression: false
- id: 0d6447ee-be36-3ec4-b896-1d57e374580f
name: Movidius Input
comment: ''
max concurrent tasks: 1
use compression: false
- id: e43fd055-eb0c-37b2-ad38-91aeea9380cd
name: ChristmasTreeInput
comment: ''
max concurrent tasks: 1
use compression: false
- id: 4689a182-d7e1-3371-bef2-24e102ed9350
name: From ADP Remote Partition 1
comment: ''
max concurrent tasks: 1
use compression: false
Output Ports: []
Input Ports: []
Output Ports: []
Funnels: []
Connections: []
Remote Process Groups: []
NiFi Properties Overrides: {}
Start Apache MiniFi on Your RPI Box
root@sensehatmovidius:/opt/demo/minifi-0.3.0/logs# ../bin/minifi.sh start
...1686],offset=0,name=2017-12-28_1452.jpg,size=531686], StandardFlowFileRecord[uuid=ca84bb5f-b9ff-40f2-b5f3-ece5278df9c2,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1514490624975-1, container=default, section=1], offset=1592409, length=531879],offset=0,name=2017-12-28_1453.jpg,size=531879], StandardFlowFileRecord[uuid=6201c682-967b-4922-89fb-d25abbc2fe82,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1514490624975-1, container=default, section=1], offset=2124819, length=527188],offset=0,name=2017-12-28_1455.jpg,size=527188]] (2.53 MB) to http://hostname.local:8080/nifi-api in 639 milliseconds at a rate of 3.96 MB/sec
2017-12-28 15:08:35,588 INFO [Timer-Driven Process Thread-8] o.a.nifi.remote.StandardRemoteGroupPort RemoteGroupPort[name=Movidius Input,targets=http://hostname.local:8080/nifi] Successfully sent [StandardFlowFileRecord[uuid=670a2494-5c60-4551-9a59-a1859e794cfe,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1514490883824-2, container=default, section=2], offset=530145, length=531289],offset=0,name=2017-12-28_1456.jpg,size=531289], StandardFlowFileRecord[uuid=39cad0fd-ccba-4386-840d-758b39daf48e,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1514491306665-2, container=default, section=2], offset=0, length=504600],offset=0,name=2017-12-28_1501.jpg,size=504600], StandardFlowFileRecord[uuid=3c2d4a2b-a10b-4a92-8c8b-ab73e6960670,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1514491298789-1, container=default, section=1], offset=0, length=527543],offset=0,name=2017-12-28_1502.jpg,size=527543], StandardFlowFileRecord[uuid=92cbfd2b-e94a-4854-b874-ef575dcf3905,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1514490883824-2, container=default, section=2], offset=0, length=529618],offset=0,name=2017-12-28_1454.jpg,size=529618]] (2 MB) to http://hostname.local:8080/nifi-api in 439 milliseconds at a rate of 4.54 MB/sec
2017-12-28 15:08:41,341 INFO [Timer-Driven Process Thread-2] o.a.nifi.remote.StandardRemoteGroupPort RemoteGroupPort[name=Movidius Input,targets=http://hostname.local:8080/nifi] Successfully sent [StandardFlowFileRecord[uuid=3fb7ca68-3e10-40c6-8f37-49532134689d,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1514491710195-1, container=default, section=1], offset=0, length=567],offset=0,name=21532511724407,size=567]] (567 bytes) to http://hostname.local:8080/nifi-api in 113 milliseconds at a rate of 4.88 KB/sec
Our Apache MiniFi flow sends that data via HTTP to our Apache NiFi server for processing and converting into a standard Apache ORC file in HDFS for Apache Hive queries.
Using InferAvroSchema we build a schema and upload it to the Hortonworks Schema Registry for processing.
Schema
{
"type": "record",
"name": "movidiussense",
"fields": [
{
"name": "cputemp2",
"type": "double",
"doc": "Type inferred from '54.77'"
},
{
"name": "pitch",
"type": "double",
"doc": "Type inferred from '4.0'"
},
{
"name": "label5",
"type": "string",
"doc": "Type inferred from '\"b'n04238763 slide rule, slipstick'\"'"
},
{
"name": "starttime",
"type": "string",
"doc": "Type inferred from '\"2017-12-28 20:08:37\"'"
},
{
"name": "label2",
"type": "string",
"doc": "Type inferred from '\"b'n02871525 bookshop, bookstore, bookstall'\"'"
},
{
"name": "y",
"type": "double",
"doc": "Type inferred from '1.0'"
},
{
"name": "currenttime",
"type": "string",
"doc": "Type inferred from '\"2017-12-28 20:08:39\"'"
},
{
"name": "x",
"type": "double",
"doc": "Type inferred from '-0.0'"
},
{
"name": "label4",
"type": "string",
"doc": "Type inferred from '\"b'n03482405 hamper'\"'"
},
{
"name": "roll",
"type": "double",
"doc": "Type inferred from '94.0'"
},
{
"name": "temp",
"type": "double",
"doc": "Type inferred from '35.31'"
},
{
"name": "host",
"type": "string",
"doc": "Type inferred from '\"sensehatmovidius\"'"
},
{
"name": "diskfree",
"type": "string",
"doc": "Type inferred from '\"15817.0 MB\"'"
},
{
"name": "memory",
"type": "double",
"doc": "Type inferred from '39.2'"
},
{
"name": "label3",
"type": "string",
"doc": "Type inferred from '\"b'n02666196 abacus'\"'"
},
{
"name": "label1",
"type": "string",
"doc": "Type inferred from '\"b'n02727426 apiary, bee house'\"'"
},
{
"name": "yaw",
"type": "double",
"doc": "Type inferred from '225.0'"
},
{
"name": "cputemp",
"type": "int",
"doc": "Type inferred from '55'"
},
{
"name": "humidity",
"type": "double",
"doc": "Type inferred from '17.2'"
},
{
"name": "tempf",
"type": "double",
"doc": "Type inferred from '75.56'"
},
{
"name": "z",
"type": "double",
"doc": "Type inferred from '-0.0'"
},
{
"name": "pressure",
"type": "double",
"doc": "Type inferred from '1033.2'"
},
{
"name": "ipaddress",
"type": "string",
"doc": "Type inferred from '\"192.168.1.156\"'"
}
]
}
Our data is now in HDFS with an Apache Hive table that Apache NiFi has generated the Create Table DDL for as follows:
CREATE EXTERNAL TABLE IF NOT EXISTS movidiussense (cputemp2 DOUBLE, pitch DOUBLE, label5 STRING, starttime STRING, label2 STRING, y DOUBLE, currenttime STRING, x DOUBLE, label4 STRING, roll DOUBLE, temp DOUBLE, host STRING, diskfree STRING, memory DOUBLE, label3 STRING, label1 STRING, yaw DOUBLE, cputemp INT, humidity DOUBLE, tempf DOUBLE, z DOUBLE, pressure DOUBLE, ipaddress STRING) STORED AS ORC LOCATION '/movidiussense'
Resources
https://developer.movidius.com/
https://github.com/tspannhw/rpi-minifi-movidius-sensehat
http://chris.gg/2012/07/using-a-ps3-eyetoy-with-the-raspberry-pi/
https://developer.movidius.com/start
https://github.com/movidius/ncsdk/blob/master/README.md
12-28-2017
05:03 PM
You can add a reporting task for Zabbix. See https://github.com/dbist/workshops/tree/master/nifi/templates/zabbix and https://pierrevillard.com/2017/05/11/monitoring-nifi-introduction/