1973 Posts · 1225 Kudos Received · 124 Solutions
02-07-2018
05:54 PM
If you have Cygwin (https://www.cygwin.com/), you can try that. I am running it on OS X, Ubuntu, and CentOS without issue.
02-07-2018
05:53 PM
I am thinking Windows is the issue, but I am checking.
02-07-2018
05:46 PM
Which JVM and JDK? What server platform? How did you install it? Are you following one of these: https://github.com/apache/nifi-registry#getting-started or https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.1.0/bk_installing-hdf/content/ch_install-ambari.html
02-07-2018
04:26 PM
5 Kudos
Use Case: Ingesting energy data and running an Apache Spark job as part of the flow. We will be using the new (in Apache NiFi 1.5 / HDF 3.1) ExecuteSparkInteractive processor with the LivyController to accomplish that integration. As we mentioned in the first part of the article, it's pretty easy to set this up. Since this is a modern Apache NiFi project, we use version control on our code.
On a local machine, a Python script talks to an electricity sensor over WiFi. The data is processed, cleaned, and sent to a cloud-hosted Apache NiFi instance via Site-to-Site (S2S) over HTTP. In the cloud we receive the pushed messages. Once we open the Spark It Up processor group, we have a flow to process the data.
Flow Overview
QueryRecord: Determine how to route based on a query against the streaming data; converts JSON to Apache Avro.
Path for All Files
UpdateAttribute: Set a schema.
MergeContent: Do an Apache Avro merge on our data to make bigger files.
ConvertAvroToORC: Build an Apache ORC file from the merged Apache Avro file.
PutHDFS: Store our Apache ORC file in an HDFS directory on our HDP 2.6.4 cluster.
Path for Large Voltage
ExecuteSparkInteractive: Call our PySpark job.
PutHDFS: Store the results to HDFS. We could also take all the metadata attributes and send them somewhere or store them as a JSON file.
We tested our PySpark program in Apache Zeppelin and then copied it into our processor.
Our ExecuteSparkInteractive Processor: In our QueryRecord processor we send messages with large voltages to the Apache Spark executor to run a PySpark job for further processing. Once we have submitted a job via Apache Livy, we are able to watch it during and after execution in the detailed Apache Livy UI and Spark UI screens. In the Apache Livy UI screen below we can see the PySpark code executed and its output.
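The routing decision above can be sketched outside NiFi. This is a minimal Python sketch of the QueryRecord logic only, assuming a hypothetical voltage threshold of 120; in the real flow the decision is SQL inside the processor's properties, not Python:

```python
import json

# Hypothetical threshold for a "large voltage" reading; in NiFi this is
# expressed as SQL in the QueryRecord processor, roughly:
#   SELECT * FROM FLOWFILE WHERE voltage > 120
LARGE_VOLTAGE = 120.0

def route(record):
    """Return the name of the path a record would be routed down."""
    if record.get("voltage", 0.0) > LARGE_VOLTAGE:
        return "large_voltage"   # on to ExecuteSparkInteractive
    return "all_files"           # on to MergeContent -> ORC -> HDFS

readings = [
    {"voltage": 121.74, "power": 91.38},
    {"voltage": 118.20, "power": 60.01},
]
for r in readings:
    print(route(r), json.dumps(r))
```

The sample voltage values mirror the schema examples later in this article.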
Apache Livy UI
Apache Spark Jobs UI - Jobs
Apache Spark Jobs UI - SQL
Apache Spark Jobs UI - Executors
Apache Zeppelin SQL Search of the Data
Hive / Spark SQL Table DDL Generated Automagically by Apache NiFi
Below is the source code related to this article:
Source Code: https://github.com/tspannhw/nifi-spark-livy
PySpark Code
shdf = spark.read.json("hdfs://yourhdp264server:8020/spark2-history")
shdf.printSchema()
shdf.createOrReplaceTempView("sparklogs")
stuffdf = spark.sql("SELECT * FROM sparklogs")
stuffdf.count()
This is a pretty simple PySpark application: it reads the JSON results of the Spark2 history directory, prints a schema inferred from it, and then does a simple SELECT and count. We could do Spark machine learning or other processing in there very easily. You can run Python 2.x or 3.x with PySpark. I am running this on Apache Spark 2.2.0 hosted on an HDP 2.6.4 cluster running CentOS 7. The fun part is that every time I run this Spark job it produces more results for it to read. I should probably just read that log in Apache NiFi, but it was a fun little example. Clearly you can run any kind of job in here; my next article will be about running Apache MXNet and Spark MLlib jobs through Apache Livy and Apache NiFi. As a quick side note, you have a lot of options for working with schemas now:
Schema For Energy Data (inferred.avro.schema)
{ "type" : "record", "name" : "smartPlug", "fields" : [ { "name" : "day19", "type" : "double", "doc" : "Type inferred from '2.035'" }, { "name" : "day20", "type" : "double", "doc" : "Type inferred from '1.191'" }, { "name" : "day21", "type" : "double", "doc" : "Type inferred from '0.637'" }, { "name" : "day22", "type" : "double", "doc" : "Type inferred from '1.497'" }, { "name" : "day23", "type" : "double", "doc" : "Type inferred from '1.151'" }, { "name" : "day24", "type" : "double", "doc" : "Type inferred from '1.227'" }, { "name" : "day25", "type" : "double", "doc" : "Type inferred from '1.387'" }, { "name" : "day26", "type" : "double", "doc" : "Type inferred from '1.138'" }, { "name" : "day27", "type" : "double", "doc" : "Type inferred from '1.204'" }, { "name" : "day28", "type" : "double", "doc" : "Type inferred from '1.401'" }, { "name" : "day29", "type" : "double", "doc" : "Type inferred from '1.288'" }, { "name" : "day30", "type" : "double", "doc" : "Type inferred from '1.439'" }, { "name" : "day31", "type" : "double", "doc" : "Type inferred from '0.126'" }, { "name" : "day1", "type" : "double", "doc" : "Type inferred from '1.204'" }, { "name" : "day2", "type" : "double", "doc" : "Type inferred from '1.006'" }, { "name" : "day3", "type" : "double", "doc" : "Type inferred from '1.257'" }, { "name" : "day4", "type" : "double", "doc" : "Type inferred from '1.053'" }, { "name" : "day5", "type" : "double", "doc" : "Type inferred from '1.597'" }, { "name" : "day6", "type" : "double", "doc" : "Type inferred from '1.642'" }, { "name" : "day7", "type" : "double", "doc" : "Type inferred from '0.443'" }, { "name" : "day8", "type" : "double", "doc" : "Type inferred from '0.01'" }, { "name" : "day9", "type" : "double", "doc" : "Type inferred from '0.009'" }, { "name" : "day10", "type" : "double", "doc" : "Type inferred from '0.009'" }, { "name" : "day11", "type" : "double", "doc" : "Type inferred from '0.075'" }, { "name" : "day12", "type" : "double", "doc" : "Type 
inferred from '1.149'" }, { "name" : "day13", "type" : "double", "doc" : "Type inferred from '1.014'" }, { "name" : "day14", "type" : "double", "doc" : "Type inferred from '0.851'" }, { "name" : "day15", "type" : "double", "doc" : "Type inferred from '1.134'" }, { "name" : "day16", "type" : "double", "doc" : "Type inferred from '1.54'" }, { "name" : "day17", "type" : "double", "doc" : "Type inferred from '1.438'" }, { "name" : "day18", "type" : "double", "doc" : "Type inferred from '1.056'" }, { "name" : "sw_ver", "type" : "string", "doc" : "Type inferred from '\"1.1.1 Build 160725 Rel.164033\"'" }, { "name" : "hw_ver", "type" : "string", "doc" : "Type inferred from '\"1.0\"'" }, { "name" : "mac", "type" : "string", "doc" : "Type inferred from '\"50:C7:BF:B1:95:D5\"'" }, { "name" : "type", "type" : "string", "doc" : "Type inferred from '\"IOT.SMARTPLUGSWITCH\"'" }, { "name" : "hwId", "type" : "string", "doc" : "Type inferred from '\"60FF6B258734EA6880E186F8C96DDC61\"'" }, { "name" : "fwId", "type" : "string", "doc" : "Type inferred from '\"060BFEA28A8CD1E67146EB5B2B599CC8\"'" }, { "name" : "oemId", "type" : "string", "doc" : "Type inferred from '\"FFF22CFF774A0B89F7624BFC6F50D5DE\"'" }, { "name" : "dev_name", "type" : "string", "doc" : "Type inferred from '\"Wi-Fi Smart Plug With Energy Monitoring\"'" }, { "name" : "model", "type" : "string", "doc" : "Type inferred from '\"HS110(US)\"'" }, { "name" : "deviceId", "type" : "string", "doc" : "Type inferred from '\"8006ECB1D454C4428953CB2B34D9292D18A6DB0E\"'" }, { "name" : "alias", "type" : "string", "doc" : "Type inferred from '\"Tim Spann's MiniFi Controller SmartPlug - Desk1\"'" }, { "name" : "icon_hash", "type" : "string", "doc" : "Type inferred from '\"\"'" }, { "name" : "relay_state", "type" : "int", "doc" : "Type inferred from '1'" }, { "name" : "on_time", "type" : "int", "doc" : "Type inferred from '1995745'" }, { "name" : "active_mode", "type" : "string", "doc" : "Type inferred from '\"schedule\"'" }, { "name" 
: "feature", "type" : "string", "doc" : "Type inferred from '\"TIM:ENE\"'" }, { "name" : "updating", "type" : "int", "doc" : "Type inferred from '0'" }, { "name" : "rssi", "type" : "int", "doc" : "Type inferred from '-34'" }, { "name" : "led_off", "type" : "int", "doc" : "Type inferred from '0'" }, { "name" : "latitude", "type" : "double", "doc" : "Type inferred from '40.268216'" }, { "name" : "longitude", "type" : "double", "doc" : "Type inferred from '-74.529088'" }, { "name" : "index", "type" : "int", "doc" : "Type inferred from '18'" }, { "name" : "zone_str", "type" : "string", "doc" : "Type inferred from '\"(UTC-05:00) Eastern Daylight Time (US & Canada)\"'" }, { "name" : "tz_str", "type" : "string", "doc" : "Type inferred from '\"EST5EDT,M3.2.0,M11.1.0\"'" }, { "name" : "dst_offset", "type" : "int", "doc" : "Type inferred from '60'" }, { "name" : "month1", "type" : "double", "doc" : "Type inferred from '32.674'" }, { "name" : "month2", "type" : "double", "doc" : "Type inferred from '8.202'" }, { "name" : "current", "type" : "double", "doc" : "Type inferred from '0.772548'" }, { "name" : "voltage", "type" : "double", "doc" : "Type inferred from '121.740428'" }, { "name" : "power", "type" : "double", "doc" : "Type inferred from '91.380606'" }, { "name" : "total", "type" : "double", "doc" : "Type inferred from '48.264'" }, { "name" : "time", "type" : "string", "doc" : "Type inferred from '\"02/07/2018 11:17:30\"'" }, { "name" : "ledon", "type" : "boolean", "doc" : "Type inferred from 'true'" }, { "name" : "systemtime", "type" : "string", "doc" : "Type inferred from '\"02/07/2018 11:17:30\"'" } ] }
Python Source (Updated to include 31 days)
from pyHS100 import SmartPlug, SmartBulb
#from pprint import pformat as pf
import json
import datetime
plug = SmartPlug("192.168.1.203")
row = { }
emeterdaily = plug.get_emeter_daily(year=2017, month=12)
for k, v in emeterdaily.items():
    row["day%s" % k] = v
emeterdaily = plug.get_emeter_daily(year=2018, month=1)
for k, v in emeterdaily.items():
    row["day%s" % k] = v
emeterdaily = plug.get_emeter_daily(year=2018, month=2)
for k, v in emeterdaily.items():
    row["day%s" % k] = v
hwinfo = plug.hw_info
for k, v in hwinfo.items():
    row["%s" % k] = v
sysinfo = plug.get_sysinfo()
for k, v in sysinfo.items():
    row["%s" % k] = v
timezone = plug.timezone
for k, v in timezone.items():
    row["%s" % k] = v
emetermonthly = plug.get_emeter_monthly(year=2018)
for k, v in emetermonthly.items():
    row["month%s" % k] = v
realtime = plug.get_emeter_realtime()
for k, v in realtime.items():
    row["%s" % k] = v
row['alias'] = plug.alias
row['time'] = plug.time.strftime('%m/%d/%Y %H:%M:%S')
row['ledon'] = plug.led
row['systemtime'] = datetime.datetime.now().strftime('%m/%d/%Y %H:%M:%S')
json_string = json.dumps(row)
print(json_string)
Example Output {"text\/plain":"root\n |-- App Attempt ID: string (nullable = true)\n |-- App ID: string (nullable = true)\n |-- App Name: string (nullable = true)\n |-- Block Manager ID: struct (nullable = true)\n | |-- Executor ID: string (nullable = true)\n | |-- Host: string (nullable = true)\n | |-- Port: long (nullable = true)\n |-- Classpath Entries: struct (nullable = true)\n | |-- \/etc\/hadoop\/conf\/: string (nullable = true)\n | |-- \/etc\/hadoop\/conf\/secure: string (nullable = true)\n | |-- \/etc\/zeppelin\/conf\/external-dependency-conf\/: string (nullable = true)\n | |-- \/hadoop\/yarn\/local\/usercache\/livy\/appcache\/application_1517883514475_0002\/container_e01_1517883514475_0002_01_000001: string (nullable = true)\n | |-- \/hadoop\/yarn\/local\/usercache\/livy\/appcache\/application_1517883514475_0002\/container_e01_1517883514475_0002_01_000001\/__spark_conf__: string (nullable = true)\n | |-- \/hadoop\/yarn\/local\/usercache\/livy\/appcache\/application_1517883514475_0002\/container_e01_1517883514475_0002_01_000001\/__spark_libs__\/JavaEWAH-0.3.2.jar: string (nullable = true)\n | |-- \/hadoop\/yarn\/local\/usercache\/livy\/appcache\/application_1517883514475_0002\/container_e01_1517883514475_0002_01_000001\/__spark_libs__\/RoaringBitmap-0.5.11.jar: string (nullable = true)\n | |-- \/hadoop\/yarn\/local\/usercache\/livy\/appcache\/application_1517883514475_0002\/container_e01_1517883514475_0002_01_000001\/__spark_libs__\/ST4-4.0.4.jar: string (nullable = true)\n | |-- \/hadoop\/yarn\/local\/usercache\/livy\/appcache\/application_1517883514475_0002\/container_e01_1517883514475_0002_01_000001\/__spark_libs__\/activation-1.1.1.jar: string (nullable = true)\n | |-- \/hadoop\/yarn\/local\/usercache\/livy\/appcache\/application_1517883514475_0002\/container_e01_1517883514475_0002_01_000001\/__spark_libs__\/aircompressor-0.8.jar: string (nullable = true)\n | |-- 
\/hadoop\/yarn\/local\/usercache\/livy\/appcache\/application_1517883514475_0002\/container_e01_1517883514475_0002_01_000001\/__spark_libs__\/antlr-2.7.7.jar: string (nullable = true)\n | |-- \/hadoop\/yarn\/local\/usercache\/livy\/appcache\/application_1517883514475_0002\/container_e01_1517883514475_0002_01_000001\/__spark_libs__\/antlr-runtime-3.4.jar: string (nullable = true)\n | |-- \/hadoop\/yarn\/local\/usercache\/livy\/appcache\/application_1517883514475_0002\/container_e01_1517883514475_0002_01_000001\/__spark_libs__\/antlr4-runtime-4.5.3.jar: string (nullable = true)\n | |-- \/hadoop\/yarn\/local\/usercache\/livy\/appcache\/
Shell Tip: Apache MXNet may send some warnings to STDERR. I don't want these, so I send them to /dev/null:
python3 -W ignore analyze.py 2>/dev/null
Software:
PySpark
Python
Apache NiFi
Apache Spark
HDF 3.1
HDP 2.6.4
Apache Hive
Apache Avro
Apache ORC
Apache Ambari
Apache Zeppelin
Reference:
https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-livy-nar/1.5.0/org.apache.nifi.processors.livy.ExecuteSparkInteractive/index.html
https://community.hortonworks.com/articles/73828/submitting-spark-jobs-from-apache-nifi-using-livy.html
https://community.hortonworks.com/articles/148730/integrating-apache-spark-2x-jobs-with-apache-nifi.html
https://community.hortonworks.com/articles/155326/monitoring-energy-usage-utilizing-apache-nifi-pyth.html
https://github.com/tspannhw/nifi-smartplug/
https://github.com/tspannhw/nifi-spark-livy
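As background for the ExecuteSparkInteractive and Livy pairing described above: Livy exposes a REST API, and this is a rough sketch (not the processor's implementation) of the two payloads it expects when code is submitted; the server URL is an assumption, and the processor manages sessions itself:

```python
import json

# Livy listens on port 8998 by default; this URL is an assumption.
LIVY_URL = "http://yourlivyserver:8998"

# 1. Create an interactive PySpark session: POST /sessions
session_payload = {"kind": "pyspark"}

# 2. Submit a statement to it: POST /sessions/{id}/statements
code = 'spark.read.json("hdfs://yourhdp264server:8020/spark2-history").count()'
statement_payload = {"code": code}

print(json.dumps(session_payload))
print(json.dumps(statement_payload))
```

The statement result can then be polled at GET /sessions/{id}/statements/{n}, which is where the Livy UI output shown above comes from.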
02-07-2018
04:15 PM
HDP 2.6.4 is not supported on Windows. Linux, especially CentOS 7, is the best fit. You could do some Hadoop experiments in a VM or Docker on Windows (https://hortonworks.com/tutorial/sandbox-deployment-and-install-guide/). Check out Docker Hub: https://hub.docker.com/r/hortonworks/ambari-server/ and https://hub.docker.com/u/hortonworks/
02-06-2018
02:21 PM
Options: you can upgrade, ensure you have non-null messages, or apply a hotfix: https://github.com/hortonworks/streamline/commit/fdde4fb545b1e3027e4bb7cf364e4fac334bb72c You can also contact support to assist with the upgrade or hotfix.
02-06-2018
02:20 PM
There were bad messages in my Kafka topic that did not have schemas and were not valid Avro. This was the cause. Stop Kafka, Storm, and SAM. Clean out your topics: delete the topic and recreate it. Then send a valid Avro message with a valid subscription.
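A hedged sketch of the kind of guard that avoids this: drop null or unparsable payloads before they ever reach the topic. The function name is illustrative, not from SAM or Storm, and JSON stands in for a real schema-registry/Avro check to keep the sketch self-contained:

```python
import json

def is_valid_payload(raw):
    """Reject None/empty payloads and payloads that are not valid JSON.
    A real pipeline would validate against the registered Avro schema
    instead; JSON is used here only to keep the example dependency-free."""
    if not raw:
        return False
    try:
        json.loads(raw)
        return True
    except (ValueError, TypeError):
        return False

messages = [None, "", "not json", '{"voltage": 121.7}']
good = [m for m in messages if is_valid_payload(m)]
print(good)  # only the well-formed message survives
```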
01-29-2018
01:00 AM
2 Kudos
The Onion Omega 2+ is a small IoT device that runs a simple BusyBox Linux and can run MicroPython. This lets you run some simple applications and interact with sensors and an OLED.
Onion Omega 2+ Stats
580 MHz CPU
128 MB memory
32 MB storage
added 32 GB USB storage
USB 2.0
micro USB
b/g/n WiFi
15 GPIO
2 PWM
2 UART
1 I2C
1 SPI
1 I2S
Setting Up the Omega
opkg install python-pip
pip install --upgrade setuptools
pip install paho-mqtt
opkg install pyOledExp
Upgrading pyOledExp on root from 0.4-1 to 0.5-1...
Downloading http://repo.onion.io/omega2/packages/onion/pyOledExp_0.5-1_mipsel_24kc.ipk
Configuring pyOledExp.
mkdir /mnt/sda1
mount /dev/sda1 /mnt/sda1
./run.sh
> Initializing display
> Setting display to ON
> Enabling horizontal scrolling to the left
> Writing '[{"ipaddress": "192.168.1.176", "endtime": "2018-01-29 00:50:39", "end": "1517187039.44"}]' to display
0
crontab -e
crontab -l
*/1 * * * * /opt/demo/run.sh
1517187305: New connection from 192.168.1.176 on port 1883.
1517187305: New client connected from 192.168.1.176 as onion (c1, k60).
1517187305: Client onion disconnected.
BusyBox v1.26.2 () built-in shell (ash)
____ _ ____
/ __ \___ (_)__ ___ / __ \__ _ ___ ___ ____ _
/ /_/ / _ \/ / _ \/ _ \ / /_/ / ' \/ -_) _ `/ _ `/
\____/_//_/_/\___/_//_/ \____/_/_/_/\__/\_, /\_,_/
W H A T W I L L Y O U I N V E N T ? /___/
-----------------------------------------------------
Ω-ware: 0.1.10 b160
-----------------------------------------------------
poweroff
Attributes Related to MQTT Message Sent
Example Flow File containing JSON
Apache NiFi Flow File to Process
Running MQTT on a Mac
/usr/local/Cellar/mosquitto/1.4.14_2/sbin/mosquitto -c /usr/local/etc/mosquitto/mosquitto.conf
1517180449: mosquitto version 1.4.14 (build date 2017-10-22 16:34:22+0100) starting
1517180449: Config loaded from /usr/local/etc/mosquitto/mosquitto.conf.
1517180449: Opening ipv6 listen socket on port 1883.
1517180449: Opening ipv4 listen socket on port 1883.
1517180698: New connection from 127.0.0.1 on port 1883.
1517180698: New client connected from 127.0.0.1 as nififorthemqttguy (c1, k60).
In our simple example we just read the time and IP address of the device and format them as JSON to send as MQTT messages to an MQTT server, which is read by Apache NiFi. This is a good framework to start with on tiny devices. With the Onion platform you can add GPS, sensors, USB devices, a USB webcam, and other inputs. These can easily be added to the Python script and sent to Apache NiFi as JSON.
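The message itself is just a one-element JSON array. A minimal sketch of building it, with no broker required and a placeholder IP address, mirroring the full script in the source below:

```python
import datetime
import json
import time

# Build the same one-element JSON array the device publishes over MQTT.
# The IP address is a placeholder; the real script discovers it via a
# UDP socket trick.
end = time.time()
endtime = datetime.datetime.utcnow().strftime('%Y-%m-%d %H:%M:%S')
row = [{'end': str(end), 'endtime': endtime, 'ipaddress': '192.168.1.176'}]
json_string = json.dumps(row)
print(json_string)
```

This string is what paho-mqtt publishes to the "omega" topic and what NiFi's ConsumeMQTT receives.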
Source Code
https://github.com/tspannhw/onionomega-mqtt-micropython
Python Script
from OmegaExpansion import oledExp
import paho.mqtt.client as client
import time
import os
import datetime
import math
import random, string
import json
import sys
import socket
from time import sleep
from string import Template
from time import gmtime, strftime
# Time
start = time.time()
currenttime= strftime("%Y-%m-%d %H:%M:%S",gmtime())
host = os.uname()[1]
external_IP_and_port = ('198.41.0.4', 53) # a.root-servers.net
socket_family = socket.AF_INET
def IP_address():
    try:
        s = socket.socket(socket_family, socket.SOCK_DGRAM)
        s.connect(external_IP_and_port)
        answer = s.getsockname()
        s.close()
        return answer[0] if answer else None
    except socket.error:
        return None
ipaddress = IP_address()
status = oledExp.driverInit()
status = oledExp.setDisplayPower(1)
status = oledExp.scroll(0, 0, 0, 8 - 1)
endtime= strftime("%Y-%m-%d %H:%M:%S",gmtime())
end = time.time()
row = [ { 'end': str(end), 'endtime': str(endtime), 'ipaddress': str(ipaddress) } ]
json_string = json.dumps(row)
broker="192.168.1.193"
port=1883
client1= client.Client("onion") #create client object
client1.connect(broker,port) #establish connection
ret= client1.publish("omega",json_string)
client1.disconnect()
status = oledExp.write(json_string)
print(status)
References
https://community.hortonworks.com/articles/89455/ingesting-gps-data-from-onion-omega2-devices-with.html
https://github.com/tspannhw/onionomega-mqtt-micropython
https://github.com/mccollam/omega
https://github.com/micropython/micropython-lib/tree/master/umqtt.simple
https://docs.onion.io/omega2-docs/using-oled-expansion.html#using-the-libraries-2
https://iotbytes.wordpress.com/paho-mqtt-with-python/
https://www.kickstarter.com/projects/onion/omega2-5-iot-computer-with-wi-fi-powered-by-linux
01-28-2018
06:31 PM
2 Kudos
The Matrix Creator is an interesting multi-sensor hat that fits on a Raspberry Pi 3. The first step is to connect it, which is a simple snap; no soldering required. The specs are pretty impressive:
Xilinx Spartan 6 XC6SLX4 FPGA
Atmel Cortex-M3 ATSAM3S2 microcontroller
8 MEMS MP34DB02 digital microphone audio sensors
ST LSM9DS1 3D accelerometer, 3D gyroscope, 3D magnetometer IMU
ST HTS221 capacitive digital sensor for relative humidity and temperature
NXP MPL3115A2 precision pressure sensor with altimetry
Silicon Labs EM358X - 2.4 GHz IEEE 802.15.4 ZigBee
Sigma Designs ZM5202 - 868/908/921 MHz Z-Wave
Vishay TSOP573 - 38.0 kHz carrier IR receiver
Vishay VEML6070 UV light sensor
NXP PN512 NFC reader
Everloop - 35 RGBW LEDs
It runs on Raspbian Lite and installs via:
curl https://matrix-io.github.io/matrix-documentation/install.sh | sh
Our Apache NiFi Flow For Processing the Three Types of Data
Our Versioned Apache NiFi and MiniFi Flows
We tail the three files produced by the three example Python sensor readers. Both our MiniFi and Apache NiFi flows are very simple and documented above: tail data from the files as Python writes them, then send from MiniFi to Apache NiFi, which separates the files into different flows for future processing. We could create schemas, convert to JSON, merge the feeds as JSON, or store them in three different data stores, depending on what you want to do. This can be done on the edge or in Apache NiFi on a cluster. You could have MiniFi or NiFi trigger off specific values or ranges as the need arises. Or, like me, you can just store it for later use in your endless HDFS data lake.
Using Three Existing Examples: Getting Temperature, UV and IMU Values
python /home/pi/matrix-creator-malos/src/python_test/test_humidity.py
nohup ./humidity.sh &
fh = open("/opt/demo/logs/humidity.log", "a")
fh.writelines('{0}'.format(humidity_info))
fh.close
python /home/pi/matrix-creator-malos/src/python_test/test_uv.py
/opt/demo/logs/uv.log
python /home/pi/matrix-creator-malos/src/python_test/test_imu.py
/opt/demo/logs/imu.log
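The three test scripts above write simple "key: value" lines to their logs (shown under Example Data below). A minimal sketch, assuming that format, of turning such lines into a dict so they can travel onward as JSON; the function name is my own, not part of the Matrix tooling:

```python
import json

def parse_sensor_lines(text):
    """Parse 'key: value' sensor lines into a dict, coercing numbers,
    booleans, and quoted strings; anything else stays a raw string."""
    out = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(":")
        value = value.strip()
        if value in ("true", "false"):
            out[key.strip()] = (value == "true")
        elif value.startswith('"') and value.endswith('"'):
            out[key.strip()] = value.strip('"')
        else:
            try:
                out[key.strip()] = float(value)
            except ValueError:
                out[key.strip()] = value
    return out

sample = 'yaw: 141.655654907\noms_risk: "Low"\ntemperature_is_calibrated: true'
print(json.dumps(parse_sensor_lines(sample)))
```

In the actual flow this conversion could equally be done by a NiFi record reader once a schema is defined.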
/Volumes/seagate/Apps/minifi-toolkit-0.3.0/bin/config.sh transform $1 config.yml
scp config.yml pi@192.168.1.197:/opt/demo/minifi-0.3.0/conf
Example Data
imu.2603753-2604002.log
yaw: 141.655654907
roll: 1.66126561165
accel_x: -0.0261840820312
accel_y: 0.0283813476562
accel_z: 0.978576660156
gyro_x: -0.0672912597656
gyro_y: 2.06359863281
gyro_z: 1.33087158203
mag_x: 0.23982000351
mag_y: 0.189700007439
mag_z: -0.480480015278
uv.172512-172528.log
oms_risk: "Low"
humidity.29015-29074.log
temperature: 21.9526348114
temperature_is_calibrated: true
References:
https://medium.com/kkbankol-events/raspberry-pi-15662c3ca881
https://creator.matrix.one/#!/examples
https://github.com/matrix-io/matrix-creator-malos/blob/master/docs/pressure.md
http://community.matrix.one/t/how-to-record-with-pyaudio/357
https://matrix-io.github.io/matrix-documentation/matrix-core/examples/pytests/
https://github.com/matrix-io/matrix-creator-alexa-voice-services
https://matrix-io.github.io/matrix-documentation/matrix-hal/getting-started/installation/
https://matrix-io.github.io/matrix-documentation/setup/
https://www.matrix.one/products/creator
01-28-2018
05:30 PM
2 Kudos
So I found another affordable low-end platform from China for running MiniFi goodness. Since this machine has only 512 MB of RAM, I decided to use MiNiFi C++.
git clone https://github.com/apache/nifi-minifi-cpp.git
apt-get install cmake gcc g++ bison flex libcurl-dev librocksdb-dev librocksdb4.1 uuid-dev uuid libboost-all-dev libssl-dev libbz2-dev liblzma-dev doxygen -y
apt-get install -y libleveldb-dev
apt-get install -y libxml2
apt-get install libpython3-dev -y
apt-get install liblua5.1-0-dev -y
apt-get install libusb-1.0.0-0-dev libpng12-dev -y
apt-get install docker.io python-virtualenv -y
apt-get install libpython3-dev -y
apt-get install libgps-dev -y
apt-get install libpcap-dev -y
apt-get install cmake gcc g++ bison flex -y
./bootstrap.sh
bootstrap.sh presents an interactive UI for choosing which extensions to build.
cd nifi-minifi-cpp-0.3.0-source/
mkdir build
cd build
cmake ..
make
make package
apt-get install libssl-dev
-- The following features have been enabled:
* EXPRESSION LANGUAGE EXTENSIONS , This enables NiFi expression language
* HTTP CURL , This enables RESTProtocol, InvokeHTTP, and the HTTPClient for Site to Site
* ROCKSDB REPOS , This Enables persistent provenance, flowfile, and content repositories using RocksDB
* ARCHIVE EXTENSIONS , This Enables libarchive functionality including MergeContent, CompressContent, (Un)FocusArchiveEntry and ManipulateArchive.
* SCRIPTING EXTENSIONS , This enables scripting
-- The following OPTIONAL packages have been found:
* LibRt
* Git
* BZip2
* LibLZMA
* EXPAT
* Boost
* Doxygen
-- The following REQUIRED packages have been found:
* BISON
* FLEX
* CURL
* Threads
* PythonLibs
* ZLIB
* UUID
* OpenSSL
-- The following features have been disabled:
* CIVETWEB , This enables ListenHTTP
-- The following OPTIONAL packages have not been found:
* WinSock
* RocksDB
* LibArchive
* Nettle
* LibXml2
This requires installing some development packages and various libraries needed for networking, security, and devices.
From the interactive UI, I selected to add most of the goodies except a USB camera since I don't have one on this tiny machine.
The C++ version is different from (and much smaller than) the Java one. For starters, it has its own specific set of processors, some of which are pretty cool, like ones for USB camera image ingestion, TensorFlow processing, and other device goodness. You can browse the list at the PROCESSORS link below.
Since I built mine from the git clone of the master, I am running the 0.4.0 branch.
I could not install RocksDB on the Orange Pi.
There's some cool stuff for reporting status.
root@orangepizero:/opt/demo/nifi-minifi-cpp-0.4.0# bin/minifi
minifi minificontroller minifi.sh
root@orangepizero:/opt/demo/nifi-minifi-cpp-0.4.0# bin/minifi.sh start
Starting MiNiFi with PID 15831 and pid file /opt/demo/nifi-minifi-cpp-0.4.0/bin/.minifi.pid
root@orangepizero:/opt/demo/nifi-minifi-cpp-0.4.0# [2018-01-26 16:24:17.591] [main] [info] Using MINIFI_HOME=/opt/demo/nifi-minifi-cpp-0.4.0 from environment.
[2018-01-26 16:24:17.592] [org::apache::nifi::minifi::Properties] [info] Using configuration file located at /opt/demo/nifi-minifi-cpp-0.4.0/conf/minifi-log.properties
[2018-01-26 16:24:17.893] [main] [info] Loading FlowController
[2018-01-26 16:24:17.893] [org::apache::nifi::minifi::FlowController] [info] Load Flow Controller from file /opt/demo/nifi-minifi-cpp-0.4.0/conf/config.yml
[2018-01-26 16:24:17.895] [org::apache::nifi::minifi::FlowController] [info] Loaded root processor Group
[2018-01-26 16:24:17.895] [org::apache::nifi::minifi::FlowController] [info] Initializing timers
[2018-01-26 16:24:17.896] [org::apache::nifi::minifi::FlowController] [info] Loaded controller service provider
[2018-01-26 16:24:17.896] [org::apache::nifi::minifi::FlowController] [info] Loaded flow repository
[2018-01-26 16:24:17.896] [org::apache::nifi::minifi::FlowController] [info] Starting Flow Controller
[2018-01-26 16:24:17.898] [org::apache::nifi::minifi::core::controller::StandardControllerServiceProvider] [info] Enabling % controller services
[2018-01-26 16:24:17.899] [org::apache::nifi::minifi::c2::C2Agent] [info] Class is RESTSender
[2018-01-26 16:24:17.902] [org::apache::nifi::minifi::io::Socket] [error] Could not bind to socket
[2018-01-26 16:24:17.903] [org::apache::nifi::minifi::FlowController] [info] Started Flow Controller
[2018-01-26 16:24:17.903] [main] [info] MiNiFi started
root@orangepizero:/opt/demo/nifi-minifi-cpp-0.4.0/bin# ./minificontroller --list components
[2018-01-26 16:25:16.461] [controller] [info] MINIFI_HOME is not set; determining based on environment.
[2018-01-26 16:25:16.462] [org::apache::nifi::minifi::Properties] [info] Using configuration file located at /opt/demo/nifi-minifi-cpp-0.4.0/conf/minifi.properties
[2018-01-26 16:25:16.463] [org::apache::nifi::minifi::Properties] [info] Using configuration file located at /opt/demo/nifi-minifi-cpp-0.4.0/conf/minifi-log.properties
Components:
FlowController
root@orangepizero:/opt/demo/nifi-minifi-cpp-0.4.0/bin# ./minificontroller --list connections
[2018-01-26 16:25:32.850] [controller] [info] MINIFI_HOME is not set; determining based on environment.
[2018-01-26 16:25:32.851] [org::apache::nifi::minifi::Properties] [info] Using configuration file located at /opt/demo/nifi-minifi-cpp-0.4.0/conf/minifi.properties
[2018-01-26 16:25:32.852] [org::apache::nifi::minifi::Properties] [info] Using configuration file located at /opt/demo/nifi-minifi-cpp-0.4.0/conf/minifi-log.properties
Connection Names:
./minificontroller --updateflow "config yml"
root@orangepizero:/opt/demo/nifi-minifi-cpp-0.4.0/bin# ./minificontroller --getfull
[2018-01-26 16:26:13.296] [controller] [info] MINIFI_HOME is not set; determining based on environment.
[2018-01-26 16:26:13.297] [org::apache::nifi::minifi::Properties] [info] Using configuration file located at /opt/demo/nifi-minifi-cpp-0.4.0/conf/minifi.properties
[2018-01-26 16:26:13.298] [org::apache::nifi::minifi::Properties] [info] Using configuration file located at /opt/demo/nifi-minifi-cpp-0.4.0/conf/minifi-log.properties
0 are full
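The minificontroller calls shown above can also be scripted. A small sketch that only builds the command lines (the MINIFI_HOME path and the helper name are my own assumptions; actually running the commands requires a live MiNiFi C++ agent on the device):

```python
# Assumed install location, matching the paths in the logs above.
MINIFI_HOME = "/opt/demo/nifi-minifi-cpp-0.4.0"

def controller_cmd(action, *args):
    """Build a minificontroller command line (hypothetical helper);
    execution is left to the caller since it needs a running agent."""
    return [MINIFI_HOME + "/bin/minificontroller", "--" + action, *args]

cmd = controller_cmd("list", "components")
print(" ".join(cmd))
# On the device: import subprocess; subprocess.run(cmd, check=True)
```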
References
https://github.com/apache/nifi-minifi-cpp
https://cwiki.apache.org/confluence/display/MINIFI/C2+Design+Proposal
https://github.com/apache/nifi-minifi-cpp/blob/master/examples/BidirectionalSiteToSite/README.md
https://nifi.apache.org/minifi/getting-started.html
https://github.com/apache/nifi-minifi-cpp/blob/master/PROCESSORS.md#
https://github.com/apache/nifi-minifi-cpp/blob/master/EXPRESSIONS.md
https://cwiki.apache.org/confluence/display/MINIFI/Release+Notes#ReleaseNotes-Versioncpp-0.3.0
https://github.com/apache/nifi-minifi-cpp/blob/master/Extensions.md
To Customize C++ Builds
https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=74685143
Build Extensions with MiniFi C++
https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=74685988
MiniFi C++ System properties
https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=70256416