01-18-2019 09:29 PM - 3 Kudos
I need to parse Kerberos KDC log files (including the currently filling file) to find the users and hosts that are connecting. Using Grok in NiFi, we can parse out many different parts of these files and use them for filtering and alerting with ease. This is what many of the lines in the log file look like:

Jan 01 03:31:01 somenewserver-310 krb5kdc[28593](info): AS_REQ (4 etypes {18 17 16 23}) 192.168.237.220: ISSUE: authtime 1546278185, etypes {rep=18 tkt=16 ses=18}, nn/somenewserver-310.field.hortonworks.com@HWX.COM for nn/somenewserver-310.field.hortonworks.com@HWX.COM

State of the Tail Processor

Tail a File

We also have the option of using the GrokReader, described in an article linked below, to immediately convert matching records to output formats like JSON or Avro and then partition them into groups. We'll do that in a later article. In this one, we can get lines from the file via TailFile, read a list of files and fetch them one at a time, or generate a flow file for testing. Once we have some data, we start parsing it into different message types. These messages can then be used for alerting, routing, and permanent storage in Hive/Impala/HBase/Kudu/Druid/S3/object storage, etc. In the next step we will do some routing and alerting, followed by some natural language processing (NLP) and machine learning; then we'll use various tools to search, aggregate, query, catalog, report on and build dashboards from this type of log and others.
Example Output JSON Formatted
PREAUTH
{
"date" : "Jan 07 02:25:15",
"etypes" : "2 etypes {23 16}",
"MONTH" : "Jan",
"HOUR" : "02",
"emailhost" : "cloudera.net",
"TIME" : "02:25:15",
"pid" : "21546",
"loghost" : "KDCHOST1",
"kuser" : "krbtgt",
"message" : "Additional pre-authentication required",
"emailuser" : "user1",
"MINUTE" : "25",
"SECOND" : "15",
"LOGLEVEL" : "info",
"MONTHDAY" : "01",
"apphost" : "APP_HOST1",
"kuserhost" : "cloudera.net@cloudera.net"
}
ISSUE
{
"date" : "Jan 01 03:20:09",
"etypes" : "2 etypes {23 18}",
"MONTH" : "Jan",
"HOUR" : "03",
"BASE10NUM" : "1546330809",
"emailhost" : "cloudera.net",
"TIME" : "03:20:09",
"pid" : "24546",
"loghost" : "KDCHOST1",
"kuser" : "krbtgt",
"message" : "",
"emailuser" : "user1",
"authtime" : "1546330809",
"MINUTE" : "20",
"SECOND" : "09",
"etypes2" : "rep=23 tkt=18 ses=23",
"LOGLEVEL" : "info",
"MONTHDAY" : "01",
"apphost" : "APP_HOST1",
"kuserhost" : "cloudera.net@cloudera.net"
}
Grok Expressions
For Parsing Failure Records
%{SYSLOGTIMESTAMP:date} %{HOSTNAME:loghost} krb5kdc\[%{POSINT:pid}\]\(%{LOGLEVEL}\): %{GREEDYDATA:premessage}failure%{GREEDYDATA:postmessage}
For Parsing PREAUTH Records
%{SYSLOGTIMESTAMP:date} %{HOSTNAME:loghost} krb5kdc\[%{POSINT:pid}\]\(%{LOGLEVEL}\): AS_REQ \(%{GREEDYDATA:etypes}\) %{GREEDYDATA:apphost}: NEEDED_PREAUTH: %{USERNAME:emailuser}@%{HOSTNAME:emailhost} for %{GREEDYDATA:kuser}/%{GREEDYDATA:kuserhost}, %{GREEDYDATA:message}
For Parsing ISSUE Records
%{SYSLOGTIMESTAMP:date} %{HOSTNAME:loghost} krb5kdc\[%{POSINT:pid}\]\(%{LOGLEVEL}\): AS_REQ \(%{GREEDYDATA:etypes}\) %{GREEDYDATA:apphost}: ISSUE: authtime %{NUMBER:authtime}, etypes \{%{GREEDYDATA:etypes2}\}, %{USERNAME:emailuser}@%{HOSTNAME:emailhost} for %{GREEDYDATA:kuser}/%{GREEDYDATA:kuserhost}%{GREEDYDATA:message}
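To sanity-check the ISSUE pattern outside of NiFi, the same structure can be approximated as a plain Python regular expression. This is a rough sketch of my own, not a real Grok implementation; the named groups mirror the Grok field names above, but the expression is far less forgiving than the actual Grok patterns:

import re

# Rough Python approximation of the ISSUE Grok expression above.
# Named groups mirror the Grok field names.
ISSUE_RE = re.compile(
    r'(?P<date>\w{3} \d{2} \d{2}:\d{2}:\d{2}) '
    r'(?P<loghost>\S+) krb5kdc\[(?P<pid>\d+)\]\((?P<loglevel>\w+)\): '
    r'AS_REQ \((?P<etypes>[^)]+)\) (?P<apphost>\S+): ISSUE: '
    r'authtime (?P<authtime>\d+), etypes \{(?P<etypes2>[^}]+)\}, '
    r'(?P<emailuser>[^@]+)@(?P<emailhost>\S+) for '
    r'(?P<kuser>[^/]+)/(?P<kuserhost>\S+)'
)

line = ('Jan 01 03:31:01 somenewserver-310 krb5kdc[28593](info): '
        'AS_REQ (4 etypes {18 17 16 23}) 192.168.237.220: ISSUE: '
        'authtime 1546278185, etypes {rep=18 tkt=16 ses=18}, '
        'nn/somenewserver-310.field.hortonworks.com@HWX.COM for '
        'nn/somenewserver-310.field.hortonworks.com@HWX.COM')

match = ISSUE_RE.match(line)
if match:
    print(match.groupdict())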
Resources:
For Testing Grok Against Your Files
http://grokdebug.herokuapp.com/
A Great Article on Using GrokReader for Record Oriented Processing
https://community.hortonworks.com/articles/131320/using-partitionrecord-grokreaderjsonwriter-to-pars.html

More About Grok
https://datahovel.com/2018/07/
https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-record-serialization-services-nar/1.7.1/org.apache.nifi.grok.GrokReader/additionalDetails.html
http://grokconstructor.appspot.com/do/automatic?example=0
https://gist.github.com/acobaugh/5aecffbaaa593d80022b3534e5363a2d
01-07-2019 08:46 PM - 6 Kudos
Ingesting Drone Data From DJI Ryze Tello Drones

Part 1 - Setup and Practice

In Part 1, we will set up our drone and our communication environment, capture the data and do an initial analysis. We will eventually grab the live video stream for object detection, real-time flight control and real-time data ingest of photos, videos and sensor readings. We will have Apache NiFi react to live situations facing the drone and have it issue flight commands via UDP. In this initial section, we will control the drone with Python, which can be triggered by NiFi. Apache NiFi will ingest log data that is stored as CSV files on a NiFi node connected to the drone's WiFi. This will eventually move to a dedicated embedded device running MiniFi.

This is a small personal drone with less than 13 minutes of flight time per battery. It is not a commercial drone, but it gives you an idea of what you can do with drones.

Drone Live Communications for Sensor Readings and Drone Control

You must connect to the drone's WiFi network, which will be named Tello(Something).

Send Commands
Tello IP: 192.168.10.1
UDP PORT: 8889

Receive Tello Video Stream
Tello IP: 192.168.10.1
UDP Server: 0.0.0.0
UDP PORT: 11111

Example Install (a quick Python sketch of the UDP command channel follows these steps):

pip3.6 install tellopy
git clone https://github.com/hanyazou/TelloPy
pip3.6 install av
pip3.6 install opencv-python
pip3.6 install image
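Before running the TelloPy example below, it is worth seeing how simple the control channel is. This sketch is my own illustration, not part of TelloPy; it sends the Tello SDK's plain-text "command" and "battery?" strings over the UDP command port listed above (the local port number is an arbitrary choice):

import socket

TELLO_ADDR = ('192.168.10.1', 8889)  # command port from the article

# Bind a local UDP socket; the Tello replies to the sending port.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(('', 9000))
sock.settimeout(5.0)

def send_command(cmd):
    """Send one text command and wait for the drone's reply."""
    sock.sendto(cmd.encode('utf-8'), TELLO_ADDR)
    try:
        data, _ = sock.recvfrom(1024)
        return data.decode('utf-8', errors='replace')
    except socket.timeout:
        return 'no response'

print(send_command('command'))   # switch the drone into SDK mode
print(send_command('battery?'))  # query battery state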
python3.6 -m tellopy.examples.video_effect

Example Run Video: https://www.youtube.com/watch?v=mYbStkcnhsk&t=0s&list=PL-7XqvSmQqfTSihuoIP_ZAnN7mFIHkZ_e&index=18

Example Flight Log: Tello-flight log.pdf

Let's build a quick ingest with Apache NiFi 1.8. As our first step, we use a local Apache NiFi to read the CSVs from a drone run locally. We read the CSVs from the Tello logging directory, add a schema definition and query them. We have a controller service for CSV processing, using the schema posted below and the Jackson CSV parser, and we ignore the header as it has invalid characters. We use a QueryRecord processor to find whether the position in Z has changed:

SELECT * FROM FLOWFILE WHERE mvo_pos_z IS NOT NULL AND CAST(mvo_pos_z AS FLOAT) <> 0.0

We also convert from CSV to Apache Avro format for further processing. Valid records are sent over HTTP(S) Site-to-Site to a cloud-hosted Apache NiFi cluster for further processing, which saves them to an HBase table. Our data didn't have a record identifier, so I use the UpdateRecord processor to create one and add it to the data. I updated the schema to have this field (with a default, allowing nulls). As you can see, it's pretty easy to store data in HBase.

Schema:

{ "type" : "record", "name" : "drone",
"fields" : [
{ "name" : "drone_rec_id", "type" : [ "string", "null" ], "default": "1000" },
{ "name" : "mvo_vel_x", "type" : ["double","null"], "default": "0.00" },
{ "name" : "mvo_vel_y", "type" : ["string","null"], "default": "0.00" },
{ "name" : "mvo_vel_z", "type" : ["double","null"], "default": "0.00" },
{ "name" : "mvo_pos_x", "type" : ["string","null"], "default": "0.00" },
{ "name" : "mvo_pos_y", "type" : ["double","null"], "default": "0.00"},
{ "name" : "mvo_pos_z", "type" : ["string","null"], "default": "0.00" },
{ "name" : "imu_acc_x", "type" : ["double","null"], "default": "0.00" },
{ "name" : "imu_acc_y", "type" : ["double","null"], "default": "0.00" },
{ "name" : "imu_acc_z", "type" : ["double","null"], "default": "0.00" },
{ "name" : "imu_gyro_x", "type" : ["double","null"], "default": "0.00" },
{ "name" : "imu_gyro_y", "type" : ["double","null"], "default": "0.00" },
{ "name" : "imu_gyro_z", "type" : ["double","null"], "default": "0.00" },
{ "name" : "imu_q0", "type" : ["double","null"], "default": "0.00" },
{ "name" : "imu_q1", "type" : ["double","null"], "default": "0.00" },
{ "name" : "imu_q2", "type" : ["double","null"], "default": "0.00" },
{ "name" : "self_q3", "type" : ["double","null"], "default": "0.00" },
{ "name" : "imu_vg_x", "type" : ["double","null"], "default": "0.00" },
{ "name" : "imu_vg_y", "type" : ["double","null"], "default": "0.00" },
{ "name" : "imu_vg_z", "type" : ["double","null"], "default": "0.00" } ] }
The updated schema now has a record id; the original schema derived from the raw data does not.

Store the Data in an HBase Table

Soon we will be storing in Kudu, Impala, Hive, Druid and S3.

create 'drone', 'drone'

Source: We are using the TelloPy interface. You need to clone this GitHub repository and drop in the files from nifi-drone.

https://github.com/hanyazou/TelloPy/
https://github.com/tspannhw/nifi-drone

Apache NiFi Flows: dronelocal.xml dronecloud.xml

References:
https://github.com/hanyazou/TelloPy
https://gobot.io/blog/2018/04/20/hello-tello-hacking-drones-with-go/
https://github.com/grofattila/dji-tello
https://github.com/dbaldwin/droneblocks-tello-python
https://medium.com/@makerhacks/programming-the-ryze-dji-tello-with-python-eecd56fc2c27
https://github.com/Ubotica/telloCV/
https://www.instructables.com/id/Ultimate-Intelligent-Fully-Automatic-Drone-Robot-w/
https://github.com/hybridgroup/gobot/tree/master/platforms/dji/tello
https://www.ryzerobotics.com/tello
https://dl-cdn.ryzerobotics.com/downloads/Tello/20180404/Tello_User_Manual_V1.2_EN.pdf
https://dl-cdn.ryzerobotics.com/downloads/Tello/20180212/Tello+Quick+Start+Guide_V1.2+multi.pdf
https://dl-cdn.ryzerobotics.com/downloads/tello/20180910/Tello%20Scratch%20README.pdf
https://dl-cdn.ryzerobotics.com/downloads/tello/20180910/scratch0907.7z
https://www.ryzerobotics.com/tello/downloads
https://www.hackster.io/econnie323/alexa-voice-controlled-tello-drone-760615
https://tellopilots.com/forums/tello-development.8/
https://medium.com/@swalters/dji-ryze-tello-drone-gets-reverse-engineered-46a65d83e6b5
http://www.fabriziomarini.com/2018/04/java-udp-drone-tello.html?m=1
https://github.com/microlinux/tello/blob/master/tello.py
https://github.com/hybridgroup/gophercon-2018/blob/master/drone/tello/README.md
https://tellopilots.com/threads/object-tracking-with-tello.1480/
https://github.com/gnamingo/jTello/blob/master/JTello.java
https://github.com/microlinux/tello/blob/master/README.md
https://steemit.com/python/@makerhacks/programming-the-ryze-dji-tello-with-python
https://github.com/dji-sdk/Tello-Python
https://github.com/dji-sdk/Tello-Python/tree/master/Tello_Video_With_Pose_Recognition
https://github.com/DaWelter/h264decoder
https://github.com/twilightdema/h264j
http://jcodec.org/
https://github.com/cisco/openh264
https://github.com/hanyazou/TelloPy/blob/develop-0.7.0/tellopy/examples/video_effect.py
12-29-2018 05:49 AM - 4 Kudos
Implementing Streaming Machine Learning and Deep Learning In Production, Part 1

After we have done our data exploration with Apache Zeppelin, Hortonworks Data Analytics Studio and other data science notebooks and tools, we start building iterations of ever-improving models that need to be used in live environments. These will need to run at scale and score millions of records in real-time streams. The models can be in various frameworks, versions and types, with many options for the data they require. There are a number of things we need to think about when doing this.

Model Deployment Options

Apache Spark
Apache Storm (Hortonworks Streaming Analytics Manager - SAM)
Apache Kafka Streams
Apache NiFi
YARN 3.1
YARN Submarine
TensorFlow Serving on YARN
Cloudera Data Science Workbench

Requirements

Classification REST API
Security
Automation
Data Lineage
Schema Versioning, REST API and Management
Data Provenance
Scripting
Integration with Kafka
Containerized Services
Support for Docker containers running on YARN
Support for Dockerized Spark jobs
Model Registry
Scalability
Data Variety
Data and Storage Format Flexibility
Handling media types such as images, sound and video

Required Elements

Apache NiFi 1.8.0
Apache Kafka 2.0
Apache Kafka Streams 2.0
Apache Atlas 1.0.0
Apache Ranger 1.2.0
Apache Knox 1.0
Hortonworks Streams Messaging Manager 1.2.0
Hortonworks Schema Registry 0.5.2
NiFi Registry 0.2.0
Apache Hadoop 3.1
Apache YARN 3.1+
Apache HDFS or Amazon S3
Apache Druid 0.12.1
Apache HBase 2.0

Apache Spark - Apache NiFi

There are a number of options for running machine learning models in production via Apache NiFi. I have used these methods:
Apache NiFi to Apache Spark Integration via Kafka and Spark Streaming
Apache NiFi to Apache Spark Integration via Kafka and Spark Structured Streaming
Apache NiFi to Apache Spark Integration via Apache Livy

https://community.hortonworks.com/content/kbentry/174105/hdp-264-hdf-31-apache-spark-structured-streaming-i.html
https://community.hortonworks.com/articles/174105/hdp-264-hdf-31-apache-spark-structured-streaming-i.html
https://community.hortonworks.com/content/kbentry/171787/hdf-31-executing-apache-spark-via-executesparkinte.html

Hadoop - YARN 3.1 - No Docker - No Spark

We can deploy deep learning models and run classification (as well as training) natively on YARN.

https://community.hortonworks.com/content/kbentry/222242/running-apache-mxnet-deep-learning-on-yarn-31-hdp.html
https://community.hortonworks.com/articles/224268/running-tensorflow-on-yarn-31-with-or-without-gpu.html

Apache Kafka Streams

Kafka Streams has full integration with platform services including Schema Registry, Ranger and Ambari.

Apache NiFi Native Java Processors for Classification

We can use a custom processor in Java that runs as a native part of the dataflow.

https://community.hortonworks.com/content/kbentry/116803/building-a-custom-processor-in-apache-nifi-12-for.html
https://github.com/tspannhw/nifi-tensorflow-processor
https://community.hortonworks.com/articles/229215/apache-nifi-processor-for-apache-mxnet-ssd-single.html
https://github.com/tspannhw/nifi-mxnetinference-processor

Apache NiFi Integration with a Model Server Native to a Framework

Apache MXNet has an open source model server with a full REST API that can easily be integrated with Apache NiFi.

https://community.hortonworks.com/articles/155435/using-the-new-mxnet-model-server.html
https://community.hortonworks.com/articles/223916/posting-images-with-apache-nifi-17-and-a-custom-pr.html
https://community.hortonworks.com/articles/177232/apache-deep-learning-101-processing-apache-mxnet-m.html

Running the Apache MXNet model server is easy:

mxnet-model-server --models SSD=resnet50_ssd_model.model --service ssd_service.py --port 9998

TensorFlow also has a model server that supports gRPC and REST (a sketch of the REST call follows the SAM section below): https://www.tensorflow.org/serving/api_rest

Hortonworks Streaming Analytics Manager (SAM)

SAM supports running machine learning models exported as PMML as part of a flow.

https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.3.1/getting-started-with-streaming-analytics/content/export_the_model_into_sam%27s_model_registry.html
https://hortonworks.com/blog/part-4-sams-stream-builder-building-complex-stream-analytics-apps-without-code/

You can score the model in a fully graphical manner:
https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.3.1/getting-started-with-streaming-analytics/content/score_the_model_using_the_pmml_processor_and_alert.html
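To make the model-server option concrete, here is a minimal sketch of calling TensorFlow Serving's documented REST predict endpoint from Python with requests. The host, port and model name are placeholders, and the input shape depends entirely on your model:

import requests

# TensorFlow Serving (see the api_rest link above) exposes
# POST /v1/models/<name>:predict. Host, port and model name below
# are placeholders for your own deployment.
url = 'http://localhost:8501/v1/models/my_model:predict'

# 'instances' is a batch of model inputs; the shape depends on the model.
payload = {'instances': [[1.0, 2.0, 5.0]]}

response = requests.post(url, json=payload, timeout=10)
response.raise_for_status()
print(response.json()['predictions'])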
Deep Work on Model Governance and Integration with Apache Atlas:

Customizing Atlas (Part1): Model governance, traceability and registry
Generalized Framework to Deploy Models and Integrate Apache Atlas for Model Governance
Customizing Atlas (Part2): Deep source metadata & embedded entities
Customizing Atlas (Part3): Lineage beyond Hadoop, including reports & emails

References:
https://conferences.oreilly.com/strata/strata-ny-2018/public/schedule/detail/68140
https://apachecon.dukecon.org/acna/2018/#/scheduledEvent/7058e0d4f5ab28836
https://dataworkssummit.com/berlin-2018/session/iot-with-apache-mxnet-and-apache-nifi-and-minifi/
https://dataworkssummit.com/berlin-2018/session/apache-deep-learning-101/
https://dataworkssummit.com/san-jose-2018/session/open-source-computer-vision-with-tensorflow-apache-minifi-apache-nifi-opencv-apache-tika-and-python-2/
https://www.slideshare.net/bunkertor/apache-deep-learning-201-philly-open-source
https://www.slideshare.net/bunkertor/running-apache-nifi-with-apache-spark-integration-options
12-29-2018 06:15 AM
An example of an image.
12-19-2018 02:07 PM - 7 Kudos
First off, this was an amazing year for Big Data, IoT, Kafka, AI, Streaming, Machine Learning and Deep Learning. So many cool events, updates, new products, new projects, new libraries and community growth. I've seen a lot of people adopt and grow Big Data and streaming projects from nothing. Using the power of open source and the tools made available by Apache, companies are growing with the help of trusted partners and a community of engineers and users.

I had a fun year travelling to various conferences showing how to use open source technologies for IoT, Deep Learning, Machine Learning, Streaming, Big Data, Computer Vision and microservices. First up, close to home, was IoT Fusion in Philly: http://iotfusion.net/session/enterprise-iiot-edge-processing-with-apache-nifi-minifi-and-deep-learning/ It was a very cool event run by my friends at Chariot Solutions. We focused on some great technology from 2018, including MiniFi, Apache NiFi, TensorFlow and Apache MXNet.

After that I headed to Europe and spoke at an awesome meetup in Prague: https://www.meetup.com/futureofdata-prague/events/248994531/ https://www.slideshare.net/bunkertor/deep-learning-on-hdp-2018-prague. Then a short train ride up to Berlin for a meetup: https://www.meetup.com/futureofdata-berlin/events/249190513/ https://www.slideshare.net/bunkertor/minifi-and-apache-nifi-iot-in-berlin-germany-2018. Then I was in Berlin for DataWorks Summit and did two talks: https://dataworkssummit.com/berlin-2018/session/iot-with-apache-mxnet-and-apache-nifi-and-minifi/ and https://dataworkssummit.com/session/apache-deep-learning-101/. https://dzone.com/articles/apache-nifi-at-dws

In June it was off to San Jose for the big DataWorks Summit, where I had a talk on computer vision: https://dataworkssummit.com/san-jose-2018/session/open-source-computer-vision-with-tensorflow-apache-minifi-apache-nifi-opencv-apache-tika-and-python-2/ https://www.slideshare.net/Hadoop_Summit/open-source-computer-vision-with-tensorflow-apache-minifi-apache-nifi-opencv-apache-tika-and-python I also helped out with some TensorFlow in Apache Zeppelin at the Deep Learning Crash Course: https://dataworkssummit.com/san-jose-2018/crash-courses/.

Come August, I joined the IoT webinar https://hortonworks.com/webinar/iot-war-stories-challenges-solutions-best-practices/, which was a lot of fun, and then got to speak with the awesome data scientist John Kuchmek at the NJ shore.

Fall quickly followed and it was time for Strata in NYC: https://conferences.oreilly.com/strata/strata-ny-2018/public/schedule/speaker/185963 Again I spoke on IoT with AI: https://github.com/tspannhw/StrataNYC2018 After Strata it was a quick flight up to Montreal for ApacheCon. It was great seeing all the Apache speakers and my friends from Apache MXNet. https://apachecon.dukecon.org/acna/2018/#/scheduledEvent/7058e0d4f5ab28836
https://feathercast.apache.org/2018/10/02/apache-deep-learning-101-timothy-spann/

In October I was down in Orlando with my AI buddies AppOrchid to speak about AI, IoT and Big Data for utilities: http://www.utilityanalyticsweek.com/cr3ativspeaker/tim-spann/ https://www.slideshare.net/bunkertor/the-best-of-both-worlds-delivering-digital-transformation

In November I got to be the first speaker at Philadelphia's first open source conference, hosted by Comcast! https://phillyopensource.splashthat.com/ What an amazing event, and awesome people running it. Hats off to the Comcast team for supporting open source and having great projects; they were also big sponsors of ApacheCon.

To round it out, in December I reunited with my AI friends on a webinar for utilities: https://utilityanalytics.com/webinar/providing-ai-solutions-to-break-the-data-silos-and-accelerate-utilities-into-the-forefront-of-data-driven-decisions/.

The year was great for meetups and I got to talk at a few around the world; plus, Milind's and my meetup in Princeton grew to 1,100 members! Check us out: https://www.meetup.com/futureofdata-princeton/ My friend Paul restarted the https://www.meetup.com/futureofdata-philadelphia/ meetup and we will do some fun stuff in 2019 around the new Camden tech region.

Nov 14 - Rafi from IBM: https://www.meetup.com/futureofdata-princeton/events/254821251/
Oct 2 - Mehul (https://infinity-services.com/) and Abdhul (http://cloudsegue.com/) spoke, hosted by TRAC Intermodal, around blockchain: https://www.meetup.com/futureofdata-princeton/events/252716511/
Sept 6 - Thomas spoke on MXNet https://www.meetup.com/futureofdata-princeton/events/253312744/
June 28 - I spoke on NiFi and Blockchain: https://www.meetup.com/futureofdata-princeton/events/249163765/ Thanks to Prasad (Airisdata); a joint meetup in Hamilton with the NJ_Blockchain meetup.
May 8 - I spoke on NiFi and IoT in NYC: https://www.meetup.com/futureofdata-princeton/events/250304839/
April 17 - I spoke on NiFi in Berlin https://www.meetup.com/futureofdata-berlin/events/249190513/
April 12 - I spoke on NiFi and Deep Learning at FoD Prague: https://www.meetup.com/futureofdata-prague/events/248994531/
Feb 13 - Big Mardi Gras celebration at TRAC Intermodal

We have a meetup scheduled for January on blockchain in Woodbridge; ChainNinja is going to broadcast it: https://www.meetup.com/futureofdata-princeton/events/255404291/

From meetups, webinars, workshops and conferences, I had a few presentations:
https://www.slideshare.net/bunkertor/machine-learning-and-deep-learning-on-hdp-301-and-hdf-32
https://www.slideshare.net/bunkertor/apache-deep-learning-201-philly-open-source
https://www.slideshare.net/bunkertor/the-best-of-both-worlds-delivering-digital-transformation
https://www.slideshare.net/bunkertor/apache-deep-learning-101-apachecon-montreal-2018-v031
https://www.slideshare.net/bunkertor/handson-deep-dive-with-minifi-and-apache-mxnet
https://www.slideshare.net/bunkertor/open-source-predictive-analytics-pipeline-with-apache-nifi-and-minifi-princeton
https://www.slideshare.net/bunkertor/open-computer-vision-with-opencv-apache-nifi-tensorflow-python
https://www.slideshare.net/bunkertor/iot-edge-processing-with-apache-nifi-and-minifi-and-apache-mxnet-for-iot-ny-2018
https://www.slideshare.net/bunkertor/apache-mxnet-for-iot-with-apache-nifi
https://www.slideshare.net/bunkertor/apache-deep-learning-101-dws-berlin-2018
https://www.slideshare.net/bunkertor/minifi-and-apache-nifi-iot-in-berlin-germany-2018
https://www.slideshare.net/bunkertor/deep-learning-on-hdp-2018-prague
https://www.slideshare.net/bunkertor/enterprise-iiot-edge-processing-with-apache-nifi-92970386
https://www.slideshare.net/bunkertor/running-apache-nifi-with-apache-spark-integration-options
https://www.slideshare.net/bunkertor/hdf-31-an-introduction-to-new-features
I wrote a few articles for HCC in 2018:
https://community.hortonworks.com/articles/167193/building-and-running-minifi-cpp-in-orangepi-zero.html
https://community.hortonworks.com/articles/171787/hdf-31-executing-apache-spark-via-executesparkinte.html
https://community.hortonworks.com/articles/171893/hdf-31-executing-apache-spark-via-executesparkinte-1.html
https://community.hortonworks.com/articles/171960/using-apache-mxnet-on-an-apache-nifi-15-instance-w.html
https://community.hortonworks.com/articles/174227/apache-deep-learning-101-using-apache-mxnet-on-an.html
https://community.hortonworks.com/articles/174399/apache-deep-learning-101-using-apache-mxnet-on-apa.html
https://community.hortonworks.com/articles/176784/deep-learning-101-using-apache-mxnet-in-dsx-notebo.html
https://community.hortonworks.com/articles/176789/apache-deep-learning-101-using-apache-mxnet-in-apa.html
https://community.hortonworks.com/articles/176932/apache-deep-learning-101-using-apache-mxnet-on-the.html
https://community.hortonworks.com/articles/177232/apache-deep-learning-101-processing-apache-mxnet-m.html
https://community.hortonworks.com/articles/177663/apache-livy-apache-nifi-apache-spark-executing-sca.html
https://community.hortonworks.com/articles/178498/integrating-tensorflow-16-image-labelling-with-hdf.html
https://community.hortonworks.com/articles/183806/using-a-tensorflow-person-blocker-with-apache-nifi.html
https://community.hortonworks.com/articles/198912/ingesting-apache-mxnet-gluon-deep-learning-results.html
https://community.hortonworks.com/articles/222242/running-apache-mxnet-deep-learning-on-yarn-31-hdp.html
https://community.hortonworks.com/articles/193868/integrating-keras-tensorflow-yolov3-into-apache-ni.html
https://community.hortonworks.com/articles/198855/executing-tensorflow-classifications-from-apache-n.html
https://community.hortonworks.com/articles/198939/using-apache-mxnet-gluoncv-with-apache-nifi-for-de.html
https://community.hortonworks.com/articles/215079/iot-edge-processing-with-deep-learning-on-hdf-32-a.html
https://community.hortonworks.com/articles/224268/running-tensorflow-on-yarn-31-with-or-without-gpu.html
https://community.hortonworks.com/articles/203638/ingesting-multiple-iot-devices-with-apache-nifi-17.html
https://community.hortonworks.com/articles/227194/ingesting-and-analyzing-street-camera-data-from-ma.html
https://community.hortonworks.com/articles/207858/more-devops-for-hdf-apache-nifi-and-friends.html
https://community.hortonworks.com/articles/163776/parsing-any-document-with-apache-nifi-15-with-apac.html
https://community.hortonworks.com/articles/167187/provenance-site-to-site-reporting.html
https://community.hortonworks.com/articles/177370/extracting-html-from-pdf-excel-and-word-documents.html
https://community.hortonworks.com/articles/177733/apache-nifi-processor-building-a-sql-ddl-schema-fr.html
https://community.hortonworks.com/articles/192848/updating-the-apache-opennlp-community-apache-nifi.html
https://community.hortonworks.com/articles/193822/parsing-web-pages-for-images-with-apache-nifi.html
https://community.hortonworks.com/articles/193835/detecting-language-with-apache-nifi.html
https://community.hortonworks.com/articles/189213/etl-with-lookups-with-apache-hbase-and-apache-nifi.html
https://community.hortonworks.com/articles/222605/converting-powerpoint-presentations-into-french-fr.html
https://community.hortonworks.com/articles/223840/properties-file-lookup-augmentation-of-data-flow-i.html
https://community.hortonworks.com/articles/155604/iot-ingesting-camera-data-from-nanopi-duo-devices.html
https://community.hortonworks.com/articles/155606/iot-ingesting-gps-data-from-odroid-xu4-devices-wit.html
https://community.hortonworks.com/articles/161761/new-features-in-apache-nifi-15-apache-nifi-registr.html
https://community.hortonworks.com/articles/167196/ingesting-data-from-the-matrix-creator-with-minifi.html
https://community.hortonworks.com/articles/167199/sending-messages-and-displaying-them-on-an-oled-sc.html
https://community.hortonworks.com/articles/173818/hdp-264-hdf-31-apache-spark-streaming-integration.html
https://community.hortonworks.com/articles/174105/hdp-264-hdf-31-apache-spark-structured-streaming-i.html
https://community.hortonworks.com/articles/174538/apache-deep-learning-101-using-apache-mxnet-with-h.html
https://community.hortonworks.com/articles/177137/ingesting-flight-data-ads-b-usb-receiver-with-apac.html
https://community.hortonworks.com/articles/177256/spring-boot-20-on-acid-integrating-rest-microservi.html
https://community.hortonworks.com/articles/177301/big-data-devops-apache-nifi-flow-versioning-and-au.html
https://community.hortonworks.com/articles/178196/integrating-lucene-geo-gazetteer-for-geo-parsing-w.html
https://community.hortonworks.com/articles/178510/integration-apache-opennlp-184-into-apache-nifi-15.html
https://community.hortonworks.com/articles/182850/vision-thing.html
https://community.hortonworks.com/articles/182984/vision-thing-part-2-processing-capturing-and-displ.html
https://community.hortonworks.com/articles/183151/enterprise-iiot-edge-processing-with-apache-nifi-m.html
https://community.hortonworks.com/articles/183217/devops-backing-up-apache-nifi-registry-flows.html
https://community.hortonworks.com/articles/183474/iot-using-minifi-java-agent-to-send-mqtt-messages.html
https://community.hortonworks.com/articles/185079/publishing-and-consuming-jms-messages-from-tibco-e.html
https://community.hortonworks.com/articles/189514/converting-csv-files-to-apache-hive-tables-with-ap.html
https://community.hortonworks.com/articles/189630/tracking-air-quality-with-hdp-and-hdfi-part-1-apac.html
https://community.hortonworks.com/articles/189735/automating-social-media-sending-tweets-with-apache.html
https://community.hortonworks.com/articles/190765/processing-real-time-social-media-twitter-with-apa.html
https://community.hortonworks.com/articles/191146/accessing-feeds-from-etherdelta-on-trades-funds-bu.html
https://community.hortonworks.com/articles/191255/ethereum-accessing-feeds-from-etherscan-on-volume.html
https://community.hortonworks.com/articles/191658/devops-tips-using-the-apache-nifi-toolkit-with-apa.html
https://community.hortonworks.com/articles/196963/scanning-documents-into-data-lakes-via-tesseract-p.html
https://community.hortonworks.com/articles/199566/ingesting-infura-rest-apis-to-access-the-ethereum.html
https://community.hortonworks.com/articles/199570/ingest-btccom-and-blockchaincom-data-via-apache-ni.html
https://community.hortonworks.com/articles/219777/iot-edge-processing-with-deep-learning-on-hdf-32-a-3.html
https://community.hortonworks.com/articles/215271/iot-edge-processing-with-deep-learning-on-hdf-32-a-2.html
https://community.hortonworks.com/articles/224556/building-a-custom-apache-nifi-operations-dashboard-1.html
https://community.hortonworks.com/articles/224554/building-a-custom-apache-nifi-operations-dashboard.html
https://community.hortonworks.com/articles/222367/using-apache-nifi-with-apache-mxnet-gluoncv-for-yo.html
https://community.hortonworks.com/articles/223916/posting-images-with-apache-nifi-17-and-a-custom-pr.html
https://community.hortonworks.com/articles/202236/integrating-apache-mxnet-model-server-with-apache.html
https://community.hortonworks.com/articles/191259/integrating-darknet-yolov3-into-apache-nifi-workfl.html
https://community.hortonworks.com/articles/215258/iot-edge-processing-with-deep-learning-on-hdf-32-a-1.html
https://community.hortonworks.com/articles/177349/big-data-devops-apache-nifi-hwx-schema-registry-sc.html
https://community.hortonworks.com/articles/227560/real-time-stock-processing-with-apache-nifi-and-ap.html
https://community.hortonworks.com/articles/228874/iot-edge-use-cases-with-apache-kafka-and-apache-ni.html
https://community.hortonworks.com/content/kbentry/229215/apache-nifi-processor-for-apache-mxnet-ssd-single.html
https://community.hortonworks.com/articles/229305/using-apache-nifi-for-speech-processing-speech-to.html
https://community.hortonworks.com/articles/229522/iot-series-sensors-utilizing-breakout-garden-hat-p.html
https://community.hortonworks.com/articles/232136/iot-series-sensors-utilizing-breakout-garden-hat-p-1.html
I wrote a few custom processors in Java for Apache NiFi (all work in NiFi 1.8.0):
https://github.com/tspannhw/nifi-mxnetinference-processor
https://github.com/tspannhw/nifi-extracttext-processor
https://github.com/tspannhw/nifi-langdetect-processor
https://github.com/tspannhw/nifi-attributecleaner-processor
https://github.com/tspannhw/nifi-convertjsontoddl-processor
https://github.com/tspannhw/nifi-postimage-processor
https://github.com/tspannhw/GetWebCamera
https://github.com/tspannhw/nifi-imageextractor-processor
https://github.com/tspannhw/nifi-puttwitter-processor
https://github.com/tspannhw/nifi-tensorflow-processor
I wrote a few other scripts and utilities outside of NiFi as well:
https://github.com/tspannhw/nifi-registry-github
https://github.com/tspannhw/python-scripts
https://github.com/tspannhw/operations-dashboard
https://github.com/tspannhw/stocks-nifi-kafka
https://github.com/tspannhw/ApacheDeepLearning201
https://github.com/tspannhw/nifi-smartplug
https://github.com/tspannhw/yolo3-keras-tensorflow
https://github.com/tspannhw/TensorflowOnYARN
https://github.com/tspannhw/nifi-gluoncv-yolo3
https://github.com/tspannhw/ApacheDeepLearning101
https://github.com/tspannhw/StrataNYC2018
https://github.com/tspannhw/UsingGluonCV
https://github.com/tspannhw/OpenSourceComputerVision
https://github.com/tspannhw/DWS-DeepLearning-CrashCourse
https://github.com/tspannhw/IoTFusion2018Talk
https://github.com/tspannhw/nifi-tesseract-python
https://github.com/tspannhw/nifi-yolo3
https://github.com/tspannhw/mxnet-for-iot
https://github.com/tspannhw/livysparkjob
I started to do some video content in 2018; look for more of this than you will ever want to watch coming in 2019.
https://www.youtube.com/watch?v=u4NZHBDyf54&list=PL-7XqvSmQqfTEtNbnITDrgycfn3uny8LM&index=5&t=10s
https://www.youtube.com/watch?v=5w6rV7562xM&list=PL-7XqvSmQqfTEtNbnITDrgycfn3uny8LM&index=7&t=1497s
https://www.youtube.com/watch?v=bOfSnNVum_M&list=PL-7XqvSmQqfTEtNbnITDrgycfn3uny8LM&index=8&t=409s
https://www.youtube.com/watch?v=ksDKNp6Z4BE&list=PL-7XqvSmQqfTEtNbnITDrgycfn3uny8LM&index=9&t=39s
https://www.youtube.com/watch?v=J23wgoIknP0&list=PL-7XqvSmQqfTEtNbnITDrgycfn3uny8LM&index=10&t=0s
https://www.youtube.com/watch?v=uU9HO3SWbOs&list=PL-7XqvSmQqfTEtNbnITDrgycfn3uny8LM&index=11&t=158s
https://www.youtube.com/watch?v=N0NLJo5y7RQ&list=PL-7XqvSmQqfTEtNbnITDrgycfn3uny8LM&index=13&t=1478s
https://www.youtube.com/watch?v=Q4dSGPvqXSA&list=PL-7XqvSmQqfTEtNbnITDrgycfn3uny8LM&index=16&t=0s
The Technology of 2018 That I Most Used

Apache NiFi
Apache Kafka
NiFi Registry
MiniFi
Apache Spark
Hortonworks SMM
Hortonworks Data Plane Services
Apache MXNet
TensorFlow
Hadoop 3.1
HDP 3.1 and HDF 3.3
Apache Hive
Apache Ranger
Apache HBase
Apache Phoenix
Apache Druid
Spring Boot
Python 3.7
YOLO v3
MXNet GluonCV
Kubernetes
Hybrid Cloud
Blockchain and Cryptocurrencies

For a quick spin of the best of 2018, see here: https://community.hortonworks.com/articles/151939/hdp-securitygovernance-demo-kit.html

I'll see you in 2019 as part of Cloudera, with lots of new IoT, devices, AI, Deep Learning, NiFi, Kafka, Streaming and more. !!! Happy Holidays !!!

References

2017: https://community.hortonworks.com/content/kbentry/155338/my-year-in-review-2017.html
https://dzone.com/articles/dataworks-summit-2018-berlin-apache-nifi-wrapup
https://dzone.com/articles/2018-the-year-in-big-data
12-14-2018 06:45 PM - 2 Kudos
IoT Series: Sensors: Utilizing Breakout Garden Hat: Part 1 - Introduction

An easy option for adding, removing and prototyping sensor reads from a standard Raspberry Pi with no special wiring.

Hardware Component List:
Raspberry Pi
USB Power Cable
Pimoroni Breakout Garden Hat
1.12" Mono OLED Breakout 128x128 White/Black Screen
BME680 Air Quality, Temperature, Pressure, Humidity Sensor
LSM303D 6DoF Motion Sensor (X, Y, Z Axes)
BH1745 Luminance and Color Sensor
LTR-559 Light and Proximity Sensor, 0.01 lux to 64,000 lux
VL53L1X Time of Flight (ToF) Sensor - Pew Pew Lasers!

Software Component List:
Raspbian
Python 2.7
JDK 8 Java
Apache NiFi
MiniFi

Source Code: https://github.com/tspannhw/minifi-breakoutgarden
Shell Script: https://github.com/tspannhw/minifi-breakoutgarden/blob/master/runbrk.sh
Python: https://github.com/tspannhw/minifi-breakoutgarden/blob/master/brk.py

Summary

Our Raspberry Pi has a Breakout Garden Hat with 5 sensors and one small display. The display shows the last reading and is constantly updating; for debugging purposes, it also shows the IP address so I can connect as needed. We currently run via nohup, but when we go into constant use I will switch to a Linux service that runs on startup.

The Python script initializes the connections to all of the sensors and then goes into an infinite loop of reading those values and building a JSON packet that we send via TCP/IP over port 5005 to a listener (a sketch of this sender loop follows the example record below). A MiniFi 0.5.0 Java agent is using ListenTCP on that port to capture these messages and filter them based on alarm values. If a message is outside of the checked parameters, we send it via S2S/HTTP(S) to an Apache NiFi server. We also have a USB webcam (Sony PlayStation 3 Eye) that is capturing images; we read those with MiniFi and send them to NiFi as well.

The first thing we need to do is pretty easy: plug in our Pimoroni Breakout Garden Hat and our 6 plugs. You have to do the standard installation of Python, Java 8 and MiniFi, and I recommend OpenCV. Make sure you have everything plugged in securely and in the correct direction before you power on the Raspberry Pi.

Download MiniFi Java here: https://nifi.apache.org/minifi/download.html

Install Python pip:

curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py

Install the Breakout Garden library:

wget https://github.com/pimoroni/breakout-garden/archive/master.zip
unzip master.zip
cd breakout-garden-master
sudo ./install.sh

Running In NiFi

First we build our MiniFi flow. We have two objectives: listen for TCP/IP JSON messages from our running Python sensor collector, and gather images captured by the PS3 Eye USB webcam. We then add content type and schema information to the attributes, and we extract a few values from the JSON stream to use for alerting:

$.cputemp, $.VL53L1X_distance_in_mm, $.bme680_humidity, $.bme680_tempf

These attributes are now attached to our flowfile, which is otherwise unchanged, so we can route on them. We route on a few alarm conditions:

${cputemp:gt(100)}
${humidity:gt(60)}
${tempf:gt(80)}
${distance:gt(0)}

We can easily add more conditions or different set values, and we could also populate these set values from an HTTP or file lookup. If these conditions are met, we send the message to our local Apache NiFi router. That router can then do further analysis with the fuller NiFi processor set, including TensorFlow, MXNet, record processing and lookups.

Local NiFi Routing

For now we are just splitting up the images and JSON and sending them to two different remote ports on our cloud NiFi cluster, where they arrive for further processing. You can see a list of the flow files waiting to be processed (I haven't written that part yet). We are getting a few records a second; we could get 100,000 a second if we needed to. Just add nodes: instant scaling. Cloudbreak can do that for you.

In part 2, we will start processing these data streams and images. We will also add Apache MXNet and TensorFlow at various points on the edge, router and cloud using Python and the built-in deep learning NiFi processors I have authored. We will also break apart these records and send each sensor's data to its own Kafka topic to be processed with Kafka Streams, Druid, Hive and HBase.

As part of our loop, we also write the current values to our little screen.

Example Record

{
"systemtime" : "12/19/2018 22:15:56",
"BH1745_green" : "4.0",
"ltr559_prox" : "0000",
"end" : "1545275756.7",
"uuid" : "20181220031556_e54721d6-6110-40a6-aa5c-72dbd8a8dcb2",
"lsm303d_accelerometer" : "+00.06g : -01.01g : +00.04g",
"imgnamep" : "images/bog_image_p_20181220031556_e54721d6-6110-40a6-aa5c-72dbd8a8dcb2.jpg",
"cputemp" : 51.0,
"BH1745_blue" : "9.0",
"te" : "47.3427119255",
"bme680_tempc" : "28.19",
"imgname" : "images/bog_image_20181220031556_e54721d6-6110-40a6-aa5c-72dbd8a8dcb2.jpg",
"bme680_tempf" : "82.74",
"ltr559_lux" : "006.87",
"memory" : 34.9,
"VL53L1X_distance_in_mm" : 134,
"bme680_humidity" : "23.938",
"host" : "vid5",
"diskusage" : "8732.7",
"ipaddress" : "192.168.1.167",
"bme680_pressure" : "1017.31",
"BH1745_clear" : "10.0",
"BH1745_red" : "0.0",
"lsm303d_magnetometer" : "+00.04 : +00.34 : -00.10",
"starttime" : "12/19/2018 22:15:09"
}
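The collector's send step reduces to a few lines of Python. This is an illustrative sketch, not the actual brk.py: the sensor values are faked, and the listener host and port come from the description above (MiniFi's ListenTCP splits incoming data on newlines):

import json
import socket
import time
import uuid
from datetime import datetime

LISTENER = ('127.0.0.1', 5005)  # MiniFi ListenTCP host/port from above

def build_packet():
    """Build one JSON reading; real code would query the sensors here."""
    now = datetime.now()
    return {
        'systemtime': now.strftime('%m/%d/%Y %H:%M:%S'),
        'uuid': now.strftime('%Y%m%d%H%M%S_') + str(uuid.uuid4()),
        'cputemp': 51.0,                    # faked sensor values
        'bme680_tempf': '82.74',
        'bme680_humidity': '23.938',
        'VL53L1X_distance_in_mm': 134,
    }

while True:
    packet = json.dumps(build_packet()) + '\n'  # newline-delimited
    with socket.create_connection(LISTENER, timeout=5) as sock:
        sock.sendall(packet.encode('utf-8'))
    time.sleep(1)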
NiFi Templates
nifi-garden-router.xml
minifi-garden.xml
garden-server.xml

Let's Build Those Topics Now

/usr/hdp/current/kafka-broker/bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic bme680
/usr/hdp/current/kafka-broker/bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic bh17455
/usr/hdp/current/kafka-broker/bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic lsm303d
/usr/hdp/current/kafka-broker/bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic vl53l1x
/usr/hdp/current/kafka-broker/bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic ltr559
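Once the topics exist, splitting a combined record into per-sensor topics could look like the kafka-python sketch below. This is my own illustration, not part of the article's code; the broker address, topic naming and record layout are assumptions based on the examples above:

import json
from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers='localhost:6667',  # assumed broker address
    value_serializer=lambda v: json.dumps(v).encode('utf-8'),
)

# One topic per sensor, matching the topics created above.
SENSOR_PREFIXES = ['bme680', 'bh1745', 'lsm303d', 'vl53l1x', 'ltr559']

def split_and_send(record):
    """Send each sensor's fields from a combined record to its own topic."""
    for prefix in SENSOR_PREFIXES:
        fields = {k: v for k, v in record.items()
                  if k.lower().startswith(prefix)}
        if fields:
            producer.send(prefix, fields)
    producer.flush()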
Hopefully in your environment you will be able to have a replication factor of 3, 5 or 7 and many partitions. I have one Kafka broker, so this is what we are starting with.

Reference

https://shop.pimoroni.com/collections/breakout-garden
https://github.com/pimoroni/breakout-garden/tree/master/examples
https://shop.pimoroni.com/products/breakout-garden-hat
https://github.com/pimoroni/bme680-python
https://learn.pimoroni.com/bme680
https://shop.pimoroni.com/products/bh1745-luminance-and-colour-sensor-breakout
https://github.com/pimoroni/bh1745-python
https://shop.pimoroni.com/products/vl53l1x-breakout
https://github.com/pimoroni/vl53l1x-python/tree/master/examples
https://shop.pimoroni.com/products/ltr-559-light-proximity-sensor-breakout
https://github.com/pimoroni/ltr559-python
https://shop.pimoroni.com/products/bme680-breakout
https://github.com/pimoroni/bme680
https://shop.pimoroni.com/products/lsm303d-6dof-motion-sensor-breakout
https://github.com/pimoroni/lsm303d-python
https://shop.pimoroni.com/products/1-12-oled-breakout
https://github.com/rm-hull/luma.oled
http://www.diegoacuna.me/how-to-run-a-script-as-a-service-in-raspberry-pi-raspbian-jessie/
12-12-2018 02:31 AM
My Example Mesh Network

Argon: Wi-Fi + Mesh Gateway/Repeater

This node connects to the outside network via WiFi; we connect to the Particle Cloud.

Espressif ESP32-D0WD 2.4G Wi-Fi
Nordic Semiconductor nRF52840 SoC for Bluetooth and NFC-A tag
ARM TrustZone CryptoCell-310 cryptographic and security module
Two antennas (one for Thread/BLE, another for Wi-Fi)

Xenon: Mesh + BLE

Nordic Semiconductor nRF52840 SoC for Bluetooth and NFC-A tag
ARM TrustZone CryptoCell-310 cryptographic and security module

References
https://github.com/Seeed-Studio/Grove_Starter_Kit_for_Photon_Demos?files=1
https://docs.particle.io/datasheets/accessories/mesh-accessories/
https://community.particle.io/c/mesh
https://www.particle.io/mesh
https://docs.particle.io/datasheets/mesh/xenon-datasheet/
https://docs.particle.io/datasheets/wi-fi/argon-datasheet/
https://blog.particle.io/2018/04/28/how-to-build-a-wireless-mesh-network/
WiFi Mesh: https://www.threadgroup.org/
https://github.com/Seeed-Studio/Grove_Starter_Kit_for_Photon_Demos/blob/master/Example%20-%2005%20Measuring%20Temperature/Example05.ino
http://wiki.seeedstudio.com/Grove-Ultrasonic_Ranger/
https://www.particle.io/mesh/buy/xenon
12-10-2018 07:41 PM - 3 Kudos
Deep Speech with Apache NiFi 1.8
Tools: Python 3.6, PyAudio, TensorFlow, Deep Speech, Shell, Apache NiFi
Why: Speech-to-Text
Use Case: Voice control and recognition.
Series: Holiday Use Case: Turn on Holiday Lights and Music on command.
Cool Factor: Ever want to run a query on live ingested voice commands?
Other Options: https://community.hortonworks.com/articles/155519/voice-controlled-data-flows-with-google-aiy-voice.html
We are using Python 3.6 to write some code around PyAudio, TensorFlow and Deep Speech to capture audio, store it in a wave file, and then process it with Deep Speech to extract some text. This example is running on OSX without a GPU, on TensorFlow v1.11.
The Mozilla Github repo for their Deep Speech implementation has nice getting started information that I used to integrate our flow with Apache NiFi.
Installation as per https://github.com/mozilla/DeepSpeech
pip3 install deepspeech
wget -O - https://github.com/mozilla/DeepSpeech/releases/download/v0.3.0/deepspeech-0.3.0-models.tar.gz | tar xvfz -
This pre-trained model is available for English; for other languages, you will need to build your own. You can use a beefy HDP 3.1 cluster to train it. Note: THIS IS A 1.8 GIG DOWNLOAD. That may be an issue for laptops, devices or small-data people.

Apache NiFi Flow

The flow is simple: we call our shell script, which runs Python that records audio and sends it to Deep Speech for processing. We get back a voice_string in JSON that we turn into a record for querying and filtering in Apache NiFi. I am handling a few voice commands for "Save", "Load" and "Move"; as you can imagine, you can handle pretty much anything you want. It's a simple way to use voice to control streaming data flows, or just to ingest large streams of text. Even using advanced deep learning, text recognition is still not the strongest. If you are going to load balance connections between nodes, you have options on compression and load balancing strategies; this can come in handy if you have a lot of servers.

Shell Script
python3.6 /Volumes/TSPANN/projects/DeepSpeech/processnifi.py /Volumes/TSPANN/projects/DeepSpeech/models/output_graph.pbmm /Volumes/TSPANN/projects/DeepSpeech/models/alphabet.txt
Schema
{
"type" : "record",
"name" : "voice",
"fields" : [ {
"name" : "systemtime",
"type" : "string",
"doc" : "Type inferred from '\"12/10/2018 14:53:47\"'"
}, {
"name" : "voice_string",
"type" : "string",
"doc" : "Type inferred from '\"\"'"
} ]
}
We can add more fields as needed.
Example Run
HW13125:DeepSpeech tspann$ ./runnifi.sh
TensorFlow: v1.11.0-9-g97d851f04e
DeepSpeech: unknown
2018-12-10 14:36:43.714433: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
{"systemtime": "12/10/2018 14:36:43", "voice_string": "one two three or five six seven eight nine"}
We can run this on top of YARN 3.1 as dockerized or non-dockerized workloads. Setting up nodes to run HDF 3.3 (Apache NiFi and friends) is easy in the cloud or on-premises in OpenStack with solid devops tools. When running Apache NiFi, it is easy to monitor in Ambari.
References:
https://github.com/mozilla/DeepSpeech
https://community.hortonworks.com/articles/224268/running-tensorflow-on-yarn-31-with-or-without-gpu.html
https://arxiv.org/abs/1412.5567
https://github.com/tspannhw/nifi-deepspeech
12-06-2018 09:53 PM - 5 Kudos
Apache NiFi Processor for Apache MXNet SSD: Single Shot MultiBox Object Detector (Deep Learning)

The news is out: Apache MXNet has added a Java API. So as soon as I could, I got my hands on the Maven repo and an example program and got to work writing a new Apache NiFi processor for it. I have run this on standalone Apache NiFi 1.8.0 and on HDF 3.3 (Apache NiFi 1.8.0), and both work. So anyone who wants to be an alpha tester, please download it and give it a try.

Apache MXNet SSD is a good example of a pretrained deep learning model that works pretty well for general images, especially in use cases around people and cars. You can fine-tune it with some more images and training runs: https://mxnet.incubator.apache.org/faq/finetune.html

The nice thing is that we can now start including Apache MXNet as part of Java applications such as Kafka Streams, Apache Storm, Apache Spark, Spring Boot and other Java use cases. I could potentially inject this into a Hive UDF (https://community.hortonworks.com/articles/39980/creating-a-hive-udf-in-java.html#comment-40026) or a Pig UDF; the performance may be fast enough. We now have four Java options for deep learning: DL4J, H2O, TensorFlow and Apache MXNet. Unfortunately, both the TensorFlow and MXNet Java APIs are not quite production ready. I may do some further research on running MXNet as a Hive UDF; it would be cool to have in a query.

For those who don't want to set up a development environment with JDK 8+, Maven 3.3+ and git, you can download a pre-built NAR file here: https://github.com/tspannhw/nifi-mxnetinference-processor/releases/tag/v1.0. As part of the recent release of HDF 3.3, I have upgraded my OpenStack CentOS 7 cluster.

Important Caveats

Notice: the Java API is in preview, and so is this processor. Do not use this in production! This is in development and I am the only one working on it. The Java API from Apache MXNet is in flux and will be changing. See the POM, as it is tied to the OSX/Mac version of the library; you will need to change that. You will need to download the pre-built MXNet model and place it in a directory accessible to the Apache NiFi server/cluster. I am still cleaning up the rectangle code for identifying objects in the pictures; as you will notice, my rectangle drawing is a bit off, and I need to work on that (a small drawing sketch appears after the POM below).

Once you drop your built NAR file and models into the nifi/lib directory and restart Apache NiFi, you can add the processor to your canvas. We need to feed it some images: you can use my webcam processor, an image URL feed or local files. To grab images from an HTTPS site, you need an SSL Context Service like the StandardSSLContextService below; you will need to point it to the cacerts used by the JRE/JDK running your Apache NiFi node. The default password in Java is changeme; hopefully you have changed it.

To configure my new processor, just put in the full path to the model directory and then "/resnet50_ssd_model", as that is the prefix for the model.

Our example flow has the new processor being fed by traffic cameras, webcams, local files and a local webcam. Some output of our flow, with our top 5 probabilities and labels:

Example Data:

{
"ymin_1" : "456.01",
"ymin_5" : "159.29",
"ymin_4" : "235.83",
"ymin_3" : "206.64",
"ymin_2" : "383.84",
"label_5" : "person",
"xmax_5" : "121.14",
"label_4" : "bicycle",
"xmax_4" : "137.89",
"label_3" : "dog",
"xmax_3" : "179.14",
"ymax_1" : "150.66",
"ymax_2" : "418.95",
"ymax_3" : "476.79",
"label_2" : "bicycle",
"label_1" : "car",
"probability_4" : "0.22",
"probability_5" : "0.13",
"probability_2" : "0.90",
"xmin_5" : "88.93",
"probability_3" : "0.82",
"ymax_4" : "413.43",
"probability_1" : "1.00",
"ymax_5" : "190.04",
"xmax_2" : "149.96",
"xmax_1" : "72.03",
"xmin_3" : "83.82",
"xmin_4" : "93.05",
"xmin_1" : "312.21",
"xmin_2" : "155.96"
}

Resources:
https://medium.com/apache-mxnet/introducing-java-apis-for-deep-learning-inference-with-apache-mxnet-8406a698fa5a
https://github.com/apache/incubator-mxnet/tree/java-api/scala-package/examples/src/main/java/org/apache/mxnetexamples/javaapi
https://mxnet.incubator.apache.org/install/java_setup.html

Source: https://github.com/tspannhw/nifi-mxnetinference-processor

Video walk-through: https://www.youtube.com/watch?v=Q4dSGPvqXSA&t=196s&list=PL-7XqvSmQqfTSihuoIP_ZAnN7mFIHkZ_e&index=17

mxnet-processor.xml

Download the artifacts listed: https://github.com/apache/incubator-mxnet/tree/java-api/scala-package/examples/src/main/java/org/apache/mxnetexamples/javaapi/infer/objectdetector#step-1

Maven POM (I used Java 8 and Maven 3.3.9):

<?xml version="1.0" encoding="UTF-8"?>
<!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>com.dataflowdeveloper.mxnet</groupId>
<artifactId>inference</artifactId>
<version>1.0</version>
</parent>
<artifactId>nifi-mxnetinference-processors</artifactId>
<packaging>jar</packaging>
<dependencies>
<dependency>
<groupId>org.apache.nifi</groupId>
<artifactId>nifi-api</artifactId>
</dependency>
<dependency>
<groupId>org.apache.nifi</groupId>
<artifactId>nifi-utils</artifactId>
<version>1.8.0</version>
</dependency>
<dependency>
<groupId>org.apache.nifi</groupId>
<artifactId>nifi-mock</artifactId>
<version>1.8.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.mxnet</groupId>
<artifactId>mxnet-full_2.11-osx-x86_64-cpu</artifactId>
<version>1.3.1-SNAPSHOT</version>
</dependency>
</dependencies>
</project>
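Since the processor emits the bounding boxes as plain attributes (the example data above), you can also redraw them outside the processor while I fix the rectangle code. A small Pillow sketch of my own, using the field names from the example output; the image path and the corner-sorting are assumptions:

from PIL import Image, ImageDraw  # pip install Pillow

# Bounding-box fields as emitted in the example data above.
detections = {
    'label_1': 'car', 'probability_1': '1.00',
    'xmin_1': '312.21', 'ymin_1': '456.01',
    'xmax_1': '72.03', 'ymax_1': '150.66',
}

image = Image.open('input.jpg')  # hypothetical input image
draw = ImageDraw.Draw(image)

for i in range(1, 6):  # top-5 detections
    label = detections.get('label_%d' % i)
    if not label:
        continue
    # Sort the corner pairs so the rectangle is always well-formed.
    xs = sorted(float(detections['%s_%d' % (k, i)]) for k in ('xmin', 'xmax'))
    ys = sorted(float(detections['%s_%d' % (k, i)]) for k in ('ymin', 'ymax'))
    draw.rectangle([xs[0], ys[0], xs[1], ys[1]], outline='red')
    draw.text((xs[0], ys[0]), '%s %s' % (label, detections['probability_%d' % i]))

image.save('annotated.jpg')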
I have moved from Eclipse to IntelliJ for my builds. I am looking at Apache NetBeans as well.
11-27-2018 03:50 AM - 5 Kudos
MiniFi Java Agent 0.5

Copy over the necessary NARs from the Apache NiFi 1.7 lib:
nifi-ssl-context-service-nar-1.7.0.nar
nifi-standard-services-api-nar-1.7.0.nar
nifi-kafka-1-0-nar-1.7.0.nar

This will support PublishKafka_1_0 and ConsumeKafka_1_0. Then create a consume and/or publish flow; you can combine the two based on your needs. In my simple example I consume the Kafka messages in MiniFi and write them to a file. I also write the metadata to a JSON file.

Consume Kafka
Publish Electric Monitoring Data To Kafka

Let's monitor the messages going through our topic, smartPlug.

Publish Messages to Kafka
Consume Any Messages From the smartPlug topic

Logs

Provenance Event file containing 377 records. In the past 5 minutes, 1512 events have been written to the Provenance Repository, totaling 839.32 KB
2018-11-26 19:42:32,473 INFO [main] o.a.n.c.s.StandardProcessScheduler Starting PutFile[id=25a86505-031a-37d9-0000-000000000000]
2018-11-26 19:42:32,474 INFO [main] o.a.n.c.s.StandardProcessScheduler Starting UpdateAttribute[id=9220d40d-ee1d-3f61-0000-000000000000]
2018-11-26 19:42:32,474 INFO [main] o.apache.nifi.controller.FlowController Started 0 Remote Group Ports transmitting
2018-11-26 19:42:32,478 INFO [main] org.apache.nifi.minifi.MiNiFiServer Flow loaded successfully.
2018-11-26 19:42:32,479 INFO [Monitor Processor Lifecycle Thread-2] o.a.n.c.s.TimerDrivenSchedulingAgent Scheduled ConsumeKafka_1_0[id=8556f1ce-a915-3fda-0000-000000000000] to run with 1 threads
2018-11-26 19:42:32,479 INFO [main] org.apache.nifi.BootstrapListener Successfully initiated communication with Bootstrap
2018-11-26 19:42:32,479 INFO [Monitor Processor Lifecycle Thread-1] o.a.n.c.s.TimerDrivenSchedulingAgent Scheduled AttributesToJSON[id=0628b4e5-10d0-3b09-0000-000000000000] to run with 1 threads
2018-11-26 19:42:32,479 INFO [main] org.apache.nifi.minifi.MiNiFi Controller initialization took 2787584123 nanoseconds.
2018-11-26 19:42:32,480 INFO [Monitor Processor Lifecycle Thread-1] o.a.n.c.s.TimerDrivenSchedulingAgent Scheduled PutFile[id=25a86505-031a-37d9-0000-000000000000] to run with 1 threads
2018-11-26 19:42:32,481 INFO [Monitor Processor Lifecycle Thread-2] o.a.n.c.s.TimerDrivenSchedulingAgent Scheduled UpdateAttribute[id=9220d40d-ee1d-3f61-0000-000000000000] to run with 1 threads
2018-11-26 19:42:32,585 INFO [Timer-Driven Process Thread-2] o.a.k.clients.consumer.ConsumerConfig ConsumerConfig values:
auto.commit.interval.ms = 5000
auto.offset.reset = latest
bootstrap.servers = [princeton1.field.hortonworks.com:6667]
check.crcs = true
client.id =
connections.max.idle.ms = 540000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = minificonsumer1
heartbeat.interval.ms = 3000
interceptor.classes = null
internal.leave.group.on.close = true
isolation.level = read_uncommitted
key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 10000
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 305000
retry.backoff.ms = 100
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
send.buffer.bytes = 131072
session.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
2018-11-26 19:42:32,727 INFO [Timer-Driven Process Thread-2] o.a.kafka.common.utils.AppInfoParser Kafka version : 1.0.0
2018-11-26 19:42:32,727 INFO [Timer-Driven Process Thread-2] o.a.kafka.common.utils.AppInfoParser Kafka commitId : aaa7af6d4a11b29d
2018-11-26 19:42:33,088 INFO [Timer-Driven Process Thread-2] o.a.k.c.c.internals.AbstractCoordinator [Consumer clientId=consumer-1, groupId=minificonsumer1] Discovered coordinator princeton1.field.hortonworks.com:6667 (id: 2147482646 rack: null)
2018-11-26 19:42:33,090 INFO [Timer-Driven Process Thread-2] o.a.k.c.c.internals.ConsumerCoordinator [Consumer clientId=consumer-1, groupId=minificonsumer1] Revoking previously assigned partitions []
2018-11-26 19:42:33,091 INFO [Timer-Driven Process Thread-2] o.a.k.c.c.internals.AbstractCoordinator [Consumer clientId=consumer-1, groupId=minificonsumer1] (Re-)joining group
2018-11-26 19:42:36,391 INFO [Timer-Driven Process Thread-2] o.a.k.c.c.internals.AbstractCoordinator [Consumer clientId=consumer-1, groupId=minificonsumer1] Successfully joined group with generation 3
2018-11-26 19:42:36,394 INFO [Timer-Driven Process Thread-2] o.a.k.c.c.internals.ConsumerCoordinator [Consumer clientId=consumer-1, groupId=minificonsumer1] Setting newly assigned partitions [smartPlug-0]
2018-11-26 19:44:32,325 INFO [pool-34-thread-1] o.a.n.c.r.WriteAheadFlowFileRepository Successfully checkpointed FlowFile Repository with 0 records in 0 milliseconds
2018-11-26 19:44:40,700 INFO [Provenance Maintenance Thread-1] o.a.n.p.PersistentProvenanceRepository Created new Provenance Event Writers for events starting with ID 1437
2018-11-26 19:44:40,765 INFO [Provenance Repository Rollover Thread-1] o.a.n.p.lucene.SimpleIndexManager Index Writer for provenance_repository/index-1543271506000 has been returned to Index Manager and is no longer in use. Closing Index Writer
2018-11-26 19:44:40,767 INFO [Provenance Repository Rollover Thread-1] o.a.n.p.PersistentProvenanceRepository Successfully merged 16 journal files (28 records) into single Provenance Log File provenance_repository/1409.prov in 62 milliseconds
2018-11-26 19:44:40,768 INFO [Provenance Repository Rollover Thread-1] o.a.n.p.PersistentProvenanceRepository Successfully Rolled over Provenance Event file containing 151 records. In the past 5 minutes, 28 events have been written to the Provenance Repository, totaling 15.43 KB

JSON Kafka Message and JSON Kafka Metadata Stored As Files

monitor/1448678223641638.attr.json

{"path":"./","filename":"1448678223641638","kafka.partition":"0","kafka.offset":"5543","kafka.topic":"smartPlug","kafka.key":"cb90ad21-b311-494c-96cc-06dd2e8747df","uuid":"041459fc-c63e-4056-ab50-1c375cd7d49f"}

monitor/1448678223641638

{"day30": 0.431, "day31": 1.15, "sw_ver": "1.2.5 Build 171206 Rel.085954", "hw_ver": "1.0", "mac": "50:C7:BF:B1:95:D5", "type": "IOT.SMARTPLUGSWITCH", "hwId": "60FF6B258734EA6880E186F8C96DDC61", "fwId": "00000000000000000000000000000000", "oemId": "FFF22CFF774A0B89F7624BFC6F50D5DE", "dev_name": "Wi-Fi Smart Plug With Energy Monitoring", "model": "HS110(US)", "deviceId": "8006ECB1D454C4428953CB2B34D9292D18A6DB0E", "alias": "Tim", "icon_hash": "", "relay_state": 1, "on_time": 886569, "active_mode": "schedule", "feature": "TIM:ENE", "updating": 0, "rssi": -75, "led_off": 0, "latitude": 40.268216, "longitude": -74.529088, "index": 18, "zone_str": "(UTC-05:00) Eastern Daylight Time (US & Canada)", "tz_str": "EST5EDT,M3.2.0,M11.1.0", "dst_offset": 60, "month10": 1.581, "month11": 30.888, "current": 0.067041, "voltage": 122.151701, "power": 1.277361, "total": 24.289, "time": "11/26/2018 21:54:22", "ledon": true, "systemtime": "11/26/2018 21:54:22"}
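For a quick check outside MiniFi that messages are flowing, the smartPlug topic can be tailed with kafka-python. This is a sketch of my own, using the broker address visible in the consumer logs above; the group id is an arbitrary choice:

import json
from kafka import KafkaConsumer  # pip install kafka-python

# Broker address as seen in the MiniFi consumer logs above.
consumer = KafkaConsumer(
    'smartPlug',
    bootstrap_servers='princeton1.field.hortonworks.com:6667',
    auto_offset_reset='earliest',
    group_id='debugconsumer1',
    value_deserializer=lambda b: json.loads(b.decode('utf-8')),
)

for message in consumer:
    record = message.value
    # Print a few fields we care about from the smart plug payload.
    print(message.partition, message.offset,
          record.get('power'), record.get('voltage'), record.get('systemtime'))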
Resources:

https://blog.ona.io/general/2017/08/30/streaming-ona-data-with-nifi-kafka-druid-and-superset.html
https://community.hortonworks.com/articles/193945/social-media-monitoring-with-nifi-hivedruid-integr.html
https://community.hortonworks.com/articles/177561/streaming-tweets-with-nifi-kafka-tranquility-druid.html
https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.3.0/kafka-using-kafka-streams/content/kafka-using-kafka-streams.html
https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.3.0/minifi-quick-start/content/overview.html
https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.3.0/minifi-quick-start/content/before_you_begin.html
https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.3.0/minifi-quick-start/content/installing_minifi_on_linux.html
https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.3.0/minifi-quick-start/content/using_processors_not_packaged_with_minifi.html?es_p=8055369
https://community.hortonworks.com/articles/227560/real-time-stock-processing-with-apache-nifi-and-ap.html

Files: consumekafka2.xml pushkafka1.xml configyml consume.txt configymlsend.txt