1973 Posts · 1225 Kudos Received · 124 Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 2484 | 04-03-2024 06:39 AM |
|  | 3829 | 01-12-2024 08:19 AM |
|  | 2075 | 12-07-2023 01:49 PM |
|  | 3061 | 08-02-2023 07:30 AM |
|  | 4195 | 03-29-2023 01:22 PM |
03-06-2018
05:02 PM
Check HCC for articles on connecting NiFi to secure Phoenix. You must make sure that NiFi has permissions to the keytabs.
03-05-2018
07:38 PM
3 Kudos
This is for people preparing to attend my talk on Deep Learning at DataWorks Summit Berlin 2018 (https://dataworkssummit.com/berlin-2018/#agenda) on Thursday, April 19, 2018 at 11:50 AM Berlin time. This article covers running Apache MXNet on a Raspberry Pi.

The installation instructions at Apache MXNet's website (http://mxnet.incubator.apache.org/install/index.html) are amazing. Pick your platform and your style; I am doing this the simplest way, via the Linux path.

Installation: This builds on previous builds, so see those articles. We installed the drivers for the Sense HAT, Intel Movidius and the USB web cam previously. Please note that versions of Raspberry Pi, Apache MXNet, Python and the other drivers are updated every few months, so if you are reading this after DWS 2018 you should check the relevant libraries and update to the latest versions. You need Python, the Python development headers and pip installed, and you may need to run as root. You will also need OpenCV installed, as mentioned in the previous article.

In this combined Python script we grab Sense HAT sensor readings for temperature, humidity and more. We also run Movidius image analysis and Apache MXNet Inception on the image that we capture with our web cam. Apache MXNet is now at version 1.1, so you may want to upgrade. Let's get this installed!

sudo apt-get update -y
sudo apt-get install python-pip python-opencv python-scipy python-picamera -y
sudo apt-get -y install git cmake build-essential g++-4.8 c++-4.8 liblapack* libblas* libopencv*
pip install --upgrade pip
pip install scikit-image
git clone https://github.com/tspannhw/mxnet_rpi.git
git clone --recursive https://github.com/apache/incubator-mxnet.git mxnet --branch 1.0.0
cd mxnet
export USE_OPENCV=0
make
cd python
pip install -e .
pip install mxnet==1.0.0
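For reference, the top-5 scoring that the Inception script reports (fields like top1 and top1pct) boils down to sorting the model's class probabilities. A minimal sketch in plain Python follows; the labels and probabilities here are illustrative, and in the real script they come from synset.txt and the MXNet forward pass.

```python
# Sketch of the top-5 ranking step behind fields like top1/top1pct.
# Labels and probabilities are illustrative; the real values come from
# synset.txt and the MXNet Inception forward pass.

def top_k(probabilities, labels, k=5):
    """Return the k (probability, label) pairs with the highest probability."""
    ranked = sorted(zip(probabilities, labels), reverse=True)
    return ranked[:k]

probs = [0.02, 0.81, 0.05, 0.10, 0.02]
labels = ["sock", "tabby cat", "mouse", "tiger cat", "dog"]
for pct, label in top_k(probs, labels):
    print("%0.2f %s" % (pct, label))
```

The same pairs, formatted as percentages, are what end up in the JSON record that the flow ships onward.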
MiniFi Flow to Run Python Script and Send Over Images (Running on Raspberry Pi)

Routing on Server to Process Either an Image or a JSON

Our Apache NiFi Server Receiving Input from Raspberry Pi

Apache NiFi Server Processing The Input

We route to two different processing flows: one saves images, the other adds a schema and converts the JSON data into Apache Avro. The Avro content is merged, and we send it to a central HDF 3.1 cluster that can write to HDFS. We can either stream to an ACID Hive table, or convert Avro to Apache ORC, store it in HDFS and autogenerate an external Hive table on top of it. You can find many examples of both of these processes in my links below. We could also insert into Apache HBase or into an Apache Phoenix table. Or do all of those and send the results to Slack, email, an RDBMS like MySQL and anything else you can think of.

Generated Schema

Running: We are using the Apache MiniFi Java Agent 0.3.0. I will be adding a follow-up covering MiniFi 0.4.0 with the native C++ TensorFlow and the USB cam. See this awesome article for TensorFlow: https://community.hortonworks.com/articles/174520/minifi-c-iot-cat-sensor.html

Source Code: https://github.com/tspannhw/rpi-mxnet-movidius-minifi

This is too easy!
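The JSON side of that routing can be pictured as a single flat record combining Sense HAT readings with the Inception results. A hedged sketch follows; every field name and value here is illustrative, and the actual schema is the generated one mentioned above.

```python
import json

# Illustrative payload combining Sense HAT sensor readings with MXNet
# Inception results. Field names and values are examples only, not the
# real generated schema from the flow.
payload = {
    "uuid": "mxnet_uuid_img_20180305",
    "temperature": 32.1,
    "humidity": 41.5,
    "top1": "tabby cat",
    "top1pct": "0.82",
    "imagefilename": "/opt/demo/images/2018-03-05.jpg",
}

# MiniFi would ship this serialized string to NiFi, where the schema is
# applied and the record is converted to Avro.
message = json.dumps(payload, sort_keys=True)
print(message)
```

On the NiFi side, the image route and the JSON route are distinguished simply by content type, which is why a flat record like this is convenient.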
References:
https://github.com/tspannhw/ApacheBigData101/
https://community.hortonworks.com/articles/171960/using-apache-mxnet-on-an-apache-nifi-15-instance-w.html
https://community.hortonworks.com/articles/174227/apache-deep-learning-101-using-apache-mxnet-on-an.html
https://community.hortonworks.com/articles/174399/apache-deep-learning-101-using-apache-mxnet-on-apa.html
https://community.hortonworks.com/articles/176784/deep-learning-101-using-apache-mxnet-in-dsx-notebo.html
https://community.hortonworks.com/articles/176789/apache-deep-learning-101-using-apache-mxnet-in-apa.html
https://community.hortonworks.com/articles/174538/apache-deep-learning-101-using-apache-mxnet-with-h.html
https://community.hortonworks.com/articles/83100/deep-learning-iot-workflows-with-raspberry-pi-mqtt.html
https://community.hortonworks.com/articles/167193/building-and-running-minifi-cpp-in-orangepi-zero.html
https://community.hortonworks.com/articles/118132/minifi-capturing-converting-tensorflow-inception-t.html
https://community.hortonworks.com/articles/130814/sensors-and-image-capture-and-deep-learning-analys.html
https://community.hortonworks.com/articles/155475/powering-apache-minifi-flows-with-a-movidius-neura.html
http://mxnet.incubator.apache.org/install/index.html
https://mxnet.incubator.apache.org/tutorials/embedded/wine_detector.html
https://github.com/tspannhw/mxnet-in-notebooks
https://github.com/tspannhw/nifi-mxnet-yarn/
https://github.com/tspannhw/nvidiajetsontx1-mxnet
https://github.com/tspannhw/mxnet_rpi
https://github.com/tspannhw/rpi-sensehat-minifi-python/
https://github.com/tspannhw/rpi-minifi-movidius-sensehat
03-05-2018
03:51 PM
We are running Inception; see here: /tensorflow/models/tutorials/image/imagenet/classify_image.py
03-02-2018
11:42 PM
Which version of TensorFlow needs to be installed? Can you link the installation document?
03-02-2018
05:32 PM
3 Kudos
This is for people preparing to attend my talk on Deep Learning at DataWorks Summit Berlin 2018 (https://dataworkssummit.com/berlin-2018/#agenda) on Thursday, April 19, 2018 at 11:50 AM Berlin time. Another way to work with Apache MXNet is to use your Apache Zeppelin notebook to run your Python deep learning scripts.

Apache Zeppelin Notebook

As you can see, we can format the data as a table using Apache Zeppelin's display system. Use this print statement (note that the header fields must appear in the same order as the values):

print("%table top1pct\ttop1\ttop2pct\ttop2\ttop3pct\ttop3\ttop4pct\ttop4\ttop5pct\ttop5\timagefilename\truntime\tuuid\n" + top1pct + "\t" + top1 + "\t" + top2pct + "\t" + top2 + "\t" + top3pct + "\t" + top3 + "\t" + top4pct + "\t" + top4 + "\t" + top5pct + "\t" + top5 + "\t" + filename + "\t" + str(round(end - start)) + "\t" + uniqueid + "\n")

We use the pyspark interpreter to run this Python script, but there's no Spark in here yet. This data also gets loaded into Apache Hive via Apache NiFi, as shown here:

Deep Learning Models

You will need to download the pre-built Inception model files and reference them on your server:

synset.txt
Inception-BN-0000.params
Inception-BN-symbol.json

See: https://mxnet.incubator.apache.org/tutorials/embedded/wine_detector.html

curl --header 'Host: data.mxnet.io' --header 'User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.11; rv:45.0) Gecko/20100101 Firefox/45.0' --header 'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8' --header 'Accept-Language: en-US,en;q=0.5' --header 'Referer: http://data.mxnet.io/models/imagenet/' --header 'Connection: keep-alive' 'http://data.mxnet.io/models/imagenet/inception-bn.tar.gz' -o 'inception-bn.tar.gz' -L
curl http://data.mxnet.io/models/imagenet/synset.txt

More Models
http://data.mxnet.io/models/imagenet/

Source Code
https://github.com/tspannhw/mxnet-in-notebooks
https://github.com/tspannhw/ApacheBigData101

References
If you want to run in DSX or Jupyter: https://community.hortonworks.com/articles/176784/deep-learning-101-using-apache-mxnet-in-dsx-notebo.html
Setup: If you need to set up Apache MXNet on HDF: https://community.hortonworks.com/articles/174227/apache-deep-learning-101-using-apache-mxnet-on-an.html

Other Articles in The Series
https://community.hortonworks.com/articles/174538/apache-deep-learning-101-using-apache-mxnet-with-h.html
https://community.hortonworks.com/articles/174399/apache-deep-learning-101-using-apache-mxnet-on-apa.html
https://community.hortonworks.com/articles/155435/using-the-new-mxnet-model-server.html
https://community.hortonworks.com/articles/171960/using-apache-mxnet-on-an-apache-nifi-15-instance-w.html
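A hand-built %table string like the one above is easy to get out of order. As a sketch (the field names follow the table in the print statement; the helper itself is my own, not Zeppelin API), the line can be generated from an ordered list of (header, value) pairs so the header row and data row cannot drift apart:

```python
# Sketch: build a Zeppelin %table string from ordered (header, value)
# pairs, so the header row and the data row stay in sync.
# Field names mirror the article's print statement; the helper is
# hypothetical, not part of Zeppelin's API.

def zeppelin_table(pairs):
    headers = [name for name, _ in pairs]
    values = [str(value) for _, value in pairs]
    return "%table " + "\t".join(headers) + "\n" + "\t".join(values) + "\n"

row = [("top1pct", "0.82"), ("top1", "tabby cat"),
       ("imagefilename", "/opt/demo/images/tx1_image_1.jpg"),
       ("runtime", "4"), ("uuid", "mxnet_uuid_img_20180302")]
print(zeppelin_table(row))
```

Printing the returned string from a pyspark paragraph renders it as a sortable table in the notebook.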
03-02-2018
04:47 PM
4 Kudos
This is for people preparing to attend my talk on Deep Learning at DataWorks Summit Berlin 2018 (https://dataworkssummit.com/berlin-2018/#agenda) on Thursday, April 19, 2018 at 11:50 AM Berlin time. Many people are using IBM's excellent DSX platform, which uses Jupyter Notebooks and the ever popular Kubernetes, so I wanted to try out Apache MXNet in this environment. It's great. Create a new notebook or reuse an existing one. For Python, the default is Jupyter; Zeppelin is now also supported. I am using Python 2.7 with DSX Desktop on an OS X workstation, which supports Apache MXNet. My local Apache MXNet and MXNet Python installations worked well with DSX. I needed OpenCV for this example, and I was able to install it right inside IBM DSX via !pip install --user opencv-python. It is very easy to start a notebook and add your code, and you get nice syntax coloring. I uploaded the precompiled model. We can check our list of Python libraries with !pip list --isolated --format=columns. It is very easy to run your Apache MXNet code right in a notebook, and easy to share it with other data scientists and engineers in your group and beyond.

IBM DSX Assets

You will need to download the pre-built Inception model and add it to your assets:

synset.txt
Inception-BN-0000.params
Inception-BN-symbol.json

See: https://mxnet.incubator.apache.org/tutorials/embedded/wine_detector.html

curl --header 'Host: data.mxnet.io' --header 'User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.11; rv:45.0) Gecko/20100101 Firefox/45.0' --header 'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8' --header 'Accept-Language: en-US,en;q=0.5' --header 'Referer: http://data.mxnet.io/models/imagenet/' --header 'Connection: keep-alive' 'http://data.mxnet.io/models/imagenet/inception-bn.tar.gz' -o 'inception-bn.tar.gz' -L
curl http://data.mxnet.io/models/imagenet/synset.txt

More Models
http://data.mxnet.io/models/imagenet/

Source Code
https://github.com/tspannhw/mxnet-in-notebooks
https://github.com/tspannhw/ApacheBigData101
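synset.txt maps one ImageNet class per line: a WordNet ID followed by comma-separated class names. A small sketch of stripping the ID to get a display label (the sample lines mirror the file's format; the helper function is my own):

```python
# Sketch: turn synset.txt lines ("<wordnet-id> <names>") into display
# labels. Sample lines mirror the file's format; the helper itself is
# hypothetical, not part of MXNet.

def synset_labels(lines):
    labels = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        # Drop the leading WordNet ID, keep the human-readable names.
        _, _, names = line.partition(" ")
        labels.append(names)
    return labels

sample = ["n01440764 tench, Tinca tinca",
          "n01443537 goldfish, Carassius auratus"]
print(synset_labels(sample))
```

In the real script you would read the downloaded synset.txt and index the resulting list by the class IDs the model predicts.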
02-27-2018
09:34 PM
Yes, by default. You can change the ports in Ambari as well: https://nifi.apache.org/docs/nifi-docs/html/administration-guide.html https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.1.0/bk_installing-nifi/content/ch02s04.html Or it could be 8443.
02-27-2018
08:44 PM
1 Kudo
This is for people preparing to attend my talk on Deep Learning at DataWorks Summit Berlin 2018 (https://dataworkssummit.com/berlin-2018/#agenda) on Thursday, April 19, 2018 at 11:50 AM Berlin time.

See: https://community.hortonworks.com/content/kbentry/174399/apache-deep-learning-101-using-apache-mxnet-on-apa.html

To do proper analytics and provide fast SQL access to the Inception data generated by Apache MXNet from our images, we need to land it in Apache Hive transactional tables. We will use the Apache NiFi PutHiveStreaming processor to insert data into our ACID table at a rapid rate. This only works if you create a transactional table stored as Apache ORC; see the DDL below. You must also be running a recent version of HDP (2.6+) with ACID turned on.

Tip: In HDP 2.6.4, you will need to create and work with Apache Hive ACID tables through Hive itself. In Apache Zeppelin, do not use the %sql interpreter, since that is Apache Spark; %jdbc(hive) is Apache Hive. See the configuration below to get Hive CBO and Tez enabled as well.

Ambari View of Hive SQL DDL

%jdbc(hive)
CREATE TABLE `inception` (
  uuid STRING, top1pct STRING, top1 STRING, top2pct STRING, top2 STRING,
  top3pct STRING, top3 STRING, top4pct STRING, top4 STRING,
  top5pct STRING, top5 STRING, imagefilename STRING, runtime STRING)
CLUSTERED BY (top1)
INTO 3 BUCKETS
ROW FORMAT SERDE 'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
TBLPROPERTIES ('transactional'='true')

%jdbc(hive)
select * from inception

The PutHiveStreaming processor requires a table that is bucketed and stored as Apache ORC, and you must have permissions to write to it. See the DDL above for an example. You also need ACID and LLAP enabled on your Apache Hive cluster.

Details for PutHiveStreaming Processor

An Example Apache MXNet to Hive Streaming View

The Hive View 2.0 of the Data

Apache Zeppelin Table DDL and Query
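As a rough illustration of what the flow expects (this is not NiFi code; the helper and the sample values are made up), the JSON records feeding PutHiveStreaming must carry exactly the thirteen string columns of the inception table:

```python
import json

# The thirteen columns of the `inception` ACID table, in DDL order.
INCEPTION_COLUMNS = ["uuid", "top1pct", "top1", "top2pct", "top2",
                     "top3pct", "top3", "top4pct", "top4",
                     "top5pct", "top5", "imagefilename", "runtime"]

def validate_record(raw_json):
    """Check that a JSON record matches the table's columns exactly.

    Returns the parsed dict, or raises ValueError naming the mismatch.
    This mimics, very loosely, what a schema-validation step in the
    NiFi flow would reject before streaming to Hive.
    """
    record = json.loads(raw_json)
    missing = [c for c in INCEPTION_COLUMNS if c not in record]
    extra = [k for k in record if k not in INCEPTION_COLUMNS]
    if missing or extra:
        raise ValueError("missing=%s extra=%s" % (missing, extra))
    return record
```

A record with a misspelled or absent field would then fail fast at validation instead of producing a malformed Avro row downstream.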