
Ingesting Drone Data From DJII Ryze Tello Drones Part 1 - Setup and Practice


In Part 1, we will set up our drone and communication environment, capture the data, and do an initial analysis. We will eventually grab the live video stream for object detection, real-time flight control, and real-time ingest of photos, videos, and sensor readings. We will have Apache NiFi react to live situations facing the drone and issue flight commands via UDP.

In this initial section, we control the drone with Python, which can be triggered by NiFi. Apache NiFi ingests the log data, which is stored as CSV files on a NiFi node connected to the drone's WiFi. This will eventually move to a dedicated embedded device running MiNiFi.
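The Tello SDK documents a simple text-over-UDP protocol: you send plain-text commands such as "command" and "takeoff" to the drone at 192.168.10.1 on UDP port 8889, and it replies "ok" or "error". Here is a minimal sketch of that exchange; the fake_drone responder is a stand-in so the snippet runs on loopback without hardware:

```python
import socket
import threading

TELLO_ADDR = ("192.168.10.1", 8889)  # documented Tello SDK command address

def send_command(sock, addr, cmd, timeout=5.0):
    """Send one Tello SDK text command over UDP and return the reply."""
    sock.settimeout(timeout)
    sock.sendto(cmd.encode("ascii"), addr)
    data, _ = sock.recvfrom(1518)
    return data.decode("ascii").strip()

def fake_drone(server_sock):
    """Stand-in responder so the sketch runs without a real drone."""
    _, addr = server_sock.recvfrom(1518)
    server_sock.sendto(b"ok", addr)

if __name__ == "__main__":
    server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    server.bind(("127.0.0.1", 0))
    threading.Thread(target=fake_drone, args=(server,), daemon=True).start()

    client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # against a real drone, use TELLO_ADDR and send "command" first
    print(send_command(client, server.getsockname(), "takeoff"))  # ok
```

Against real hardware you would send "command" once to enter SDK mode before issuing any flight commands.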

This is a small personal drone with less than 13 minutes of flight time per battery. It is not a commercial drone, but it gives you an idea of what you can do with drones.

Drone Live Communications for Sensor Readings and Drone Control

You must connect to the drone's WiFi network, which will be named Tello(Something).

Send Commands / Receive Sensor Readings (the Tello SDK defaults)
Tello IP: 192.168.10.1
UDP PORT: 8889

Receive Tello Video Stream
Tello IP: 192.168.10.1
UDP Server: 0.0.0.0
UDP PORT: 11111
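Once you send "streamon", the drone pushes raw H.264 datagrams to UDP port 11111 on your machine. A minimal receive loop looks like this; the sketch binds an ephemeral loopback port and feeds itself one fake packet so it runs without the drone:

```python
import socket

VIDEO_PORT = 11111  # the Tello pushes raw H.264 here after "streamon"

def receive_video_packets(sock, n_packets):
    """Collect n UDP datagrams of H.264 payload from the drone."""
    chunks = []
    for _ in range(n_packets):
        data, _ = sock.recvfrom(2048)
        chunks.append(data)
    return b"".join(chunks)

if __name__ == "__main__":
    # bind a loopback port here so the sketch runs without a drone;
    # against real hardware, bind ("0.0.0.0", VIDEO_PORT) instead
    recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    recv.bind(("127.0.0.1", 0))
    sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sender.sendto(b"\x00\x00\x00\x01", recv.getsockname())  # fake NAL start code
    print(len(receive_video_packets(recv, 1)))  # 4
```

In practice the collected bytes are handed to a decoder (the av/OpenCV pair installed below) rather than concatenated in memory.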

Example Install:

pip3.6 install tellopy
git clone
pip3.6 install av
pip3.6 install opencv-python
pip3.6 install image
python3.6 -m tellopy.examples.video_effect
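The install above pulls in TelloPy; judging from its bundled examples, a tellopy.Tello() object exposes connect(), wait_for_connection(), takeoff(), land(), and quit(). A minimal flight sketch follows; the RecordingDrone class is a hypothetical stand-in for tellopy.Tello() (not part of the library) so the sketch runs without hardware:

```python
import time

def short_hop(drone, hover_seconds=2.0):
    """Connect, take off, hover briefly, land -- the smallest useful flight."""
    drone.connect()
    drone.wait_for_connection(60.0)  # TelloPy blocks until the drone answers
    drone.takeoff()
    time.sleep(hover_seconds)
    drone.land()
    drone.quit()

class RecordingDrone:
    """Stand-in for tellopy.Tello() that just records the calls made."""
    def __init__(self):
        self.calls = []
    def __getattr__(self, name):
        def method(*args):
            self.calls.append(name)
        return method

if __name__ == "__main__":
    d = RecordingDrone()
    short_hop(d, hover_seconds=0.0)
    print(d.calls)  # ['connect', 'wait_for_connection', 'takeoff', 'land', 'quit']
```

Swapping RecordingDrone() for tellopy.Tello() (with the drone's WiFi connected) would perform the real flight.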

Example Run Video:

Example Flight Log: Tello-flight log.pdf.

Let's build a quick ingest with Apache NiFi 1.8.

As a first step, we use a local Apache NiFi instance to read the CSV files produced by the drone run.


We read the CSVs from the Tello logging directory, add a schema definition and query it.


We configure a record reader controller service for CSV processing, using the schema posted below and the Jackson CSV parser. We ignore the header line because it contains invalid characters.

We use a QueryRecord processor to check whether the position in Z has changed:

SELECT * FROM FLOWFILE WHERE mvo_pos_z is not null AND CAST(mvo_pos_z as FLOAT) <> 0.0
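The same predicate can be sketched in plain Python; position_changed is a hypothetical helper that mirrors the QueryRecord SQL above, not part of the NiFi flow:

```python
def position_changed(records):
    """Keep rows whose mvo_pos_z is present and non-zero, mirroring:
    SELECT * FROM FLOWFILE WHERE mvo_pos_z IS NOT NULL
                             AND CAST(mvo_pos_z AS FLOAT) <> 0.0"""
    kept = []
    for rec in records:
        z = rec.get("mvo_pos_z")
        if z is not None and z != "" and float(z) != 0.0:
            kept.append(rec)
    return kept

rows = [
    {"mvo_pos_z": "0.0"},    # dropped: no movement on the Z axis
    {"mvo_pos_z": None},     # dropped: null reading
    {"mvo_pos_z": "-1.25"},  # kept: the drone moved
]
print(position_changed(rows))  # [{'mvo_pos_z': '-1.25'}]
```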


We also convert from CSV to Apache AVRO format for further processing.

Valid records are sent over HTTP(S) Site-to-Site to a cloud-hosted Apache NiFi cluster for further processing and saving to an HBase table.


As you can see, it is trivial to store these records in HBase.


For HBase, our data didn't have a record identifier, so I used the UpdateRecord processor to create one and add it to the data. I updated the schema to include this field (with a default, allowing nulls).
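In plain Python, that UpdateRecord step amounts to populating a key field when it is missing. In NiFi this would typically be done with expression language; the UUID here is an assumed key format, not necessarily what the flow uses:

```python
import uuid

def add_record_id(record):
    """Mimic the UpdateRecord step: populate drone_rec_id when missing,
    since the raw Tello log rows carry no natural key."""
    if not record.get("drone_rec_id"):
        record["drone_rec_id"] = str(uuid.uuid4())  # assumed key format
    return record

row = add_record_id({"mvo_pos_z": "-1.25"})
print("drone_rec_id" in row)  # True
```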




{ "type" : "record", "name" : "drone",
  "fields" : [
    { "name" : "drone_rec_id", "type" : [ "string", "null" ], "default" : "1000" },
    { "name" : "mvo_vel_x", "type" : [ "double", "null" ], "default" : 0.0 },
    { "name" : "mvo_vel_y", "type" : [ "string", "null" ], "default" : "0.00" },
    { "name" : "mvo_vel_z", "type" : [ "double", "null" ], "default" : 0.0 },
    { "name" : "mvo_pos_x", "type" : [ "string", "null" ], "default" : "0.00" },
    { "name" : "mvo_pos_y", "type" : [ "double", "null" ], "default" : 0.0 },
    { "name" : "mvo_pos_z", "type" : [ "string", "null" ], "default" : "0.00" },
    { "name" : "imu_acc_x", "type" : [ "double", "null" ], "default" : 0.0 },
    { "name" : "imu_acc_y", "type" : [ "double", "null" ], "default" : 0.0 },
    { "name" : "imu_acc_z", "type" : [ "double", "null" ], "default" : 0.0 },
    { "name" : "imu_gyro_x", "type" : [ "double", "null" ], "default" : 0.0 },
    { "name" : "imu_gyro_y", "type" : [ "double", "null" ], "default" : 0.0 },
    { "name" : "imu_gyro_z", "type" : [ "double", "null" ], "default" : 0.0 },
    { "name" : "imu_q0", "type" : [ "double", "null" ], "default" : 0.0 },
    { "name" : "imu_q1", "type" : [ "double", "null" ], "default" : 0.0 },
    { "name" : "imu_q2", "type" : [ "double", "null" ], "default" : 0.0 },
    { "name" : "self_q3", "type" : [ "double", "null" ], "default" : 0.0 },
    { "name" : "imu_vg_x", "type" : [ "double", "null" ], "default" : 0.0 },
    { "name" : "imu_vg_y", "type" : [ "double", "null" ], "default" : 0.0 },
    { "name" : "imu_vg_z", "type" : [ "double", "null" ], "default" : 0.0 } ] }

The updated schema now has a record id. The original schema derived from the raw data does not.


Store the Data in HBase Table

Soon we will be storing in Kudu, Impala, Hive, Druid and S3.

create 'drone', 'drone'
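With the 'drone' table and its single 'drone' column family created above, each Avro record maps to one HBase row keyed by drone_rec_id, with every other field becoming a cell in that family. The flow writes the rows itself; the sketch below is a pure-Python illustration of the row-key/cell layout (to_hbase_put is a hypothetical helper, not a NiFi component):

```python
def to_hbase_put(record, family="drone", rowkey_field="drone_rec_id"):
    """Shape one record as an HBase put: (row key, {b'family:qualifier': b'value'}),
    matching the 'drone' table's single 'drone' column family."""
    rowkey = str(record[rowkey_field]).encode("utf-8")
    cells = {
        f"{family}:{field}".encode("utf-8"): str(value).encode("utf-8")
        for field, value in record.items()
        if field != rowkey_field and value is not None
    }
    return rowkey, cells

rowkey, cells = to_hbase_put({"drone_rec_id": "1000", "mvo_pos_z": "-1.25"})
print(rowkey, cells)  # b'1000' {b'drone:mvo_pos_z': b'-1.25'}
```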



We are using the TelloPy interface. You need to clone the TelloPy GitHub repository and drop in the files from nifi-drone.

Apache NiFi Flows:



Last update: 08-17-2019 04:57 AM