
This week we have a guest blog post from Javier Burgués. Enjoy!

I would like to introduce you to a rather unknown application of the Crazyflie 2.0 (CF2): chemical sensing. Due to its small form factor, the CF2 is an ideal platform for carrying out gas sensing missions in hazardous environments inaccessible to terrestrial robots and bigger drones. One example is searching for victims and hazardous gas leaks inside the pockets that form within the wreckage of collapsed buildings in the aftermath of an earthquake or explosion.

To evaluate the suitability of the CF2 for these tasks, I developed a custom deck, named the MOX deck, to interface two metal oxide semiconductor (MOX) gas sensors to the CF2. Then, I performed experiments in a large indoor environment (160 m²) with a gas source placed in challenging positions for the drone, for example hidden in the ceiling of the room or inside a power outlet box. From the measurements collected in motion (i.e. without stopping) along a predefined 3D sweeping path that takes around 3 minutes, the CF2 builds a map of the gas distribution and identifies the most likely source location with high accuracy.

1. MOX deck

The MOX deck (Fig. 1a) contains two sockets for 4-pin Taguchi-type (TGS) gas sensors, a temperature/humidity sensor (SHT25, Sensirion AG), a dual-channel digital potentiometer (AD5242BRUZ1M, Analog Devices) and two p-type MOSFET transistors (NX2301P, NEXPERIA). I used TGS 8100 sensors (Figaro Engineering) due to their compatibility with 3.0 V logic, their power consumption of only 15 mW (the lowest on the market as of June 2016) and their miniaturized (MEMS) form factor. Since the sensor heater uses 1.8 V, two transistors (one per sensor) reduce the applied power by means of pulse width modulation (PWM). The MOX read-out circuit (Fig. 1b) is a voltage divider connected to the μC’s analog-to-digital converter (ADC). The voltage divider is powered at 3.0 V and the load resistor (RL) can be set dynamically by the potentiometer (from 60 Ω to 1 MΩ in steps of 3.9 kΩ). Dynamic configuration of the load resistor is important in MOX gas sensors due to the large dynamic range of the sensor resistance (several orders of magnitude) when exposed to different gas concentrations. The sensors were calibrated (by exposing them to several known concentrations) to convert the raw output into parts-per-million (ppm) concentration units.
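Given the divider topology above, recovering the sensor resistance from the ADC reading is a one-liner. Here is a minimal Python sketch; the function name and the assumption that the ADC voltage is taken across RL are ours, not from the deck firmware:

```python
def mox_resistance(v_out, r_load, v_cc=3.0):
    """Sensor resistance Rs from a divider where Vout = Vcc * RL / (RL + Rs).

    v_out: ADC voltage measured across the load resistor RL (volts)
    r_load: load resistance set on the potentiometer (ohms)
    """
    return r_load * (v_cc - v_out) / v_out

# Example: 1.0 V across a 100 kOhm load at a 3.0 V supply -> Rs = 200 kOhm
print(mox_resistance(1.0, 100e3))
```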

The initialization task of the deck driver configures the PWM, initializes the SHT25 sensor, sets the wiper position of both channels of the potentiometer and adds the MOX readout registers to the list of variables that are continuously logged and transmitted to the base station. The main task of the deck driver reads the MOX sensor output voltage and the temperature/humidity values from the SHT25 and sends them to the ground station at 10 Hz.
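On the ground-station side, subscribing to those logged variables follows the standard crazyflie-lib-python pattern. Below is a hedged sketch: the log variable names (mox.*, sht25.*) are invented placeholders, since the actual names are defined by the MOX deck driver:

```python
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.log import LogConfig
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.crazyflie.syncLogger import SyncLogger

URI = 'radio://0/80/2M'  # adjust to your Crazyflie

# 100 ms period = the 10 Hz rate mentioned in the post.
lg = LogConfig(name='mox', period_in_ms=100)
lg.add_variable('mox.vout1', 'float')          # placeholder names
lg.add_variable('mox.vout2', 'float')
lg.add_variable('sht25.temperature', 'float')
lg.add_variable('sht25.humidity', 'float')

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    with SyncLogger(scf, lg) as logger:
        for timestamp, data, _ in logger:
            print(timestamp, data)
```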

2. Experimental Arena, External Localization System and Gas Source

Experiments were performed in a large robotics laboratory (160 m², 2.7 m ceiling height) at Örebro University (Sweden). The laboratory is divided into three connected areas (R1–R3) of 132 m² and a contiguous room (R4) of 28 m² (Fig. 2). To obtain the 3D position of the drone, I used the Loco positioning system (LPS) from Bitcraze, based on ultra-wide band (UWB) radio transmitters. Six LPS anchors were positioned in known locations of the experimental arena and one LPS tag was fixed to the drone. The six LPS anchors were placed in the central area of the laboratory, shaped in two inverted triangles (below and above the flight area).

A gas leak was emulated by placing a small beaker filled with 200 mL of ethanol 96% in different locations of the arena (Fig. 4). Ethanol was used because it is non-toxic and easily detectable by MOX sensors. Two experiments were carried out to check the viability of the proposed system for gas source localization and mapping in complex environments. In the first experiment, the gas source was placed on top of a table (height = 1 m) in the small room (R4). In the second experiment, the source was placed inside the suspended ceiling (height = 2.7 m) near the entrance to the lab (R1). Since the piping system of the lab runs through the suspended ceiling, the gas source could represent a leak in one of the pipes. A 12 V DC fan (Model: AD0612HB-A70GL, ADDA Corp., Taiwan) was placed behind the beaker to facilitate dispersion of the chemicals in the environment, creating a plume. The experiments started five minutes after setting up the source and turning on the DC fan.

3. Navigation strategy

The drone was sent to fly along a predefined sweeping path consisting of two 2D rectangular sweeps at different heights (0.9 m and 1.8 m), collecting measurements in motion (Fig. 5). These two heights divide the vertical space of the lab into three parts of equal size. Flying first at the lower altitude minimizes the impact of the propellers’ downwash on the gas distribution. For safety reasons, the trajectory was designed to ensure enough clearance around obstacles and walls, and people working inside the laboratory were told to remain in their seats during the experiments. The ground station communicates the flight path to the drone as a sequence of (x, y, z) waypoints, with a target flight speed of 1.0 m/s. The CF2 reports the measured concentration and its location to the ground station every 100 ms.
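A sweep like this is easy to generate programmatically. The sketch below builds a back-and-forth waypoint list at the two heights from the post; the arena dimensions and lane spacing are illustrative assumptions, not the values used in the experiments:

```python
def sweep_waypoints(x_max=10.0, y_max=6.0, lane_spacing=1.5,
                    heights=(0.9, 1.8)):
    """Back-and-forth rectangular sweep, repeated at each height."""
    waypoints = []
    for z in heights:
        y, direction = 0.0, 1
        while y <= y_max:
            x0, x1 = (0.0, x_max) if direction > 0 else (x_max, 0.0)
            waypoints += [(x0, y, z), (x1, y, z)]
            y += lane_spacing
            direction *= -1
    return waypoints

# Each (x, y, z) tuple would be streamed to the Crazyflie as a setpoint.
print(len(sweep_waypoints()), 'waypoints')
```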

At the end of the exploration, the ground station uses all the received information to compute a 3D map of the instantaneous concentration and the ’bouts’. A ’bout’ is declared when the derivative of the sensor response exceeds a certain threshold. Bouts are produced by contact with individual gas patches and some authors use them instead of the instantaneous response (which is more affected by the slow response time of chemical sensors). For gas source localization, we compare two approaches: using the cell with maximum value in the concentration map or using the cell with maximum bout frequency. The bout frequency (bouts/min) is computed as the bout count in a 5 second sliding window multiplied by 12 (to convert it to bouts/min).
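To make the bout definition concrete, here is a small numpy sketch of both steps; the derivative threshold and the smoothing-free gradient are illustrative choices, not the exact processing from the paper:

```python
import numpy as np

def detect_bouts(concentration, dt=0.1, threshold=0.2):
    """Indices where the response derivative first exceeds the threshold."""
    deriv = np.gradient(concentration, dt)
    above = deriv > threshold
    return np.flatnonzero(above[1:] & ~above[:-1]) + 1  # rising edges only

def bout_frequency(bout_indices, n_samples, dt=0.1, window_s=5.0):
    """Bouts/min: bout count in a 5 s sliding window, multiplied by 12."""
    events = np.zeros(n_samples)
    events[bout_indices] = 1.0
    window = int(window_s / dt)
    counts = np.convolve(events, np.ones(window), mode='same')
    return counts * (60.0 / window_s)
```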

4. Results

In the first experiment, the drone took off near the entrance of the lab (R1), 17 meters downwind of a gas source located at the other end of the laboratory (R4). From the gas distribution map (Fig. 6a) it is evident that the gas source must be in R4, because the maximum concentration (35 ppm) was found there while concentrations below 5 ppm were measured in the rest of the lab. The gas plume can be outlined from the location of odor hits. The highest odor hit density (25 hits/min) was also found in R4. The cells corresponding to the maximum concentration (green star) and the maximum odor hit frequency (blue triangle) were 0.94 m and 1.16 m from the true source location, respectively.

In the second experiment, the gas source was located just above the starting point of the exploration, hidden in the suspended ceiling (Fig. 7). The maximum concentration in the test room was measured when the drone flew at h = 1.8 m, highlighting the importance of sampling in 3D for localization and mapping of elevated gas sources. However, since the source is presumably not directly exposed to the environment, concentrations below 3 ppm were found in most locations of the room, which complicates the gas source localization task. The concentration and odor hit maps suggest that the gas source is located at the division between R1 and R2, which represents localization errors of 4.0 m and 3.31 m, respectively.

5. Conclusions

These results suggest that the CF2 can be used for gas source localization and mapping in large indoor environments. In contrast to previous works in which long measurement times were taken at predefined or adaptively chosen sampling locations, a rough approximation of such maps can be obtained in very short time with concentration measurements acquired in motion. The obtained gas distribution maps seem coherent with respect to the true source location and wind direction, and not only enable the detection of the source with relatively small localization errors but also provide a rich visual interpretation of the gas distribution.

If you are interested in more details about this work, take a look at the journal paper or drop me an email at <jburgues8 at gmail dot com> or leave a comment on the blog!

The new Crazyflie 2.1

The Crazyflie 2.0 was released almost 4 years ago now. When we released it we wanted to avoid limiting our users in hardware. We over-designed it with lots of features and power we weren’t using at the time of release. We also put in the deck connector so we could keep users updated with new hardware without having to replace their Crazyflies.

Over the years there have been thousands of users and lots of feedback on the product. Most of it has been great, but there have of course also been issues that needed to be addressed. The original design concept is still working, with new decks coming out and still free CPU cycles, flash and RAM. So instead of major updates, we decided to focus on fixing the issues we’ve seen while keeping backwards compatibility for our users.

So today we’re really excited to announce we’ve released the Crazyflie 2.1! The updated version of the Crazyflie brings improved flight performance, better durability and improved radio stability.

Here’s a list of the updates:

  • Better radio performance and external antenna support: With a new radio power amplifier we’ve improved the link quality and added support for dual antennas (on-board chip antenna and external antenna via u.FL connector)
  • Better power button: We’ve gotten feedback that the power button breaks too easily, so we’ve replaced it with a sturdier alternative.
  • Improved battery cable fastening: To avoid weakening of the cables over time they now run through a cable relief.
  • Improved sensors: To improve flight performance we’ve upgraded the IMU and the pressure sensor. The new Crazyflie uses the drone-specialized BMI088 and BMP388 sensor combo by Bosch Sensortec. It lowers drift and avoids accelerometer saturation, which makes the IMU more trustworthy.

It’s important to note that the Crazyflie 2.1 is a drop-in replacement for the Crazyflie 2.0. All spare parts and decks are compatible with both the Crazyflie 2.0 and the 2.1.

We even took it so far that the same binary can be flashed on the Crazyflie 2.0 and 2.1 without any special care. The binary will automatically activate the right drivers which means working with mixed groups of 2.0 and 2.1 isn’t a hassle.

When releasing the Crazyflie 2.1 we’ve also updated all the bundles to contain the new version. But even though you can’t get the bundles with the Crazyflie 2.0 anymore, there are still some Crazyflie 2.0 units left from the last batch that can be purchased in the E-store.

We are glad to announce that we have manufactured the first batch of Lighthouse positioning decks, and hopefully it will be ready to ship by the end of the month!

The Lighthouse positioning deck is a Crazyflie 2 deck capable of receiving IR signals from HTC Vive tracking base stations (i.e. lighthouses). The base stations work by spinning IR laser beams that are received by the deck to measure the angle at which the base station sees the receiver. This allows the Crazyflie to estimate its position with great accuracy and thus fly autonomously.
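To give an intuition for the sweep-to-angle conversion, here is a simplified Python sketch for a V1 base station. The 60 Hz rotor rate and the centering convention are our assumptions for illustration; the real protocol also involves sync pulses and per-axis timing that we gloss over here:

```python
import math

ROTATION_PERIOD_S = 1.0 / 60.0  # assumed V1 rotor rate: 60 sweeps per second

def sweep_angle(t_since_sync_s):
    """Angle at which the base station 'sees' the receiver for one sweep.

    t_since_sync_s: time between the sync flash (rotor at a known phase)
    and the laser plane hitting the receiver. Centered so that 0 rad is
    straight out of the base station.
    """
    return 2.0 * math.pi * (t_since_sync_s / ROTATION_PERIOD_S) - math.pi / 2.0

# A hit 1/240 s after sync is a quarter turn, i.e. straight ahead: 0 rad.
print(sweep_angle(1.0 / 240.0))
```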

The board we produced is very similar, architecture-wise, to the prototype we showed in previous blog posts. The main physical difference is that we now only have horizontal receivers. This change was made because we do not yet have a satisfactory mechanical solution for mounting vertical IR receivers, and we judged that horizontal-only sensors already provide great performance for autonomous flight. Functionally, it means that the Crazyflie should fly below the base stations to be able to position itself; we found that flying ~40 cm below the base stations gave good flight performance. We will continue looking at solutions for making a deck with more receivers to increase the flight space in the future.

The Lighthouse deck acquires the IR pulses transmitted by the lighthouses, and the Crazyflie can then interpret these pulses to estimate its position. We also added soldering pads for a 2.54 mm pin header, which makes it possible to interface other microcontroller boards to the deck:

Lighthouse deck architecture

HTC has released 2 versions of the base stations, which are incompatible with each other. Version 1 supports 2 base stations per system, and version 2 can support more than 2. We have good initial support for version 1, both in the deck and in the Crazyflie. Version 2 is currently being worked on, but early work shows that the deck should be compatible with version 2 after only a firmware update.

This leads to the current state of the product. The boards have been manufactured and we have received them, but they are currently programmed with a test firmware. As previously stated, the basic functionality is there, but we still don’t have a finished bootloader. As soon as this is finished and tested, we will start flashing all the boards. After that, it is just a matter of adding them to the web store stock and they will be ready to ship!

For now we consider this deck early access, which means that we will document it in the wiki and that the software will still be heavily developed. For example, an early limitation that will be worked on is that it is currently required to run SteamVR on a computer to set up the system; this means that you need a full Vive VR setup, or at least a Vive controller or tracker, to set up your flight space. Eventually we want to make it possible to set up the system with only base stations and a Crazyflie, without using SteamVR.

We have added the deck to our web store so that you can subscribe to get notified as soon as it is in stock; we will of course post on the blog with more information when this happens. In the meantime, we can share again the video we made for the holidays, shot with three Crazyflie 2.1s equipped with the Lighthouse deck using two V1 base stations:

While the Crazyflie is nowadays mostly used connected to a computer, we have mobile clients that can be used to fly a Crazyflie using either Bluetooth Low Energy or a Crazyradio, with an Android device or an iPhone.

The Android client is currently the most advanced one, with support for some decks. The goal of these mobile clients is to at least allow flying a Crazyflie manually, though a lot more could be done by supporting the various decks of the Crazyflie (for example, with the Flow deck one might imagine drawing a trajectory on the phone and having the Crazyflie follow it :-)).

As for development, we have not been very active in developing the mobile clients and are relying mostly on contributions. So if you are interested in adding functionality, do not hesitate to drop by the GitHub page of the Android or iOS client and propose features and pull requests.

Android client

In 2018, Fred, the maintainer of the Android client, worked hard to stabilize the app and solve its last few bugs and problems. A new version incorporating all the fixes was released last week.

Last year the Android client saw big internal changes, including separating all Crazyflie protocol handling into a separate Java library. All these changes will make it easier to implement new functionality in the future, and to make that functionality available to Android as well as to any Java program using the Crazyflie Java lib.

iPhone client

The iPhone client has seen much less activity in 2018. It has been kept up to date with new versions of the Swift language and has seen some bug fixes, all thanks to GitHub contributors.

There have been reports of a couple of pretty bad bugs in the latest release; as soon as these bugs are fixed, we plan to release a new version of the iPhone client. The new version will also include the possibility to control the Crazyflie by tilting the phone, and with the bug fixes in place we should be off to a good start in 2019.

Windows client

The Windows UAP Crazyflie client is the least advanced of all the mobile clients. It has the particularity of working on Windows 10 for computers as well as for phones. This makes it the only implementation of a Bluetooth Low Energy Crazyflie client for computers. However, with Windows 10 for phones being pretty much dead now, the future of this client might be more on the computer side, if any.

Anyway, if anyone is interested in improving the Windows client, we will gladly test and merge pull requests when they come.

2018 has ended, and we at Bitcraze are now back from a short holiday break, looking forward to 2019. There are already a lot of things rolling that will give results in 2019, and we wanted to do a short post about what we are currently planning.

Product-wise, we still have a couple of products in the final stages of production that we will be releasing during Q1 or early Q2 2019. Crazyflie 2.1 production is ongoing, and we have started a first batch of the Lighthouse deck.

We have talked about both projects in previous posts, but if you want to see what the Lighthouse positioning is capable of, you can look at the holiday video we pushed two weeks ago:

This video was made using two HTC Vive base stations (V1) and prototypes of the Lighthouse deck we are currently producing. We intend this deck to be the first version in a series of Lighthouse receiver decks: we had to simplify the design by using only horizontal IR receivers in order to be able to produce a first batch now, which meant making some compromises on the usable flight space. We will talk more about that in a future blog post, but as you can see in the video, the system is promising.

We will also try to travel a bit more this year to meet you. IROS 2018 was an awesome experience that allowed us to meet a lot of our users and get a better understanding of how the Crazyflie is and can be used. This year we are aiming at visiting FOSDEM 2019 in Brussels as well as exhibiting at ICRA 2019 in Montreal and IROS 2019 in Macau. None of them are completely finalized yet, so stay tuned on the blog for future announcements. If you have other suggestions of conferences or events you would like to see us attend, please tell us in the comments or drop us an email.

Finally, on the company side, we are looking at growing the team and changing office. We are currently 5 at Bitcraze, which means that we have a lot to do, and growing would allow us to expand the Crazyflie ecosystem with more functionality and cool stuff. We are also going to move to a new office where we will have a dedicated flight lab. Until now we have had our office in a co-working space and used about 4×4 m of our office space as a flight lab. In the new office we will have a dedicated 100 m² flight space, which will allow us to work more on swarm support and to improve the LPS system in a bigger space.

A few weeks ago we wrote about the release of the Multi-ranging deck and the new STEM ranging bundle.

The STEM ranging bundle is a great addition in the classroom for a wide range of students. By combining the Flow deck v2’s time-of-flight distance sensor and optical flow sensor with the Multi-ranger deck’s ability to measure distance to objects, the Crazyflie gets position and spatial awareness.

We have shot a video that shows the bundle in action!


To get started with the STEM ranging bundle, we have created a guide with step-by-step instructions. The code for the demos in the video is available in the example directory of the crazyflie-lib-python project:

  • multiranger_push.py: When the application is launched, the Crazyflie will take off and hover. If anything gets close to the right/left/front/back sensors, the Crazyflie will move in the opposite direction (a condensed sketch of this logic is shown after this list).
  • multiranger_pointcloud.py: When the application is launched, the Crazyflie will take off and hover, and a 3D plot will show what is detected by the Multi-ranger deck sensors. By default the left/right/front/back/up sensors are plotted, but you can also add the Crazyflie position and the down sensor if you like. The Crazyflie can be moved around using the arrow keys on the keyboard, w/s for up/down and a/d for rotating CCW/CW. For more info see the documentation in the example.
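For reference, here is a condensed sketch of the push-away behavior, in the spirit of multiranger_push.py. The 0.3 m threshold and 0.5 m/s speed are illustrative choices; the URI and velocity signs follow the usual cflib conventions:

```python
import time

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.positioning.motion_commander import MotionCommander
from cflib.utils.multiranger import Multiranger

URI = 'radio://0/80/2M'  # adjust to your setup

def is_close(range_m, threshold=0.3):
    # Ranges are None when nothing is detected.
    return range_m is not None and range_m < threshold

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    with MotionCommander(scf) as mc:
        with Multiranger(scf) as mr:
            # Hold a hand above the Crazyflie to land and exit.
            while not is_close(mr.up):
                # Push away from whichever side sees an obstacle.
                vx = 0.5 * (is_close(mr.back) - is_close(mr.front))
                vy = 0.5 * (is_close(mr.right) - is_close(mr.left))
                mc.start_linear_motion(vx, vy, 0.0)
                time.sleep(0.1)
```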

We love feedback so please leave some comments in the field below!

The Crazyflie Z-ranger and Flow decks share one sensor: the VL53 ranging sensor, which provides mm-precision by measuring the time of flight of laser pulses. The manufacturer of this sensor has released an improved version, the VL53L1x, that works at longer distances compared to the old one. The old sensor worked at distances up to 1 meter, while the new one works up to 2 meters.

The Z-ranger deck interfaces a VL53 sensor facing downwards underneath the Crazyflie, which makes it possible to implement very precise altitude hold by using the range to the floor as the absolute height.

The Flow deck has both a down-facing VL53 for height measurements as well as an optical flow sensor for position measurements that allows the Crazyflie to hold its height and fly at constant velocity.

We have released both the Z-ranger V2 and the Flow V2, which achieve accurate altitude hold and position hold at much greater heights. With the Flow V2 and Z-ranger V2 it is possible to fly almost all the way up to the ceiling in an ordinary room!

Both decks are available in the Bitcraze online store.

We’re happy to announce that the Multiranger deck and the STEM ranging bundle are now available! The Multiranger deck opens up lots of exciting new possibilities when it comes to navigation and classroom activities. One of its benefits is that you can do more with the Crazyflie without getting into hardcore control algorithms. Some ideas we’ve had are:

  • Working on algorithms for autonomously driving obstacle courses
  • Scanning rooms and environments and mapping them out (like below)
  • Creating fun applications like air hockey or ping pong where you can play around with the Crazyflie

We’re still working on a nice video for presenting the product (like the STEM bundle video) but until it’s finished here’s a screenshot of using the STEM ranging bundle to map out a small course.

If you want to try out some of the Multiranger deck demos they are available in the example directory of the crazyflie-lib-python project (note they require the Flow deck as well):

  • multiranger_push.py: When the application is launched, the Crazyflie will take off and hover. If anything gets close to the right/left/front/back sensors, the Crazyflie will move in the opposite direction.
  • multiranger_pointcloud.py: When the application is launched, the Crazyflie will take off and hover, and a 3D plot will show what is detected by the Multiranger deck sensors. By default the left/right/front/back/up sensors are plotted, but you can also add the Crazyflie position and the down sensor if you like. The Crazyflie can be moved around using the arrow keys on the keyboard, w/s for up/down and a/d for rotating CCW/CW. For more info see the documentation in the example.

If you have any other ideas that might be cool to try, make sure to leave them in the comments below!

This week we have a guest blog post from Percy Jaiswal about quad rotor dynamics. Enjoy!

  1. Components
    Although most of us are aware of how a quadcopter / drone looks, a generic picture (it’s of a drone called the Crazyflie, from Bitcraze) is shown above. It consists of 4 motors, control circuitry in the middle, and propellers mounted on its rotors. For reasons described in the section below, 2 of the rotors rotate in the clockwise (CW) direction and the remaining 2 counterclockwise (CCW). CW and CCW motors are placed next to each other to negate the moment (described in the next section) generated by them. Propellers come in different configurations, like CW or CCW rotating, pusher or tractor, with different radius, pitch, etc.
  2. Force and Moments

    Each rotating propeller produces two kinds of forces. When a rotor rotates, its propeller produces an upward thrust given by F = K_f * ω² (shown by forces F1, F2, F3 and F4 in Figure 2), where ω (omega) is the rotation rate of the rotor measured in radians/second. The constant K_f depends upon many factors like the torque proportionality constant, back-EMF, the density of the surrounding air, the area swept by the propeller, etc. The values for K_f and K_m (mentioned below) are generally found empirically: we mount the motor and propeller on a load cell and measure the force and moment for different motor speeds. Refer to “System Identification of the Crazyflie 2.0 Nano Quadrocopter” by Julian Forster for details regarding the measurement of K_f and K_m.
    The total upward thrust generated by all 4 propellers is the sum of the individual thrusts, for i = 1 to 4:
    F_total = Σ F_i = K_f * (ω1² + ω2² + ω3² + ω4²)
    Apart from the upward force, a rotating propeller also generates an opposing rotational torque or moment (shown by moments M1, M2, M3 and M4 in Figure 2). For example, a rotor spinning in the CW direction will produce a torque which causes the body of the drone to spin in the CCW direction. A demonstration of this effect can be seen here. This rotating torque is given by M = K_m * ω².
    The moment generated by a motor is in the opposite direction to its spinning; hence CW and CCW spinning motors generate opposite moments. This is the reason why we have CW and CCW rotating motors: in a steady hover, the moments from the 2 CW and the 2 CCW rotating rotors cancel each other out, and the drone doesn’t keep spinning about its body axis (also called yaw).
    Moments / Torques M1, M2, M3 and M4 are moments generated by individual motors. The overall Moment generated around drone’s z axis (Z_b in Figure 2) is given by summation of all 4 moments. Remember that CW and CCW moments will have opposite signs.
    moment_z = M1 + M2 + M3 + M4; since CW and CCW moments have opposite signs, in the ideal condition (or whenever we don’t want any yaw (rotation around the z axis) movement) moment_z will be close to 0.
    Contrary to moment_z, the calculation of the overall moment / torque generated around the x and y axes is a little different. Looking at Figure 2, we can see that motors 1 and 3 lie on the x axis of the drone, so they won’t contribute any moment / torque around the x axis. However, the difference between the forces generated by motors 2 and 4 will cause the drone’s body to tilt around its x axis, and this is what constitutes the overall moment / torque around the x axis, which is given by
    moment_x = (F2 - F4) * L, where L is the distance from the axis of rotation of the rotors to the center of the quadrotor. By the same logic,
    moment_y = (F3 - F1) * L.
    Summing it up, the moment around all 3 axes can be denoted by the vector
    moment = [moment_x, moment_y, moment_z]^T (^T for transpose)
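Putting section 2 together in code, the sketch below maps the four rotor speeds to total thrust and body moments. K_F, K_M and ARM_L are placeholder values (in reality they are identified empirically, as described above), and the assumption that motors 1 and 3 spin CW is ours:

```python
import numpy as np

K_F, K_M, ARM_L = 1.0e-8, 1.0e-10, 0.046  # illustrative constants

def forces_and_moments(omega):
    """omega: rotor speeds [w1, w2, w3, w4] in rad/s; motors 1 and 3 on the x axis."""
    F = K_F * np.square(omega)                              # thrusts F1..F4
    M = K_M * np.square(omega) * np.array([1, -1, 1, -1])   # CW/CCW moment signs
    moment = np.array([
        (F[1] - F[3]) * ARM_L,   # moment_x = (F2 - F4) * L
        (F[2] - F[0]) * ARM_L,   # moment_y = (F3 - F1) * L
        M.sum(),                 # moment_z = M1 + M2 + M3 + M4
    ])
    return F.sum(), moment

f_total, moment = forces_and_moments([2000.0, 2000.0, 2000.0, 2000.0])
print(f_total, moment)  # with equal speeds the moment vector is ~0 (hover)
```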

  3. Orientation and position

    A drone has positional as well as orientational attributes, meaning it can be at any position (x, y, z coordinates) and can make certain angles (theta (θ), phi (φ) and psi (ψ)) with respect to the world / inertial frame. The figure above shows theta (θ), phi (φ) and psi (ψ) more clearly.
  4. Moving in z and x & y direction

    Whenever a drone is stationary, it’s in alignment with the world frame, meaning its z axis is in the same direction as the world’s gravitational field. In such a case, if the drone wants to move upwards, it just needs to set the proper propeller rotation speed and it starts moving in the z direction according to total generated force minus gravity. However, if it wants to move in the x or y direction, it first needs to orient itself (making the required theta or phi angle). When that happens, the total thrust generated by the four propellers (F_thrust) has a component in the z direction and in the x/y direction, as shown in the 2D figure above. For the example shown, using basic trigonometry, we can find the z and y directional forces with the following equations, where phi is the angle between the drone’s body z axis and the world frame:
    F_y = F_thrust * sinφ
    F_z = F_thrust * cosφ
  5. World and Body frame

    To measure the above-stated theta, phi and psi angles, the drone’s onboard IMU sensor is usually used. This sensor measures how fast the drone’s body is rotating around its body frame and provides that angular velocity as its output. When processing the IMU outputs, we need to be careful and understand that the angular velocities it reports are not with respect to the world frame, but with respect to its body frame. The diagram above shows both of these frames for reference.
  6. Rotation Matrix

    To convert coordinates from the body frame to the world frame and vice versa, we use a 3×3 matrix called the rotation matrix. That is, if V is a vector in the world coordinates and V’ is the same vector expressed in the body-fixed coordinates, then the following relations hold:
    V’ = R * V and
    V = R^T * V’, where R is the rotation matrix and R^T is its transpose.
    To understand this relation completely, let’s begin by understanding rotation in 2D. Let the vector V be rotated by an angle β to get the new vector V′. Let r = |V|. Then we have the relations below:
    vx = r * cosα and vy = r * sinα
    v’x = r * cos(α+β) and v’y = r * sin(α+β). Expanding this, we get
    v’x = r * (cosα * cosβ - sinα * sinβ) and v’y = r * (sinα * cosβ + cosα * sinβ)
    v’x = vx * cosβ - vy * sinβ and v’y = vy * cosβ + vx * sinβ
    This is exactly what we want because the desired point V’ is described in terms of the original point V and the rotation angle β. In conclusion, we can write this in matrix notation as

    [v’x]   [cosβ  -sinβ] [vx]
    [v’y] = [sinβ   cosβ] [vy]
    Going from 2D to 3D is relatively simple in the rotation matrix’s case. In fact, the 2D matrix we just derived can actually be thought of as a 3D rotation matrix for rotation around the z axis. Hence, for a rotation around the z axis, the rotation matrix would be

    Rz(β) = [cosβ  -sinβ  0]
            [sinβ   cosβ  0]
            [0      0     1]
    The 0, 0, 1 values in the last row and column indicate that the z coordinate of the rotated point (v’z) is the same as the original point’s z coordinate (vz). We will call this z-axis rotation matrix Rz(β). Extrapolating the same logic to rotations around the x and y axes, we get the values for Rx(β) and Ry(β):

    Rx(β) = [1  0     0    ]        Ry(β) = [ cosβ  0  sinβ]
            [0  cosβ  -sinβ]                [ 0     1  0   ]
            [0  sinβ   cosβ]                [-sinβ  0  cosβ]
    And the final 3D rotation matrix is simply the matrix product of the above three rotation matrices:
    R = Rz(ψ) * Ry(θ) * Rx(φ), where psi (ψ), theta (θ) and phi (φ) are the rotations around the z, y and x axes respectively.
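As a quick sanity check of section 6, here is a small numpy sketch that builds the elementary rotations and composes them in the Rz * Ry * Rx order used above:

```python
import numpy as np

def Rx(a):
    return np.array([[1, 0, 0],
                     [0, np.cos(a), -np.sin(a)],
                     [0, np.sin(a),  np.cos(a)]])

def Ry(a):
    return np.array([[ np.cos(a), 0, np.sin(a)],
                     [0, 1, 0],
                     [-np.sin(a), 0, np.cos(a)]])

def Rz(a):
    return np.array([[np.cos(a), -np.sin(a), 0],
                     [np.sin(a),  np.cos(a), 0],
                     [0, 0, 1]])

def rotation_matrix(phi, theta, psi):
    """R = Rz(psi) * Ry(theta) * Rx(phi), as in section 6."""
    return Rz(psi) @ Ry(theta) @ Rx(phi)

# Rotation matrices are orthonormal: R^T is the inverse of R.
R = rotation_matrix(0.1, 0.2, 0.3)
print(np.allclose(R.T @ R, np.eye(3)))  # True
```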

  7. State Vector and its Derivative
    As our drone has 6 degrees of freedom, we usually track it by monitoring these six parameters along with their derivatives (how they change with time) to get an accurate estimate of the drone’s position and velocity. We do this by maintaining what is often called a state vector X = [x, y, z, φ, θ, ψ, x_dot, y_dot, z_dot, p, q, r] and its derivative X_dot = [x_dot, y_dot, z_dot, φ_dot, θ_dot, ψ_dot, x_doubledot, y_doubledot, z_doubledot, p_dot, q_dot, r_dot], where x, y and z are the position of the drone in the world frame and x_dot, y_dot and z_dot are the positional / linear velocities in the world frame. φ, θ, ψ represent the drone’s attitude / orientation in the world frame, whereas φ_dot, θ_dot, ψ_dot represent the rates of change of these (Euler) angles. p, q, r are the angular velocities in the body frame, whereas p_dot, q_dot and r_dot are their derivatives (derivative = rate of change), aka the angular accelerations in the body frame. x_doubledot, y_doubledot, z_doubledot represent the linear accelerations in the world frame.
  8. Linear Acceleration
    As briefed before, whenever the propellers are moving, the drone will start moving (accelerating) in the x, y and z directions depending upon the total thrust generated by its 4 propellers (represented by F_total in the equation below) and the drone’s orientation (represented by the rotation matrix R). We know force = mass * acceleration. Ignoring the rotation matrix R, if we just consider acceleration in the z direction, it is given by
    Z_force = (mass * gravitational acceleration) - (mass * Z_acceleration)
    mass * Z_acceleration = mass * gravitational acceleration - Z_force
    and therefore Z_acceleration = gravitational acceleration - (Z_force / mass).
    Extrapolating this to the x and y directions and including the rotation matrix (for the reasons described in sections 4 and 6), the linear acceleration of the drone is given by the equation below, where m is the mass of the drone and g is the gravitational acceleration. The negative sign in front of F indicates that we are considering the gravitational force to be in the positive z direction:
    [x_doubledot, y_doubledot, z_doubledot]^T = [0, 0, g]^T + (1/m) * R * [0, 0, -F_total]^T
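A direct transcription of that equation in numpy; this follows the article’s z-down sign convention, so in a hover with R = identity and F_total = m*g the acceleration is zero:

```python
import numpy as np

def linear_acceleration(f_total, R, m, g=9.81):
    """World-frame acceleration: gravity plus rotated body-frame thrust."""
    gravity = np.array([0.0, 0.0, g])                 # +z is down here
    thrust_world = R @ np.array([0.0, 0.0, -f_total])
    return gravity + thrust_world / m

# Hover check: thrust equal to weight cancels gravity exactly.
print(linear_acceleration(0.03 * 9.81, np.eye(3), m=0.03))  # ~[0, 0, 0]
```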
  9. Angular acceleration
    Along with linear motion, owing to its rotating propellers and its orientation, the drone will also have some rotational motion. While it is convenient to have the linear equations of motion in the inertial / world frame, the rotational equations of motion are useful to us in the body frame, so that we can express rotations about the center of the quadcopter instead of about our inertial center. As mentioned in section 5, we will use the drone’s IMU to get its angular velocities. Let the outputs from the IMU be p, q and r, representing the rotational velocities around the drone’s x, y and z body axes.

    We derive the rotational equations of motion from Euler’s equations for rigid body dynamics. Expressed in vector form, Euler’s equations are written as

    I * ω_dot + ω × (I * ω) = moment

    where ω = [p, q, r]^T is the angular velocity vector, I is the inertia matrix, and moment is the vector of external moments / torques developed in section 2. Please don’t confuse the usage of ω (as angular velocity) in this section with its usage as the propeller’s rotation rate; we will stick to using ω as the rotation rate after this section. We can rewrite the above equation as

    ω_dot = I⁻¹ * (moment - ω × (I * ω))

    Replacing ω with [p, q, r]^T, expanding the moment vector and rearranging the above equation (with a diagonal inertia matrix I = diag(I_x, I_y, I_z)), we get the angular accelerations in the body frame as

    p_dot = (moment_x + (I_y - I_z) * q * r) / I_x
    q_dot = (moment_y + (I_z - I_x) * p * r) / I_y
    r_dot = (moment_z + (I_x - I_y) * p * q) / I_z
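The same computation in vector form, as a short numpy sketch; the inertia values are placeholders, since real ones come from system identification:

```python
import numpy as np

INERTIA = np.diag([1.4e-5, 1.4e-5, 2.2e-5])  # illustrative I_x, I_y, I_z (kg*m^2)

def angular_acceleration(moment, pqr, inertia=INERTIA):
    """Body-frame angular acceleration from Euler's rigid body equations."""
    omega = np.asarray(pqr)
    return np.linalg.inv(inertia) @ (np.asarray(moment) - np.cross(omega, inertia @ omega))

print(angular_acceleration([1e-6, 0.0, 0.0], [0.0, 0.0, 0.0]))
```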
  10. Rate of change of the Euler angles
    Although the drone’s orientation is originally observed in the body frame, we need to convert the body rates to Euler angle rates in the world frame. Again, we use a rotation-based transform for this purpose. The derivation of this formula is a little lengthy and is provided in Reference [6]; the result is
    φ_dot = p + q * sinφ * tanθ + r * cosφ * tanθ
    θ_dot = q * cosφ - r * sinφ
    ψ_dot = q * sinφ / cosθ + r * cosφ / cosθ
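And the corresponding numpy sketch, writing the same transform as a matrix acting on [p, q, r] (valid away from the θ = ±90° singularity where cosθ = 0):

```python
import numpy as np

def euler_rates(phi, theta, pqr):
    """Convert body rates (p, q, r) to Euler angle rates (phi_dot, theta_dot, psi_dot)."""
    T = np.array([
        [1, np.sin(phi) * np.tan(theta), np.cos(phi) * np.tan(theta)],
        [0, np.cos(phi), -np.sin(phi)],
        [0, np.sin(phi) / np.cos(theta), np.cos(phi) / np.cos(theta)],
    ])
    return T @ np.asarray(pqr)

# With zero roll and pitch the Euler rates equal the body rates.
print(euler_rates(0.0, 0.0, [0.1, 0.2, 0.3]))  # [0.1, 0.2, 0.3]
```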
  11. Recap
    So to recap what we have learnt so far
    1. A quadcopter has 4 (2 CW and 2 CCW) rotating propellers
    2. Each propeller creates a force F = K_f * ω² in the direction perpendicular to its plane and a moment M = K_m * ω² around its perpendicular axis.
    3. A drone can be at any x, y, z position and theta (θ), phi (φ) and psi (ψ) orientation.
    4. When a drone wants to move in the z direction (in the world frame), it needs to generate the appropriate force (total thrust divided by 4) on each propeller. When it wants to move in either the x or y direction (again world frame), it tilts to the required theta / phi angle along with generating the required force
    5. When tracking the drone’s motion, we need to handle data in the world and body frames
    6. To convert angular data from the body frame to the world frame, a rotation matrix is used
    7. To track the drone’s movements, we keep track of its state vector X and its derivative X_dot
    8. Rotating propellers generate linear accelerations in the x, y and z directions as per the equation shown in section 8
    9. Rotating propellers generate angular accelerations around the x, y and z axes in the body frame as per the equations shown in section 9
    10. We convert angular velocities in the body frame to Euler angle rates in the world frame as per the equation shown in section 10.
  12. References
    I don’t want to just list the references; instead, I would like to sincerely thank the individual authors for their work, without which this article and the understanding I have gained of drone dynamics would have been almost impossible.
    1. System Identification of the Crazyflie 2.0 Nano Quadrocopter by Julian Forster — http://mikehamer.info/assets/papers/Crazyflie%20Modelling.pdf
    2. Trajectory Generation and Control for Quadrotors by Daniel Warren Mellinger — https://repository.upenn.edu/cgi/viewcontent.cgi?article=1705&context=edissertations
    3. Quadcopter Dynamics, Simulation, and Control by Andrew Gibiansky — http://andrew.gibiansky.com/downloads/pdf/Quadcopter%20Dynamics,%20Simulation,%20and%20Control.pdf
    4. A short derivation to basic rotation around the x-, y- or z-axis — http://www.sunshine2k.de/articles/RotationDerivation.pdf
    5. How do you derive the rotation matrices? — Quora — https://www.quora.com/How-do-you-derive-the-rotation-matrices
    6. Representing Attitude: Euler Angles, Unit Quaternions, and Rotation Vectors by James Diebel — https://www.astro.rug.nl/software/kapteyn/_downloads/attitude.pdf

I would like to sincerely thank the Bitcraze team for allowing me to express myself on their platform. If you liked this post, follow, like or retweet it on Twitter; it will act as encouragement for writing new posts as I continue my journey toward becoming a complete drone engineer.

Till next time….cheers!!