Category: Crazyflie

While the Crazyflie is nowadays mostly used connected to a computer, we have mobile clients that can be used to fly a Crazyflie using either Bluetooth Low Energy or a Crazyradio with an Android device or an iPhone.

The Android client is currently the most advanced one, with support for some decks. The goal of these mobile clients is to at least allow flying a Crazyflie manually, though a lot more could be done by supporting the various decks of the Crazyflie (for example, using the Flow deck, one might imagine drawing a trajectory on the phone and having the Crazyflie follow it :-).

As for development, we have not been very active in the development of the mobile clients and are relying mostly on contributions. So if you are interested in adding functionality, do not hesitate to drop by the GitHub page of the Android or iOS client and propose features and pull requests.

Android client

In 2018, Fred, the maintainer of the Android client, worked hard to stabilize the app and solve the last few bugs and problems. A new version that incorporates all the fixes was released last week.

Over the last years the Android client has seen big internal changes, including separating all Crazyflie protocol handling into a separate Java library. All these changes will make it easier to implement new functionality in the future and to make that functionality available to Android as well as to any Java program using the Crazyflie Java lib.

iPhone client

The iPhone client has seen much less activity in 2018. It has been kept updated with the new versions of the Swift language and has seen some bug fixes, all thanks to GitHub contributors.

There have been reports of a couple of pretty bad bugs in the latest release; as soon as these bugs are fixed we plan to release a new version of the iPhone client. The new version will also include the possibility to control the Crazyflie by tilting the phone, and with the bug fixes in place we should be off to a good start in 2019.

Windows client

The Windows UAP Crazyflie client is the least advanced of all the mobile clients. Its particularity is that it works on Windows 10 for computers as well as for phones, which makes it the only implementation of a Bluetooth Low Energy Crazyflie client for computers. However, with Windows 10 for phones being pretty much dead now, the future of this client, if any, is probably on the computer side.

Anyway, if anyone is interested in improving the Windows client, we will gladly test and merge pull requests when they come.

2018 has ended, we at Bitcraze are now back from a short holiday break and we are looking forward to 2019. There are already a lot of things rolling that will give results in 2019 and we wanted to write a short post about what we are currently planning.

Product-wise, we still have a couple of products in the final stages of production that we will release during Q1 or early Q2 2019: Crazyflie 2.1 production is ongoing and we have started a first batch of the Lighthouse deck.

We have talked about both projects in previous posts, but if you want to see what the Lighthouse positioning is capable of you can look at the holiday video we pushed two weeks ago:

This video was made using two HTC Vive base stations V1 and prototypes of the Lighthouse deck we are currently producing. We intend this deck to be the first version in a series of Lighthouse receiver decks: we had to simplify the design by using only horizontal IR receivers in order to be able to produce a first batch now, which meant making some compromises on the usable flight space. We will talk more about that in a future blog post, but as you can see in the video the system is promising.

We will also try to travel a bit more this year to meet you. IROS 2018 was an awesome experience and allowed us to meet a lot of our users and get a better understanding of how the Crazyflie is and can be used. This year we are aiming at visiting FOSDEM 2019 in Brussels as well as exhibiting at ICRA 2019 in Montreal and IROS 2019 in Macau. None of them are completely finalized yet, so stay tuned on the blog for future announcements. If you have other suggestions of conferences or events you would like to see us attend, please tell us in the comments or drop us an email.

Finally, on the company side, we are looking at growing the team and changing office. We are currently 5 at Bitcraze, which means that we have a lot to do, and growing would allow us to expand the Crazyflie ecosystem with more functionality and cool stuff. We are also going to move to a new office where we will have a dedicated flight lab. Until now we have had our office in a co-working space and used about 4x4 m of our office space as a flight lab. In the new office we will have a dedicated 100 m² flight space, which will allow us to work more on swarm support and to improve the LPS system in a bigger space.

A few weeks ago we wrote about the release of the Multi-ranging deck and the new STEM ranging bundle.

The STEM ranging bundle is a great addition in the classroom for a wide range of students. By combining the Flow deck v2’s time-of-flight distance sensor and optical flow sensor with the Multi-ranger deck’s ability to measure distance to objects, the Crazyflie gets position and spatial awareness.

We have shot a video that shows the bundle in action!

 

To get started with the STEM ranging bundle we have created a guide with step-by-step instructions. The code for the demos in the video is available in the example directory of the crazyflie-lib-python project:

  • multiranger_push.py: When the application is launched the Crazyflie will take off and hover. If anything gets close to the right/left/front/back sensors the Crazyflie will move in the opposite direction (a condensed sketch of this logic is shown after the list). 
  • multiranger_pointcloud.py: When the application is launched the Crazyflie will take off and hover, and a 3D plot will be shown of what is detected by the Multi-ranger deck sensors. By default the left/right/front/back/up sensors are plotted, but you can also add the Crazyflie position and the down sensor if you like. The Crazyflie can be moved around using the arrow keys on the keyboard, w/s for up/down and a/d for rotating CCW/CW. For more info see the documentation in the example.
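For reference, below is a condensed sketch of the push-style logic. It is not the complete example; the URI, speed and distance threshold are placeholder values you will want to adjust, and the MotionCommander and Multiranger helpers come from cflib.

    import time

    import cflib.crtp
    from cflib.crazyflie import Crazyflie
    from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
    from cflib.positioning.motion_commander import MotionCommander
    from cflib.utils.multiranger import Multiranger

    URI = 'radio://0/80/2M'  # placeholder, adjust to your setup


    def is_close(range_m, threshold=0.2):
        # The ranger reports None when no valid measurement is available
        return range_m is not None and range_m < threshold


    if __name__ == '__main__':
        cflib.crtp.init_drivers()
        with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
            with MotionCommander(scf) as mc, Multiranger(scf) as mr:
                keep_flying = True
                while keep_flying:
                    vx = vy = 0.0
                    speed = 0.5  # m/s
                    # Move away from whichever side an obstacle is detected on
                    if is_close(mr.front):
                        vx -= speed
                    if is_close(mr.back):
                        vx += speed
                    if is_close(mr.left):
                        vy -= speed
                    if is_close(mr.right):
                        vy += speed
                    # Holding a hand above the up sensor ends the flight
                    if is_close(mr.up):
                        keep_flying = False
                    mc.start_linear_motion(vx, vy, 0.0)
                    time.sleep(0.1)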

We love feedback so please leave some comments in the field below!

The Crazyflie Z-ranger and Flow decks share one sensor: the VL53 ranging sensor that provides mm-precision by measuring the time of flight of laser pulses. The manufacturer of this sensor has released an improved version, the VL53L1x, that works for longer distances compared to the old one. The old sensor worked for distances up to 1 meter while the new one works up to 2 meters.

The Z-ranger deck interfaces a VL53 sensor facing downwards underneath the Crazyflie, which makes it possible to implement very precise altitude hold by using the ranging to the floor as an absolute height.

The Flow deck has both a down-facing VL53 for height measurements as well as an optical flow sensor for position measurements that allows the Crazyflie to hold its height and fly at constant velocity.

We have released both the Z-ranger V2 and Flow V2 decks, which allow accurate altitude hold and position hold at much higher heights. With the Flow V2 and Z-ranger V2 it is possible to fly almost all the way up to the ceiling in an ordinary room!
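As an illustration of the higher usable height, here is a minimal sketch of a high hover using the Crazyflie Python library's MotionCommander. The URI and the 1.8 m height are placeholder values, and a Flow V2 or Z-ranger V2 deck is assumed to be mounted.

    import time

    import cflib.crtp
    from cflib.crazyflie import Crazyflie
    from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
    from cflib.positioning.motion_commander import MotionCommander

    URI = 'radio://0/80/2M'  # placeholder, adjust to your setup

    cflib.crtp.init_drivers()
    with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
        # With the V2 decks, heights close to 2 m are usable
        with MotionCommander(scf, default_height=1.8) as mc:
            time.sleep(5)  # hover; landing happens when the context manager exits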

Both decks are available in the Bitcraze online store.

We’re happy to announce that the Multiranger deck and the STEM ranging bundle are now available! The Multiranger deck opens up lots of exciting new possibilities when it comes to navigation and classroom activities. One of its benefits is that you can do more with the Crazyflie without getting into the hardcore control algorithms. Some ideas we’ve had are:

  • Working on algorithms for autonomously driving obstacle courses
  • Scanning rooms and environments and mapping them out (like below)
  • Creating fun applications like air hockey or ping pong where you can play around with the Crazyflie

We’re still working on a nice video for presenting the product (like the STEM bundle video) but until it’s finished here’s a screenshot of using the STEM ranging bundle to map out a small course.

If you want to try out some of the Multiranger deck demos they are available in the example directory of the crazyflie-lib-python project (note they require the Flow deck as well):

  • multiranger_push.py: When the application is launched the Crazyflie will take off and hover. If anything gets close to the right/left/front/back sensors the Crazyflie will move in the opposite direction. 
  • multiranger_pointcloud.py: When the application is launched the Crazyflie will take off and hover, and a 3D plot will be shown of what is detected by the Multiranger deck sensors. By default the left/right/front/back/up sensors are plotted, but you can also add the Crazyflie position and the down sensor if you like. The Crazyflie can be moved around using the arrow keys on the keyboard, w/s for up/down and a/d for rotating CCW/CW. For more info see the documentation in the example.

If you have any other ideas that might be cool to try, make sure to leave them in the comments below!

This week we have a guest blog post from Percy Jaiswal about quad rotor dynamics. Enjoy!

  1. Components
    Although most of us are aware of how a quadcopter / drone looks, a generic picture of a drone (it’s the Crazyflie from Bitcraze) is shown above. It consists of 4 motors, control circuitry in the middle and propellers mounted on its rotors. For reasons described in the next section, 2 of the rotors rotate in the clockwise (CW) direction and the remaining 2 counterclockwise (CCW). CW and CCW motors are placed next to each other to negate the moments (described in the next section) generated by them. Propellers come in different configurations, like CW or CCW rotating, pusher or tractor, with different radius, pitch, etcetera.
  2. Force and Moments

    Each rotating propeller produces two kinds of forces. When a rotor rotates, its propeller produces an upward thrust given by F = K_f * ω² (shown by forces F1, F2, F3 and F4 in Figure 2), where ω (omega) is the rotation rate of the rotor measured in radians / second. The constant K_f depends upon many factors like the torque proportionality constant, back-EMF, density of the surrounding air, area swept by the propeller, etc. The values for K_f and K_m (mentioned below) are generally found empirically: we mount the motor and propeller on a load cell and measure the force and moment for different motor speeds. Refer to “System Identification of the Crazyflie 2.0 Nano Quadrocopter” by Julian Forster for details regarding the measurement of K_f and K_m.
    The total upward thrust generated by all 4 propellers is given by summing the individual thrusts F_i, for i = 1 to 4:
    F_total = F_1 + F_2 + F_3 + F_4 = K_f * (ω_1² + ω_2² + ω_3² + ω_4²)
    Apart from the upward force, a rotating propeller also generates an opposing rotating torque, or moment (shown by moments M1, M2, M3 and M4 in Figure 2). For example, a rotor spinning in the CW direction will produce a torque which causes the body of the drone to spin in the CCW direction. A demonstration of this effect can be seen here. This rotating torque is given by M = K_m * ω².
    The moment generated by a motor is in the opposite direction to its spinning, hence CW and CCW spinning motors generate opposite moments. This is the reason why we have CW and CCW rotating motors: in a steady hover, the moments from the 2 CW and 2 CCW rotating rotors cancel each other out and the drone doesn’t keep spinning about its body axis (also called yaw).
    Moments / torques M1, M2, M3 and M4 are the moments generated by the individual motors. The overall moment generated around the drone’s z axis (Z_b in Figure 2) is given by the sum of all 4 moments. Remember that CW and CCW moments have opposite signs.
    moment_z = M1 + M2 + M3 + M4. Again, CW and CCW moments have opposite signs, so in an ideal condition (or whenever we don’t want any yaw movement, i.e. rotation around the z axis) moment_z will be close to 0.
    Contrary to moment_z, the calculation of the overall moment / torque around the x and y axes is a little different. Looking at Figure 2, we can see that motors 1 and 3 lie on the x axis of the drone, so they don’t contribute to any moment around the x axis. However, a difference in the forces generated by motors 2 and 4 will cause the drone’s body to tilt around its x axis, and this is what constitutes the overall moment around the x axis, which is given by
    moment_x = (F2 - F4) * L, where L is the distance from the axis of rotation of the rotors to the center of the quadrotor. By the same logic,
    moment_y = (F3 - F1) * L.
    Summing it up, the moments around all 3 axes can be denoted by the vector below
    moment = [moment_x, moment_y, moment_z]^T (^T for Transpose)
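    To make this concrete, here is a small numerical sketch of the thrust and moment equations above (plain Python/numpy). The constants K_f, K_m and L are made-up placeholder values, not Crazyflie parameters, and the choice of which rotors spin CW is an assumption.

        import numpy as np

        # Placeholder constants for illustration only (not Crazyflie values)
        K_f = 1.7e-8   # thrust coefficient  [N / (rad/s)^2]
        K_m = 2.0e-10  # moment coefficient  [N*m / (rad/s)^2]
        L = 0.046      # distance from rotor axis to the center [m]

        def forces_and_moments(omega):
            # omega: rotor speeds [w1, w2, w3, w4] in rad/s.
            # Rotors 1 and 3 sit on the x axis, 2 and 4 on the y axis;
            # rotors 1 and 3 are assumed CW, 2 and 4 CCW (signs in moment_z).
            F = K_f * np.square(omega)              # per-rotor thrusts F1..F4
            M = K_m * np.square(omega)              # per-rotor moment magnitudes
            F_total = F.sum()
            moment_x = (F[1] - F[3]) * L            # (F2 - F4) * L
            moment_y = (F[2] - F[0]) * L            # (F3 - F1) * L
            moment_z = -M[0] + M[1] - M[2] + M[3]   # CW and CCW with opposite signs
            return F_total, np.array([moment_x, moment_y, moment_z])

        # In a perfect hover all rotors spin equally fast and the moments cancel
        print(forces_and_moments(np.array([2000.0, 2000.0, 2000.0, 2000.0])))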

  3. Orientation and position

    A drone has positional as well as orientational attributes, meaning it can be at any position (x, y, z coordinates) and can be making certain angles (theta (θ), phi (φ) and psi (ψ)) with respect to the World / inertial frame. The figure above shows theta (θ), phi (φ) and psi (ψ) more clearly.
  4. Moving in z and x & y direction

    Whenever a drone is stationary, it is in alignment with the World frame, meaning its z axis is in the same direction as the World’s gravitational field. In such a case, if the drone wants to move upwards, it just needs to set the proper propeller rotation speed and it can start moving in the z direction according to the total generated force minus gravity. However, if it wants to move in the x or y direction it first needs to orient itself (making the required theta or phi angle). When that happens, the total thrust generated by the four propellers (F_thrust) has a component in the z direction and in the x/y direction, as shown in the 2D figure above. For the example shown, using basic trigonometry, we can find the z and y directional forces with the following equations, where phi is the angle made by the drone’s body z axis with the World frame.
    F_y = F_thrust * sinϕ
    F_z = F_thrust * cosϕ
  5. World and Body frame

    To measure the above stated theta, phi and psi angles, the drone’s onboard IMU sensor is usually used. This sensor measures how fast the drone’s body is rotating around its body frame and provides those angular velocities as its output. When processing the IMU outputs, we need to be careful and understand that the angular velocities it sends are not with respect to the World frame, but with respect to its Body frame. The diagram above shows both of these frames for reference.
  6. Rotation Matrix

    To convert coordinates from Body Frame to World Frame and vice versa, we use a 3×3 matrix called Rotation Matrix. That is, if V is a vector in the world coordinates and V’ is the same vector expressed in the body-fixed coordinates, then the following relations hold:
    V’ = R * V and
    V = R^T * V’ where R is Rotation Matrix and R^T is its transpose.
    To understand this relation completely, let’s begin by understanding rotation in 2D. Let the vector V be rotated by an angle β to get the new vector V′. Let r=|V|. Then, we have the below relations:
    vx = r*cosα and vy = r*sinα
    v’x = r*cos(α+β) and v’y = r*sin(α+β). Expanding this, we get
    v’x = r * (cosα * cosβ - sinα * sinβ) and v’y = r * (sinα * cosβ + cosα * sinβ)
    v’x = vx * cosβ - vy * sinβ and v’y = vy * cosβ + vx * sinβ
    This is exactly what we want, because the desired point V’ is described in terms of the original point V and the rotation angle β. To conclude, we can write this in matrix notation as
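    Written out from the two scalar equations above, the matrix form is:

        \begin{bmatrix} v'_x \\ v'_y \end{bmatrix} =
        \begin{bmatrix} \cos\beta & -\sin\beta \\ \sin\beta & \cos\beta \end{bmatrix}
        \begin{bmatrix} v_x \\ v_y \end{bmatrix}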

    Going from 2D to 3D is relatively simple in the Rotation Matrix’s case. In fact, the 2D matrix we just derived can actually be thought of as a 3D rotation matrix for rotation around the z axis. Hence, for a rotation around the z axis the Rotation Matrix would be
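    That is, the standard z-axis rotation matrix:

        R_z(\beta) = \begin{bmatrix} \cos\beta & -\sin\beta & 0 \\ \sin\beta & \cos\beta & 0 \\ 0 & 0 & 1 \end{bmatrix}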

    The 0, 0, 1 values in the last row and column indicate that the z coordinate of the rotated point (v’z) is the same as the original point’s z coordinate (vz). We will call this z-axis Rotation Matrix Rz(β). Extrapolating the same logic to rotations around the x and y axes, we get the values for Rx(β) and Ry(β) as
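    Their standard forms are:

        R_x(\beta) = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\beta & -\sin\beta \\ 0 & \sin\beta & \cos\beta \end{bmatrix},
        \qquad
        R_y(\beta) = \begin{bmatrix} \cos\beta & 0 & \sin\beta \\ 0 & 1 & 0 \\ -\sin\beta & 0 & \cos\beta \end{bmatrix}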

    The final value for the 3D Rotation Matrix is simply the matrix product of the above three Rotation Matrices.
    R = Rz(ψ) x Ry(θ) x Rx(φ), where psi (ψ), theta (θ) and phi (φ) are the rotations around the z, y, and x axes respectively.
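    As a quick check, the composition can be written out in a few lines of plain numpy (angles in radians; nothing here is Crazyflie-specific):

        import numpy as np

        def Rx(a):
            return np.array([[1, 0, 0],
                             [0, np.cos(a), -np.sin(a)],
                             [0, np.sin(a), np.cos(a)]])

        def Ry(a):
            return np.array([[np.cos(a), 0, np.sin(a)],
                             [0, 1, 0],
                             [-np.sin(a), 0, np.cos(a)]])

        def Rz(a):
            return np.array([[np.cos(a), -np.sin(a), 0],
                             [np.sin(a), np.cos(a), 0],
                             [0, 0, 1]])

        def rotation_matrix(phi, theta, psi):
            # R = Rz(psi) @ Ry(theta) @ Rx(phi), as in the text above
            return Rz(psi) @ Ry(theta) @ Rx(phi)

        R = rotation_matrix(0.1, 0.2, 0.3)
        print(np.allclose(R.T @ R, np.eye(3)))  # rotation matrices are orthonormal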

  7. State Vector and its Derivative
    As our drone has 6 degrees of freedom, we usually track it by monitoring these six parameters along with their derivatives (how they change with time) to get an accurate estimate of the drone’s position and velocity. We do this by maintaining what is often called a state vector X = [x, y, z, φ, θ, ψ, x_dot, y_dot, z_dot, p, q, r] and its derivative X_dot = [x_dot, y_dot, z_dot, φ_dot, θ_dot, ψ_dot, x_doubledot, y_doubledot, z_doubledot, p_dot, q_dot, r_dot], where x, y and z are the position of the drone in the World frame and x_dot, y_dot and z_dot are the positional / linear velocities in the World frame. φ, θ, ψ represent the drone’s attitude / orientation in the World frame, whereas φ_dot, θ_dot, ψ_dot represent the rate of change of these (Euler) angles. p, q, r are the angular velocities in the Body frame, whereas p_dot, q_dot and r_dot are their derivatives (rates of change), i.e. the angular accelerations in the Body frame. x_doubledot, y_doubledot, z_doubledot represent the linear accelerations in the World frame.
  8. Linear Acceleration
    As mentioned before, whenever the propellers are moving, the drone will start moving (accelerating) in the x, y and z directions depending on the total thrust generated by its 4 propellers (represented by F_total in the equation below) and the drone’s orientation (represented by the Rotation Matrix R). We know force = mass * acceleration. Ignoring the Rotation Matrix R for a moment, if we just consider acceleration along z, it is given by
    Z_force = (mass * g) - (mass * Z_acceleration)
    mass * Z_acceleration = mass * g - Z_force
    and therefore Z_acceleration = g - (Z_force / mass), where g is the gravitational acceleration and Z_force is the thrust component along z.
    Extrapolating to the x and y directions and including the Rotation Matrix (for the reasons described in sections 4 and 6), the linear acceleration of the drone is given by the equation below, where m is the mass of the drone and g is the gravitational acceleration. The negative sign on F indicates that we are considering the gravitational force to be in the positive z direction.
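    With the sign convention above (gravity along the positive z axis, total thrust F_total entering with a negative sign, and R used to express the body-frame thrust in the World frame), the equation reads:

        \begin{bmatrix} \ddot{x} \\ \ddot{y} \\ \ddot{z} \end{bmatrix} =
        \begin{bmatrix} 0 \\ 0 \\ g \end{bmatrix} +
        \frac{1}{m} R \begin{bmatrix} 0 \\ 0 \\ -F_{total} \end{bmatrix}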
  9. Angular acceleration
    Along with linear motion, owing to its rotating propellers and its orientation, the drone will also have some rotational motion. While it is convenient to have the linear equations of motion in the inertial / World frame, the rotational equations of motion are useful to us in the Body frame, so that we can express rotations about the center of the quadcopter instead of about our inertial center. As mentioned in section 5, we use the drone’s IMU to get its angular velocities. Let the outputs from the IMU be p, q and r, representing the rotational velocities around the drone’s x, y and z body axes.

    We derive the rotational equations of motion from Euler’s equations for rigid body dynamics. Expressed in vector form, Euler’s equations are written as
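    In standard form, Euler's equations read:

        I \, \dot{\omega} + \omega \times (I \, \omega) = \mathrm{moment}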

    where ω = [p, q, r]^T is the angular velocity vector, I is the inertia matrix, and moment is the vector of external moments / torques developed in section 2. Please don’t get confused between the usage of ω as angular velocity in this section and its earlier usage as the propeller’s rotation rate; we will go back to using ω as the rotation rate after this section. We can rewrite the above equation as
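    Solving for the angular acceleration:

        \dot{\omega} = I^{-1} \left( \mathrm{moment} - \omega \times (I \, \omega) \right)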

    Replacing ω with [p, q, r]^T, expanding the moment vector and rearranging the above equation, we get the angular accelerations in the Body frame as
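    Assuming a diagonal inertia matrix diag(I_x, I_y, I_z), the usual simplification for a symmetric quadcopter, the components are:

        \dot{p} = \frac{\mathrm{moment}_x - (I_z - I_y)\, q\, r}{I_x}, \qquad
        \dot{q} = \frac{\mathrm{moment}_y - (I_x - I_z)\, r\, p}{I_y}, \qquad
        \dot{r} = \frac{\mathrm{moment}_z - (I_y - I_x)\, p\, q}{I_z}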

  10. Rate of change of the Euler angles
    Although the drone’s rotational motion is measured in the Body frame, we need to convert the body rates to Euler angle rates in the World frame. Again, we use a rotation matrix for this purpose, as per the formula below. The derivation of this formula is a little lengthy and is provided in Reference [6].
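    The standard relation, mapping the body rates p, q, r to the Euler angle rates, is:

        \begin{bmatrix} \dot{\phi} \\ \dot{\theta} \\ \dot{\psi} \end{bmatrix} =
        \begin{bmatrix} 1 & \sin\phi \tan\theta & \cos\phi \tan\theta \\ 0 & \cos\phi & -\sin\phi \\ 0 & \sin\phi / \cos\theta & \cos\phi / \cos\theta \end{bmatrix}
        \begin{bmatrix} p \\ q \\ r \end{bmatrix}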
  11. Recap
    So, to recap what we have learnt so far:
    1. A quadcopter has 4 (2 CW and 2 CCW) rotating propellers
    2. Each propeller creates a force F = K_f * ω² in the direction perpendicular to its plane and a moment M = K_m * ω² around its perpendicular axis.
    3. A drone can be at any x, y, z position and theta (θ), phi (φ) and psi (ψ) orientation.
    4. When a drone wants to move in the z direction (in the World frame) it needs to generate the appropriate force (total thrust divided by 4) on each propeller. When it wants to move in either the x or y direction (again in the World frame), it sets the required theta / phi angle along with generating the required force.
    5. When tracking the drone's motion, we need to handle data in both the World and Body frames
    6. To convert angular data from the Body frame to the World frame, a Rotation Matrix is used
    7. To track the drone's movements, we keep track of its state vector X and its derivative X_dot
    8. Rotating propellers generate linear accelerations in the x, y and z directions as per the equation shown in section 8
    9. Rotating propellers generate angular accelerations around the x, y and z axes in the Body frame as per the equation shown in section 9
    10. We convert angular velocities in the Body frame to World frame Euler angle rates as per the equation shown in section 10.
  12. References
    I don’t want to just list the references; instead I would like to sincerely thank the individual authors for their work, without which this article and the understanding I have gained of drone dynamics would have been almost impossible.
    1. System Identification of the Crazyflie 2.0 Nano Quadrocopter by Julian Forster — http://mikehamer.info/assets/papers/Crazyflie%20Modelling.pdf
    2. Trajectory Generation and Control for Quadrotors by Daniel Warren Mellinger — https://repository.upenn.edu/cgi/viewcontent.cgi?article=1705&context=edissertations
    3. Quadcopter Dynamics, Simulation, and Control by Andrew Gibiansky — http://andrew.gibiansky.com/downloads/pdf/Quadcopter%20Dynamics,%20Simulation,%20and%20Control.pdf
    4. A short derivation to basic rotation around the x-, y- or z-axis — http://www.sunshine2k.de/articles/RotationDerivation.pdf
    5. How do you derive the rotation matrices? — Quora — https://www.quora.com/How-do-you-derive-the-rotation-matrices
    6. Representing Attitude: Euler Angles, Unit Quaternions, and Rotation Vectors by James Diebel — https://www.astro.rug.nl/software/kapteyn/_downloads/attitude.pdf

I would like to sincerely thank the Bitcraze team for allowing me to express myself on their platform. If you liked this post, follow, like or retweet it on Twitter; it will act as encouragement for writing new posts as I continue my journey to become a complete drone engineer.

Till next time….cheers!!

In August we were invited by Marion from ETH Zurich to help out with this year's PolyHack, which is organized by Telejob and whose theme this year was drones. We really like these kinds of events, but our reality is that we normally don't have enough time to participate. For this occasion, though, we had the opportunity to both have fun and see how our products work when used during an event like this. Two birds with one stone, and the decision was made. Together with one of the main sponsors, ELCA, we organized the flying postman challenge:

Drones seem to be the future of post deliveries, but how is it going to work? Join us to reproduce a swarm of drones delivering parcels through a city to have a glimpse at this future!

The challenge the teams got was to deliver as many parcels as possible within 5 minutes in a miniature city, 4m x 4m, using Crazyflies. Since the Crazyflies can't carry that much payload the parcels were just digital/imaginary, but they had to be picked up at a pick-up zone. Teams were allowed to use up to three Crazyflies simultaneously to increase capacity. For more details check out the challenge description.

To manage the challenge, ELCA developed the CrazyServ, which exposes a REST API to control Crazyflies (wrapping the high-level position commander) and to pick up parcels. One nice benefit of a server is that it can keep track of which parcels have been picked up and delivered, making the scoring fully automatic.

Bitcraze's part in the challenge was to bring drones, technical support and our Loco Positioning System to make up the 4m x 4m city. Or actually three of them, as there were six teams competing for the victory. The initial information was that the three systems would be installed in separate rooms, far away from each other, but we ended up having them side by side. That left us with some live-hacking, changing from TDoA 2 to TDoA 3 so the anchors would not interfere with each other. We ended up using 12 anchors in total, which gave enough precision for the PolyHackers to complete their challenge.

The PolyHack was a success and we had a great time. The winning team in our challenge, Electek Innovation, managed to deliver 19 parcels during the 5 minutes with the use of a “loop” system. Congrats and well done! If you got inspired by this hackathon, the CrazyServ is available on GitHub! Together with e.g. a Swarm bundle it shouldn't be too hard to reproduce.

Thanks Telejob for letting us take part in this great event!

 

We have a collaboration with Qualisys, a Swedish manufacturer of top-of-the-line motion capture systems. Similar to us, they are passionate about what they do, work on high-tech products and, to make it even better, are located in Gothenburg, just a couple of hours away by train. If you are not familiar with motion capture systems, it is a system that can track objects fitted with reflective markers in space using high resolution cameras. The precision/accuracy is very good (sub-millimeter) and it can be used to track more or less anything, such as the movements of a human body or the position of a robot, for instance a Crazyflie. The position of a Crazyflie is calculated by the MoCap system and, by sending it to the drone via radio, the Crazyflie can fly autonomously.

Qualisys

We are super happy to get the opportunity to work with MoCap systems and make them an integral part of the Bitcraze ecosystem. We have already added support in Crazyswarm for the Qualisys system and soon there will be a tab in the Crazyflie python client for basic autonomous flight using a Qualisys system. We will release a passive MoCap deck in the near future that will make it easy to attach reflective markers to a Crazyflie in a well-known configuration, see this blog post for more information. Furthermore, we are looking at making an active marker deck that utilizes Qualisys' active marker technology to both position and identify an object at the same time.

Recently we spent a day in the large lab at Qualisys. We played with the LPS system in a larger setup, experimented with passive MoCap deck configurations and finally tried to fly a swarm.

Martin and Tobias configuring MoCap decks

Unfortunately we ran out of time and we tried to push the envelope a bit too far, so we never managed to fly the full sequence without crashes; on the other hand, getting that close in a couple of hours is not too bad. Even though the full swarm did not work out, we learned new things and had a lot of fun. Thanks Martin and everyone at Qualisys!

 

If you are looking for a motion capture system and want more information about Qualisys, please do not hesitate to contact us or Qualisys.

Last week we focused on making a release of nearly all our firmware and software. This was done mainly to support the new products we will release this fall, but it also contains a lot of other functionality that has been added since the previous release. In this blog post we will describe the most important features of this release.

New Loco Positioning status and configuration tab

New deck support

The Crazyflie firmware and Crazyflie client 2018.10 adds support for a range of new decks that are about to be released:

  • Flow deck V2 and Z-Ranger V2: New versions of the Flow and Z-Ranger decks that use the new VL53L1 distance sensor. Drivers are implemented in the Crazyflie firmware and the client has been updated to allow flying up to 2 meters in height-hold and hover modes when the new decks are detected.
  • Multiranger deck: A driver for the new Multiranger deck is implemented in the Crazyflie firmware. Support code is now present in the lib, as well as an example implementing the push demo that makes the Crazyflie fly in hover mode using the Flow deck and move away from obstacles:

The Flow deck V2 is already available in our webstore. The Z-Ranger V2 and Multiranger will be available in the following weeks; stay tuned on the blog for updated information.

Crazyswarm support

During the year, functionality implemented for the Crazyswarm project has been merged back to the Crazyflie firmware master branch. Practically it means that the Crazyflie firmware 2018.10 is the first stable version to support Crazyswarm. The main features implemented by Crazyswarm are:

  • A modular controller and estimator framework that allows switching the estimator or the controller at runtime. Practically it means that it is no longer required to recompile the firmware to use a different controller.
  • Addition of a high-level commander that is able to generate setpoints for the controller from within the Crazyflie. The high-level commander is usable both from Crazyswarm and from the Crazyflie python library. It currently has commands to take off, land, go to a setpoint and follow a polynomial trajectory, and it is made in such a way that it can be extended in the future (a minimal usage sketch is shown after this list).
  • Addition of the Mellinger controller: a new controller that allows flying much tighter and more precise trajectories than the PID controller. It is tuned pretty tight, so it is currently mostly usable with a motion capture system or Lighthouse for positioning, together with the high-level commander.
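For those who want to try the high-level commander from the Python library, here is a minimal sketch. The URI, heights and durations are placeholder values, and enabling the commander.enHighLevel parameter is assumed to be needed on this firmware generation; see the lib's examples for the authoritative version.

    import time

    import cflib.crtp
    from cflib.crazyflie import Crazyflie
    from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

    URI = 'radio://0/80/2M'  # placeholder, adjust to your setup

    cflib.crtp.init_drivers()
    with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
        cf = scf.cf
        # Assumption: the high-level commander has to be enabled explicitly
        cf.param.set_value('commander.enHighLevel', '1')
        hlc = cf.high_level_commander

        hlc.takeoff(0.5, 2.0)                               # take off to 0.5 m in 2 s
        time.sleep(2.0)
        hlc.go_to(0.5, 0.0, 0.5, 0.0, 3.0, relative=False)  # x, y, z, yaw, duration
        time.sleep(3.0)
        hlc.land(0.0, 2.0)                                  # land in 2 s
        time.sleep(2.0)
        hlc.stop()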

Improved and more stable Loco Positioning System

A lot of work has been put into the Loco Positioning System (LPS) this summer. The result of this work is a new ranging mode: TDoA3. TDoA3 allows flying as many Crazyflies as we want in the system and adding as many anchors as needed, see our previous blog post for more information. With this release TDoA3 is added as a stable ranging mode for LPS. The added features related to LPS are:

  • Added TDoA3 as a ranging mode in the LPS-Node-firmware, the Crazyflie 2.0 firmware and the Crazyflie client
  • Revamped the Crazyflie client LPS tab and communication protocol to handle more than 8 anchors
  • Implementation of a new outlier detector for TDoA2 and TDoA3 that drastically reduces positioning noise and improves flight quality

Release notes and downloads

As usual the release builds and release notes are available on GitHub. The Crazyflie client and lib are also available as Python pip packages, cfclient and cflib.