
We’re happy to announce that the Multiranger and the STEM ranging bundle are now available! The Multiranger deck opens up lots of exciting new possibilities for navigation and classroom activities. One of its benefits is that you can do more with the Crazyflie without getting into hardcore control algorithms. Some ideas we’ve had are:

  • Working on algorithms for autonomously driving obstacle courses
  • Scanning rooms and environments and mapping them out (like below)
  • Creating fun applications like air hockey or ping pong where you can play around with the Crazyflie

We’re still working on a nice video presenting the product (like the STEM bundle video), but until it’s finished here’s a screenshot of using the STEM ranging bundle to map out a small course.

If you want to try out some of the Multiranger deck demos, they are available in the examples directory of the crazyflie-lib-python project (note that they require the Flow deck as well):

  • multiranger_push.py: When the application is launched the Crazyflie takes off and hovers. If anything comes close to the right/left/front/back sensors, the Crazyflie moves in the opposite direction (a minimal sketch of this logic is shown after the list).
  • multiranger_pointcloud.py: When the application is launched the Crazyflie takes off and hovers, and a 3D plot is shown of what the Multiranger deck sensors detect. By default the left/right/front/back/up sensors are plotted, but you can also add the Crazyflie position and the down sensor if you like. The Crazyflie can be moved around using the arrow keys on the keyboard, w/s for up/down and a/d for rotating CCW/CW. For more info see the documentation in the example.
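For reference, here is a condensed sketch of the push behaviour along the lines of multiranger_push.py. It is not the full example: the 0.3 m threshold, the 0.5 m/s speed and the radio URI are placeholder values, so check the example in the lib for the real thing.

```python
import time

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.positioning.motion_commander import MotionCommander
from cflib.utils.multiranger import Multiranger

URI = 'radio://0/80/2M'  # adjust to your Crazyflie's radio address


def is_close(distance, threshold=0.3):
    # The Multiranger properties return None when nothing is in range
    return distance is not None and distance < threshold


if __name__ == '__main__':
    cflib.crtp.init_drivers()
    with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
        with MotionCommander(scf) as mc:          # takes off and hovers
            with Multiranger(scf) as ranger:
                while not is_close(ranger.up):    # hold a hand above to land
                    vx = vy = 0.0
                    if is_close(ranger.front):
                        vx -= 0.5                 # pushed from the front -> back off
                    if is_close(ranger.back):
                        vx += 0.5
                    if is_close(ranger.right):
                        vy += 0.5                 # +y is to the left in the body frame
                    if is_close(ranger.left):
                        vy -= 0.5
                    mc.start_linear_motion(vx, vy, 0.0)
                    time.sleep(0.1)
```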

If you have any other ideas that might be cool to try, make sure to leave them in the comments below!

This week we have a guest blog post from Percy Jaiswal about quad rotor dynamics. Enjoy!

  1. Components
    Although most of us know what a quadcopter / drone looks like, a generic picture (it’s of a drone called Crazyflie from Bitcraze) is shown above. It consists of 4 motors, control circuitry in the middle and propellers mounted on its rotors. For reasons described in the section below, 2 of the rotors rotate in the clockwise (CW) direction and the remaining 2 in the counterclockwise (CCW) direction. CW and CCW motors are placed next to each other to negate the moment (described in the next section) generated by them. Propellers come in different configurations: CW or CCW rotating, pusher or tractor, with different radius, pitch, etcetera.
  2. Force and Moments

    Each rotating propeller produces two kinds of forces. When a rotor rotates, its propeller produces an upward thrust given by F = K_f * ω² (shown as forces F1, F2, F3 and F4 in Figure 2), where ω (omega) is the rotation rate of the rotor measured in radians / second. The constant K_f depends on many factors like the torque proportionality constant, back-EMF, density of the surrounding air, area swept by the propeller, etc. The values of K_f and K_m (mentioned below) are generally found empirically: we mount the motor and propeller on a load cell and measure the force and moment for different motor speeds. Refer to “System Identification of the Crazyflie 2.0 Nano Quadrocopter” by Julian Forster for details regarding the measurement of K_f and K_m.
    The total upward thrust generated by all 4 propellers is given by summing the individual thrusts F_i for i = 1 to 4:
    F_total = F1 + F2 + F3 + F4 = K_f * (ω1² + ω2² + ω3² + ω4²)
    Apart from the upward force, a rotating propeller also generates an opposing rotating torque or moment (shown as moments M1, M2, M3 and M4 in Figure 2). For example, a rotor spinning in the CW direction produces a torque that causes the body of the drone to spin in the CCW direction. A demonstration of this effect can be seen here. This rotating torque is given by M = K_m * ω².
    The moment generated by a motor is in the opposite direction to its spinning, hence CW and CCW spinning motors generate opposite moments. This is the reason we have CW and CCW rotating motors: in a steady hover, the moments from the 2 CW and the 2 CCW rotating rotors cancel each other out and the drone doesn’t keep spinning about its body z axis (also called yaw).
    Moments / torques M1, M2, M3 and M4 are the moments generated by the individual motors. The overall moment generated around the drone’s z axis (Z_b in Figure 2) is given by the sum of all 4 moments. Remember that CW and CCW moments have opposite signs.
    moment_z = M1 + M2 + M3 + M4. Again, CW and CCW moments have opposite signs, so in an ideal hover (or whenever we don’t want any yaw, i.e. rotation around the z axis) moment_z will be close to 0.
    Contrary to moment_z, the overall moments around the x and y axes are calculated a little differently. Looking at Figure 2, we can see that motors 1 and 3 lie on the x axis of the drone, so they do not contribute to any moment around the x axis. However, a difference in the forces generated by motors 2 and 4 will cause the drone’s body to tilt around its x axis, and this is what constitutes the overall moment around the x axis, which is given by
    moment_x = (F2 - F4) * L, where L is the distance from the axis of rotation of the rotors to the center of the quadrotor. By the same logic,
    moment_y = (F3 - F1) * L.
    Summing it up, the moments around all 3 axes can be denoted by the vector
    moment = [moment_x, moment_y, moment_z]^T (^T for transpose)
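    As a small numerical illustration of this section, here is a sketch that computes the per-rotor thrusts, the total thrust and the moment vector with numpy, assuming the motor layout of Figure 2 (motors 1 and 3 on the x axis, motors 2 and 4 on the y axis). The constants K_f, K_m and L, the rotor speeds and the CW/CCW sign convention are made-up placeholder values, not Crazyflie data:

```python
import numpy as np

k_f = 1.0e-6     # thrust constant [N/(rad/s)^2], placeholder - found empirically
k_m = 1.0e-8     # moment constant [N*m/(rad/s)^2], placeholder - found empirically
L   = 0.046      # distance from rotor axis to center [m], placeholder

omega = np.array([2000.0, 2100.0, 2000.0, 1900.0])   # rotor speeds w1..w4 [rad/s]
spin  = np.array([1.0, -1.0, 1.0, -1.0])             # alternating CW/CCW (assumed sign convention)

F = k_f * omega**2            # per-rotor thrust, F_i = K_f * w_i^2
F_total = F.sum()             # total upward thrust

moment = np.array([
    (F[1] - F[3]) * L,                 # moment_x = (F2 - F4) * L
    (F[2] - F[0]) * L,                 # moment_y = (F3 - F1) * L
    np.sum(spin * k_m * omega**2),     # moment_z = M1 + M2 + M3 + M4 (CW/CCW with opposite signs)
])
print(F_total, moment)
```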

  3. Orientation and position

    A drone has positional as well as orientational attributes, meaning it can be at any position (x, y, z coordinates) and can be making certain angles (theta (θ), phi (φ) and psi (ψ)) with respect to the world / inertial frame. The figure above shows theta (θ), phi (φ) and psi (ψ) more clearly.
  4. Moving in z and x & y direction

    Whenever a drone is stationary, it is aligned with the world frame, meaning its z axis points in the same direction as the world’s gravitational field. In that case, if the drone wants to move upwards, it just needs to set the proper propeller rotation speed and it starts moving in the z direction according to total generated thrust minus gravity. However, if it wants to move in the x or y direction it first needs to orient itself (making the required theta or phi angle). When that happens, the total thrust generated by the four propellers (F_thrust) has a component in the z direction and a component in the x/y direction, as shown in the 2D figure above. For the example shown, using basic trigonometry, we can find the z and y directional forces with the following equations, where phi is the angle between the drone’s body z axis and the world frame:
    F_y = F_thrust * sin(φ)
    F_z = F_thrust * cos(φ)
  5. World and Body frame

    To measure the theta, phi and psi angles stated above, the drone’s onboard IMU sensor is usually used. This sensor measures how fast the drone’s body is rotating around its body frame axes and provides these angular velocities as its output. When processing the IMU output, we need to be careful and understand that the angular velocities it reports are not with respect to the world frame, but with respect to the body frame. The diagram above shows both frames for reference.
  6. Rotation Matrix

    To convert coordinates from the body frame to the world frame and vice versa, we use a 3×3 matrix called the rotation matrix. That is, if V is a vector in world coordinates and V’ is the same vector expressed in body-fixed coordinates, then the following relations hold:
    V’ = R * V and
    V = R^T * V’ where R is Rotation Matrix and R^T is its transpose.
    To understand this relation completely, let’s begin by understanding rotation in 2D. Let the vector V, at an angle α from the x axis, be rotated by an angle β to get the new vector V′. Let r = |V|. Then we have the relations below:
    vx = r*cosα and vy = r*sinα
    v’x = r*cos(α+β) and v’y = r*sin(α+β). Expanding this, we get
    v’x = r * (cosα * cosβ - sinα * sinβ) and v’y = r * (sinα * cosβ + cosα * sinβ)
    v’x = vx * cosβ - vy * sinβ and v’y = vy * cosβ + vx * sinβ
    This is exactly what we want: the rotated point V’ is described in terms of the original point V and the rotation angle β. To conclude, we can write this in matrix notation as
    [v’x, v’y]^T = [[cosβ, -sinβ], [sinβ, cosβ]] * [vx, vy]^T

    Going from 2D to 3D is relatively simple in the rotation matrix’s case. In fact, the 2D matrix we just derived can be thought of as a 3D rotation matrix for rotation around the z axis. Hence, for a rotation around the z axis the rotation matrix is
    Rz(β) = [[cosβ, -sinβ, 0], [sinβ, cosβ, 0], [0, 0, 1]]

    The 0, 0, 1 values in the last row and column indicate that the z coordinate of the rotated point (v’z) is the same as the original point’s z coordinate (vz). We will call this z-axis rotation matrix Rz(β). Extrapolating the same logic to rotations around the x and y axes, we get
    Rx(β) = [[1, 0, 0], [0, cosβ, -sinβ], [0, sinβ, cosβ]] and Ry(β) = [[cosβ, 0, sinβ], [0, 1, 0], [-sinβ, 0, cosβ]]

    And the final rotation matrix for 3D motion is simply the matrix product (not a cross product) of the three rotation matrices above:
    R = Rz(ψ) * Ry(θ) * Rx(φ), where psi (ψ), theta (θ) and phi (φ) are the rotations around the z, y and x axes respectively.
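    As a quick check of this section, the sketch below builds Rx, Ry and Rz with numpy, composes them in the Z-Y-X order written above and verifies the V’ = R * V / V = R^T * V’ relation (which holds because a rotation matrix is orthonormal). The angles are arbitrary example values:

```python
import numpy as np

def Rx(a):
    return np.array([[1, 0, 0],
                     [0, np.cos(a), -np.sin(a)],
                     [0, np.sin(a),  np.cos(a)]])

def Ry(a):
    return np.array([[ np.cos(a), 0, np.sin(a)],
                     [ 0,         1, 0        ],
                     [-np.sin(a), 0, np.cos(a)]])

def Rz(a):
    return np.array([[np.cos(a), -np.sin(a), 0],
                     [np.sin(a),  np.cos(a), 0],
                     [0,          0,         1]])

phi, theta, psi = 0.1, -0.2, 0.5      # roll, pitch, yaw [rad], example values
R = Rz(psi) @ Ry(theta) @ Rx(phi)     # matrix product of the three rotations

v = np.array([1.0, 0.0, 0.0])
v_rotated = R @ v                     # V' = R * V
assert np.allclose(R.T @ v_rotated, v)   # V = R^T * V', since R^-1 = R^T
```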

  7. State Vector and its Derivative
    As our drone has 6 degrees of freedom, we usually track these six parameters along with their derivatives (how they change with time) to get an accurate estimate of the drone’s position and velocity. We do this by maintaining what is often called a state vector X = [x, y, z, φ, θ, ψ, x_dot, y_dot, z_dot, p, q, r] and its derivative X_dot = [x_dot, y_dot, z_dot, φ_dot, θ_dot, ψ_dot, x_doubledot, y_doubledot, z_doubledot, p_dot, q_dot, r_dot], where x, y and z are the position of the drone in the world frame and x_dot, y_dot and z_dot are the positional / linear velocities in the world frame. φ, θ and ψ represent the drone’s attitude / orientation in the world frame, whereas φ_dot, θ_dot and ψ_dot represent the rates of change of these (Euler) angles. p, q and r are the angular velocities in the body frame, whereas p_dot, q_dot and r_dot are their derivatives (rates of change), i.e. the angular accelerations in the body frame. x_doubledot, y_doubledot and z_doubledot represent the linear accelerations in the world frame.
  8. Linear Acceleration
    As mentioned before, whenever the propellers are spinning the drone will accelerate in the x, y and z directions depending on the total thrust generated by its 4 propellers (represented by F_total in the equation below) and the drone’s orientation (represented by the rotation matrix R). We know force = mass * acceleration. Ignoring the rotation matrix R, if we just consider the acceleration in the z direction, it is given by
    Z_force = (mass * g) - (mass * Z_acceleration)
    mass * Z_acceleration = (mass * g) - Z_force
    and therefore Z_acceleration = g - (Z_force / mass), where g is the gravitational acceleration.
    Extrapolating this to the x and y directions and including the rotation matrix (for the reasons described in sections 4 and 6), the linear acceleration of the drone is described by the equation below, where m is the mass of the drone and g is the gravitational acceleration. The negative sign in front of F indicates that we consider gravity to act in the positive z direction.
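    A numpy sketch of this equation, using the sign convention stated above (gravity along the positive z axis, total thrust along the negative body z axis) and the rotation matrix R from section 6, could look like this:

```python
import numpy as np

def linear_acceleration(F_total, R, m, g=9.81):
    # [x_ddot, y_ddot, z_ddot] = g * e_z + (1/m) * R * [0, 0, -F_total]
    gravity = np.array([0.0, 0.0, g])
    thrust_body = np.array([0.0, 0.0, -F_total])
    return gravity + (R @ thrust_body) / m

# Example: level attitude (R = identity), placeholder mass and thrust values
a = linear_acceleration(0.30, np.eye(3), m=0.027)
print(a)
```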
  9. Angular acceleration
    Along with linear motion, owing to its rotating propellers and its orientation, the drone will also have some rotational motion. While it is convenient to have the linear equations of motion in the inertial / world frame, the rotational equations of motion are useful to us in the body frame, so that we can express rotations about the center of the quadcopter instead of about our inertial center. As mentioned in section 5, we will use the drone’s IMU to get its angular velocities. Let the outputs from the IMU be p, q and r, representing the rotational velocities around the drone’s x, y and z body axes.

    We derive the rotational equations of motion from Euler’s equations for rigid body dynamics. Expressed in vector form, Euler’s equations are written as

    where ω = [p, q, r]^T is the angular velocity vector, I is the inertia matrix, and moment is the vector of external moments / torques developed in section 2. Please don’t confuse the usage of ω as angular velocity in this section with its usage as the propeller rotation rate; we will stick to using ω as the rotation rate after this section. We can rewrite the above equation as

    Replacing ω with [p, q, r]^T, expanding the moment vector and rearranging the above equation, we get the angular accelerations in the body frame as
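    A numpy sketch of this step (Euler’s equation solved for the angular acceleration in the body frame, with placeholder inertia values) could look like this:

```python
import numpy as np

I = np.diag([1.4e-5, 1.4e-5, 2.2e-5])   # inertia matrix [kg*m^2], placeholder diagonal values

def angular_acceleration(moment, pqr):
    # I * omega_dot + omega x (I * omega) = moment  ->  solve for omega_dot = [p_dot, q_dot, r_dot]
    omega = np.asarray(pqr)
    return np.linalg.inv(I) @ (np.asarray(moment) - np.cross(omega, I @ omega))
```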

  10. Rate of change of the Euler angles
    Although the drone’s angular velocities are measured in the body frame, we need to convert them to rates of change of the Euler angles in the world frame. Again, we use a rotation-like matrix for this purpose, as per the formula below. The derivation of this formula is a little long and is provided in reference [6].
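    For the Z-Y-X Euler angle convention used in this article (see reference [6]), the conversion from body rates to Euler angle rates can be sketched in numpy as:

```python
import numpy as np

def euler_rates(phi, theta, pqr):
    # Maps body angular velocity [p, q, r] to [phi_dot, theta_dot, psi_dot].
    # Note the singularity at theta = +/-90 degrees (cos(theta) = 0).
    T = np.array([
        [1, np.sin(phi) * np.tan(theta), np.cos(phi) * np.tan(theta)],
        [0, np.cos(phi),                -np.sin(phi)],
        [0, np.sin(phi) / np.cos(theta), np.cos(phi) / np.cos(theta)],
    ])
    return T @ np.asarray(pqr)
```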
  11. Recap
    So to recap what we have learnt so far
    1. A quadcopter has 4 (2 CW and 2 CCW) rotating propellers
    2. Each propeller creates a force F = K_f * ω² in the direction perpendicular to its plane and a moment M = K_m * ω² around its perpendicular axis.
    3. A drone can be at any x, y, z position and theta (θ), phi (φ) and psi (ψ) orientation.
    4. When a drone wants to move in the z direction (in the world frame) it needs to generate the appropriate force (total thrust divided by 4) on each propeller. When it wants to move in either the x or y direction (again in the world frame), it makes the respective theta / phi angle along with generating the required force
    5. When tracking drone’s motion, we need to handle data in World and Body Frames
    6. To convert angular data from the body frame to the world frame, a rotation matrix is used
    7. To track drone’s movements, we keep track of its state vector X and its derivative X_dot
    8. Rotating propellers generate linear accelerations in x, y and z direction as per equation shown in section 8
    9. Rotating propellers generate angular accelerations around the x, y and z axes in the body frame as per the equation shown in section 9
    10. We convert angular velocities in Body Frame to World Frame Euler angle velocities as per equation shown in section 10.
  12. References
    I don’t want to just list the references; instead I would like to sincerely thank the individual authors for their work, without which this article and the understanding I have gained of drone dynamics would have been almost impossible.
    1. System Identification of the Crazyflie 2.0 Nano Quadrocopter by Julian Forster — http://mikehamer.info/assets/papers/Crazyflie%20Modelling.pdf
    2. Trajectory Generation and Control for Quadrotors by Daniel Warren Mellinger — https://repository.upenn.edu/cgi/viewcontent.cgi?article=1705&context=edissertations
    3. Quadcopter Dynamics, Simulation, and Control by Andrew Gibiansky — http://andrew.gibiansky.com/downloads/pdf/Quadcopter%20Dynamics,%20Simulation,%20and%20Control.pdf
    4. A short derivation to basic rotation around the x-, y- or z-axis — http://www.sunshine2k.de/articles/RotationDerivation.pdf
    5. How do you derive the rotation matrices? — Quora — https://www.quora.com/How-do-you-derive-the-rotation-matrices
    6. Representing Attitude: Euler Angles, Unit Quaternions, and Rotation Vectors by James Diebel — https://www.astro.rug.nl/software/kapteyn/_downloads/attitude.pdf

I would like to sincerely thank the Bitcraze team for allowing me to express myself on their platform. If you liked this post, follow, like or retweet it on Twitter; it will act as encouragement for writing new posts as I continue my journey towards becoming a complete drone engineer.

Till next time….cheers!!

In August we were invited by Marion from ETH Zurich to help out with this year’s PolyHack, which is organized by Telejob and whose theme was drones. We really like this kind of event, but the reality is that we normally don’t have enough time to participate. For this occasion, though, we had the opportunity to both have fun and see how our products work when used during an event like this. Two birds with one stone, and the decision was made. Together with one of the main sponsors, ELCA, we organized the flying postman challenge:

Drones seem to be the future of post deliveries, but how is it going to work? Join us to reproduce a swarm of drones delivering parcels through a city to have a glimpse at this future!

The challenge the teams got was to deliver as many parcels as possible within 5 minutes in a miniature city, 4m x 4m, using Crazyflies. Since the Crazyflies can’t carry that much payload the parcels were just digital/imaginary, but they had to be picked up at a pick-up zone. The teams were allowed to use up to three Crazyflies simultaneously to increase capacity. For more details check out the challenge description.

To manage the challenge, ELCA developed the CrazyServ, which exposes a REST API to control Crazyflies (wrapping the high-level position commander) and to pick up parcels. One nice benefit of a server is that it can keep track of which parcels have been picked up and delivered, making the scoring fully automatic.

Bitcraze’s part in the challenge was to bring drones, technical support and our Loco Positioning System to make up the 4m x 4m city. Or actually three of them, as there were going to be six teams competing for the victory. The initial information was that the three systems would be installed in separate rooms, far away from each other, but we ended up having them side by side. That left us with some live hacking, changing from TDoA 2 to TDoA 3 so that the anchors would not interfere with each other. We ended up using 12 anchors in total, which gave enough precision for the PolyHackers to complete their challenge.

The PolyHack was a success and we had a great time. The winning team in our challenge, Electek Innovation, managed to deliver 19 parcels during the 5 minutes with the use of a “loop” system. Congrats and well done! If you got inspired by this hackathon, the CrazyServ is available on Github! Together with e.g. a swarm bundle it shouldn’t be too hard to reproduce.

Thanks Telejob for letting us take part of this great event!


We have a collaboration with Qualisys, a Swedish manufacturer of top-of-the-line motion capture systems. Similar to us, they are passionate about what they do and work on high-tech products, and to make it even better, they are located in Gothenburg, just a couple of hours away by train. If you are not familiar with motion capture (MoCap) systems, it is a system that tracks objects fitted with reflective markers using high-resolution cameras. The precision/accuracy is very good (sub-millimeter) and it can be used to track more or less anything, such as the movements of a human body or the position of a robot, for instance a Crazyflie. The position of the Crazyflie is calculated by the MoCap system, and by sending it to the drone via radio, the Crazyflie can fly autonomously.

Qualisys

We are super happy to get the opportunity to work with MoCap systems and make them an integral part of the Bitcraze ecosystem. We have already added support for the Qualisys system in Crazyswarm, and soon there will be a tab in the Crazyflie python client for basic autonomous flight using a Qualisys system. We will release a passive MoCap deck in the near future that will make it easy to attach reflective markers to a Crazyflie in a well-known configuration, see this blog post for more information. Furthermore, we are looking at making an active marker deck that utilizes Qualisys’ active marker technology to both position and identify an object at the same time.

Recently we spent a day in Qualisys’ large lab. We played with the LPS system in a larger setup, experimented with passive MoCap deck configurations and finally tried to fly a swarm.

Martin and Tobias configuring MoCap decks

Unfortunately we ran out of time, and since we tried to push the envelope a bit too far we never managed to fly the full sequence without crashes. On the other hand, getting that close in a couple of hours is not too bad. Even though the full swarm did not work out, we learned new things and had a lot of fun. Thanks Martin and everyone at Qualisys!


If you are looking for a motion capture system and want more information about Qualisys, please do not hesitate to contact us or Qualisys.

Last week we focused on making a release of nearly all our firmware and software. This was done mainly to support the new products we will release this fall, but it also contains a lot of other functionality that has been added since the previous release. In this blog post we describe the most important features of the release.

New Loco Positioning status and configuration tab

New deck support

The Crazyflie firmware and Crazyflie client 2018.10 adds support for a range of new decks that are about to be released:

  • Flow deck V2 and Z-Ranger V2: New versions of the Flow and Z-Ranger decks that use the new VL53L1 distance sensor. Drivers are implemented in the Crazyflie firmware and the client has been updated to allow flying up to 2 meters in height-hold and hover modes when the new decks are detected.
  • Multiranger deck: A driver for the new Multiranger deck is implemented in the Crazyflie firmware, support code is now present in the lib, and there is an example implementing the push demo that makes the Crazyflie fly in hover mode using the Flow deck and move away from obstacles:

The Flow deck V2 is already available in our webstore. The Z-Ranger V2 and Multiranger will be available in the following weeks, stay tuned on the blog for updated information.

Crazyswarm support

During the year, functionality implemented for the Crazyswarm project has been merged back into the Crazyflie firmware master branch. Practically, it means that Crazyflie firmware 2018.10 is the first stable version to support Crazyswarm. The main features brought in by Crazyswarm are:

  • Modular controller and estimator framework that allows switching the estimator or the controller at runtime. Practically, it means that it is no longer required to recompile the firmware to use a different controller (see the sketch after this list).
  • Addition of a high-level commander that is able to generate setpoints for the controller from within the Crazyflie. The high-level commander is usable both from Crazyswarm and from the Crazyflie python library. It currently has commands to take off, land, go to a setpoint and follow a polynomial trajectory, and it is made in such a way that it can be extended in the future.
  • Addition of the Mellinger controller: a new controller that allows flying much tighter and more precise trajectories than the PID controller. It is tuned pretty tight, so it is currently mostly usable with a motion capture system or Lighthouse for positioning, together with the high-level commander.
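As an illustration of the runtime switching, the snippet below sets the estimator and controller from the Python lib. The parameter names and values (stabilizer.estimator / stabilizer.controller, with 2 meaning Kalman / Mellinger) are given from memory, so treat them as an assumption and double-check against the firmware’s parameter documentation:

```python
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M'   # adjust to your Crazyflie

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    # Switch estimator and controller at runtime instead of recompiling the firmware
    scf.cf.param.set_value('stabilizer.estimator', '2')    # assumed: 2 = Kalman estimator
    scf.cf.param.set_value('stabilizer.controller', '2')   # assumed: 2 = Mellinger, 1 = PID
```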

Improved and more stable Loco Positioning System

A lot of work has been put into the Loco Positioning System (LPS) this summer. The result of this work is a new ranging mode: TDoA3. TDoA3 allows flying as many Crazyflies as we want in the system and adding as many anchors as needed, see our previous blog post for more information. With this release, TDoA3 is added as a stable ranging mode for the LPS. The added features related to the LPS are:

  • Added TDoA3 as a ranging mode in the LPS-Node-firmware, the Crazyflie 2.0 firmware and the Crazyflie client
  • Revamped the Crazyflie client LPS tab and communication protocol to handle more than 8 anchors
  • Implemented a new outlier detector for TDoA2 and TDoA3 that drastically reduces positioning noise and improves flight quality

Release notes and downloads

As usual, the release builds and release notes are available on Github. The Crazyflie client and lib are also available as Python pip packages, cfclient and cflib.

In this blog post we will describe one of the demos we were running at IROS and how it was implemented. Conceptually this demo is based on the same ideas as our ICRA 2017 demo, but the implementation is completely new and much cleaner.

The demo is fully autonomous (no computer in the loop) but it requires an external positioning system. We flew it using either the Loco Positioning System or the prototype Lighthouse system.
A button has been added to the LPS deck to start the demo. When the button is pressed, the Crazyflie waits for position lock, takes off and repeats a predefined spiral trajectory until the battery runs out, at which point it goes back to the door of the cage and lands.
For some reason we forgot to shoot a video at IROS, so a reproduced version from the (messy) office will have to do instead; imagine a 2×2 m net cage around the Crazyflie.

Implementation

As mentioned in an earlier blog post, the demo uses the high-level commander originally developed by Wolfgang Hoenig and James Alan Preiss for Crazyswarm. We prototyped everything in Python (sending commands to the Crazyflie via Crazyradio) to quickly get started and design the demo. Designing trajectories for the high-level commander is not trivial and it took some time to get it right. What we wanted was a downward spiral motion followed by going back up along the z axis in the centre of the spiral. The high-level commander is a bit picky about discontinuities, so we used sines for height and radius to generate a smooth trajectory.

Trajectories in the high-level commander are defined as a number of pieces, each describing x, y, z and yaw for a short part of the full trajectory. When flying the trajectory, the pieces are traversed one after the other. Each piece is described by 4 polynomials with 8 terms each, one polynomial per x, y, z and yaw. The tricky part is to find the polynomials, and we decided to do it by cutting our trajectory up into segments (4 per revolution), generating coordinates for a number of points along each segment and finally using numpy.polyfit() to fit polynomials to the points, roughly as in the sketch below.
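A rough sketch of that piece-generation step (with arbitrary spiral parameters, not the exact values used in the demo) could look like the following; note that numpy.polyfit returns the highest-order coefficient first while the trajectory format stores the constant term first, hence the reversal:

```python
import numpy as np

segment_time = 2.0                        # duration of one piece [s]
t = np.linspace(0.0, segment_time, 20)    # sample points along the segment

angle  = 0.5 * np.pi * (t / segment_time)            # a quarter of a revolution per piece
radius = 0.7 + 0.3 * np.sin(2 * np.pi * t / 16.0)    # smooth (sine-based) radius
height = 1.0 + 0.5 * np.sin(2 * np.pi * t / 16.0)    # smooth (sine-based) height

x = radius * np.cos(angle)
y = radius * np.sin(angle)
z = height
yaw = np.zeros_like(t)

# One 8-term (degree 7) polynomial per axis for this piece
piece = [np.polyfit(t, values, 7)[::-1] for values in (x, y, z, yaw)]
```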

When we were happy with the trajectory it was time to move it to the Crazyflie. Everything is implemented in the app.c file and is essentially a timer loop with a state machine issuing the same commands that we did from Python (such as take off, go to and start trajectory). A number of functions in the firmware had to be exposed globally for this to work, maybe not correct from an architectural point of view, but one has to do what one has to do to get the demo running :-) The full source code is available on Github. Note that the makefile is hardcoded for the Crazyflie 2.1; if you want to play with the code on a CF 2.0 you have to update the sensor setting.

This approach led to an idea of a possible future app API (for apps running in the Crazyflie) containing similar functionality as the python lib. This would make it easy to prototype an app in python and then port it to firmware.

Controllers

The standard PID controller is very forgiving and usually handles noise and outliers from the positioning system fairly well. We used it with the LPS system since there is some noise in the estimated position in an ultra-wideband system. The Lighthouse system, on the other hand, is much more precise, so we switched to the Mellinger controller when using it. The Mellinger controller is more agile but also more sensitive to position errors and tends to flip when something unexpected happens. It is possible to use the Mellinger with the LPS as well, but the probability of a crash was higher and we prioritised a carefree demo over agility. An extra bonus with the Mellinger controller is that it also handles yaw (as opposed to the PID controller), and we added this when flying with the Lighthouse.

Going faster

Since the precision of the Lighthouse positioning system is so much better, we increased the speed to add some extra excitement. It turned out to be so good that the Crazyflie repeatedly almost touched the panels at the back without any problems, over and over again!

One of the reasons we designed the trajectory the way we did was actually to make it possible to fly multiple copters at the same time: the trajectories never cross. As long as the Crazyflies are not hit by downwash from a copter too close above, all is good. Since the demo is fully autonomous and the copters have no knowledge of each other, we simply started them at appropriate intervals to separate them in space. We managed to fly three Crazyflies simultaneously with a fairly high degree of stability this way.

Last week half of Bitcraze (Kristoffer, Tobias and Arnaud) was at IROS 2018, where we had an exhibitor booth. We had a great week and met so many interesting and inspiring people, both users of the Crazyflie and people curious about what we do. Thanks to everyone who passed by the booth; it is awesome to hear how the Crazyflie is used and how we can improve it even more.

This year we invited Qualisys to share the booth with us. They kindly provided a motion capture system and we had the pleasure of being joined by Martin to help us out and present Qualisys.

Demo-wise we had prepared a bunch of demos, which you can read about in our previous post about IROS. It won’t surprise anyone to hear that not everything worked as planned. The Lighthouse demo did not work when we set it up in the booth (it did in the office!), but some live hacking solved the problem on Tuesday. We also had unexpected issues with the Crazyswarm demo: our landing pad design and flight trajectory worked very well in the office, but in the booth we experienced much more instability, which prevented us from successfully flying and landing all 6 Crazyflies in the Crazyswarm. We still need to investigate what happened. The autonomous demos, both the one using the UWB Loco Positioning System and the one using Lighthouse (once fixed), were surprisingly robust: they do not require a connection to a computer and they worked almost all the time; when they failed, they failed without drama and could be reset very quickly.

Overall, we were able to accumulate flight time and experience much quicker during this last week than in the last months; now we have a lot of things to test and improve and also a lot of things we can be much more confident about. We fixed and improved the demos during the event and we will write more blog posts in the coming weeks about things we developed and improved for and during IROS.

To conclude, thanks again to everyone who dropped by the booth. This kind of event always makes us come back with a boost of motivation and fresh new ideas, and it is all thanks to you!

The last couple of weeks have been really intense since we’ve been busy preparing for IROS. Finally it’s here, and with it we’re releasing a few new products!

We’re excited to announce that during the fall we will be releasing the following new products:

  • Crazyflie 2.1: The Crazyflie 2.0 was released almost 4 years ago. Over the years there have been thousands of users and lots of feedback on the product. Most of it great, but there are a few things we’ve wanted to fix. Now with the updated 2.1 version we finally have the chance to do it. Here’s a quick list of the updates:
    • Better radio performance and external antenna support: With a new radio power amplifier we’ve improved the link quality and added support for dual antennas (on-board chip antenna and external antenna via u.FL connector)
    • Better power button: We’ve gotten feedback that the power button breaks too easily, so now we’ve replaced it with a more solid alternative.
    • Improved battery cable fastening: To avoid weakening of the cables over time they are now run through a cable relief.
    • Improved sensors: To make the flight performance better we’ve switched out the IMU and pressure sensor. The new Crazyflie uses the drone-specialized sensor combo BMI088 and BMP388 from Bosch Sensortec.
  • Flow deck v2: The Flow deck has been upgraded with the new ST VL53L1x which increases the range up to 4 meters
  • Z-ranger deck v2: The Z-ranger deck has been upgraded with the new ST VL53L1x which increases the range up to 4 meters
  • Multi-ranger deck: Finally the Multi-ranger deck is currently in production and will be available during the fall!
  • Mocap deck: The motion capture deck with support for easily attaching markers
  • “Roadrunner” (alpha): With TDoA3 included in the next firmware release, we’re happy to release one of our LPS tags, code-named “Roadrunner”. The hardware is basically a Crazyflie 2.1 without motors and with up to 12V input power.

In the upcoming weeks we’ll post more details about the products and when they will be available, so stay tuned!

We should also mention that we will be showing off some awesome prototypes of products that are planned for release next year, among them:

  • “RZR”: The long-awaited Crazyflie + BigQuad stand-alone combo code-named “RZR” is making its way into production and we are aiming to release it in the beginning of 2019. Basically it’s a Crazyflie 2.1 where, instead of motors, you can directly connect ESCs to build bigger quads up to around 0.5 kg.
  • Lighthouse deck: Our current prototype is now flying with both Lighthouse 1.0 and 2.0 and the performance is awesome! This is definitely the next product out the door after the list above and we’re aiming at having it available during the spring.
  • Raspberry Pi Zero power deck: This deck allows you to add a Raspberry Pi Zero to the Crazyflie 2.x and the “RZR”.
  • LPS tag: We’ve shown this tag before but now we’ve updated it to use the Crazyflie 2.1 IMU and to have proper mounting holes. We’re getting closer to release and this will hopefully be available during the spring.

During IROS this week we will be showing off all the products above (including the prototypes). So if you want to be one of the first to check them out drop by our booth nr 91.

We are working hard in the Bitcraze team to prepare and get ready for IROS 2018 in Madrid next week. As usual, preparing for fairs and exhibitions makes us add useful features and functionality that we might not have planned to implement but that we find useful or need. Even though some of it might be a bit hackish, most of it will add value to the project and will hopefully be useful to the community. Notable functionality that we are working on this time:

  • design for a 3D-printable charging pad
  • basic support for the experimental Light House deck
  • support for the high level commander in the python lib
  • “app” for autonomous flying running in the Crazyflie

Charging pads

The plan is to fly a small Crazyswarm with 6 Crazyflies using a motion capture system from Qualisys. Since we want to spend as much time as possible talking to people and minimize setup time, we were looking for a solution to automatically recharge the batteries between flights. We are planning to use Qi-charger decks for contactless charging together with 3D-printed landing pads with slopes, which make the Crazyflies slide into the correct charging position even if they land a few millimetres off.

The Light House deck

Even though the Lighthouse deck hardware is still very much experimental, we have started to add support for it in the Crazyflie firmware. Hopefully we will be able to run our demos using either the LPS or the Lighthouse to show the difference in performance.

Support for the high level commander in the python lib

The high-level commander was contributed by Wolfgang Hoenig and James Alan Preiss (thanks!) and has been available in the Crazyflie firmware for a while. In an environment with positioning support it provides high-level commands such as “take off” and “go to”, as well as flying user-defined trajectories, and it is used by Crazyswarm. We wanted to use the same functionality in our demo but run it stand-alone in the firmware. The easiest way to get acquainted with the functionality was to play with it from Python, and as a side effect we implemented the API in the python lib for anyone to use; a minimal sketch is shown below. There is also an example script called autonomous_sequence_high_level.py in the examples directory.
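A minimal sketch of that API (take off, go to a setpoint, land) is shown below; see autonomous_sequence_high_level.py for the full version including trajectory upload. The radio URI is a placeholder, and at the time of writing the high-level commander had to be enabled through the commander.enHighLevel parameter:

```python
import time

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M'   # adjust to your Crazyflie

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    cf = scf.cf
    cf.param.set_value('commander.enHighLevel', '1')   # enable the high-level commander
    commander = cf.high_level_commander

    commander.takeoff(0.5, 2.0)                  # target height [m], duration [s]
    time.sleep(2.0)
    commander.go_to(0.5, 0.0, 0.5, 0.0, 3.0)     # x, y, z, yaw, duration
    time.sleep(3.0)
    commander.land(0.0, 2.0)
    time.sleep(2.0)
    commander.stop()
```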

App for autonomous flight

For ICRA last year we wrote code in the Crazyflie firmware to fly trajectories autonomously. At that point we simply fed setpoints to the PID controller to make the Crazyflie follow a preprogrammed path. Now we have more tools in the Crazyflie toolbox (the high-level commander and the Mellinger controller), and by using them we have reduced the amount of code needed and the complexity of the solution while improving performance (code on Github).

As mentioned in an earlier post, this year we are going to exhibit at IROS 2018 in Madrid. Every time we go to fairs and exhibitions, it is an occasion for us to work more on integration and to put together the latest developments into a demo we can show at the event. One of the latest developments we will show at IROS is the Lighthouse deck.

Work on the Lighthouse deck has continued during the summer and we are now at a stage where things are starting to work quite well with Lighthouse V1 base stations. We are quite impressed by the performance: we have measured positioning noise below 1 mm. We are flying the Crazyflie using Crazyswarm, which allows us to fly smooth trajectories using the high-level commander:

The goal for IROS is to stabilize the code and push it into the main Crazyflie firmware repos. We will have a couple of Crazyflies set up with the Lighthouse deck that we will be able to demonstrate. In the future we are also thinking of making a general-purpose tag that could be used with other robots. One of the great advantages of the Lighthouse tracking technology is that the position and orientation are available in the receiver, i.e. in the robot. This means that, like with the LPS, the robots are autonomous and do not require an active data connection with a computer in order to locate themselves.

There are still a lot of challenges and work to be done on the deck. For one, we are currently using the HTC Vive Lighthouse base station V1; Valve has released the base station V2, which allows covering much more space per base station and using more than 2 base stations in the same system, and we plan to implement support for it. We will also need to work on multi-sensor localization and the setup procedure. Currently the Crazyflie calculates its orientation using only one Lighthouse receiver and requires direct line of sight to both base stations; by using more receivers it is possible to get a position and orientation with only one base station in sight, which will increase the system’s reliability. As for the system setup, we are still using SteamVR to obtain the base station positions using at least one Vive controller; the goal is eventually to be able to set up a system with the Crazyflie alone, without needing to install SteamVR. All of that will most likely be discussed in more detail in future posts.

If you are attending IROS 2018 feel free to come and meet us at booth #91.