Category: Crazyflie

We are excited to announce that we will be having developer meetings on the first Wednesday of every month! Additionally, we are thrilled to be present in person at ICRA 2023 in London. During the same conference, there will be a half-day workshop called ‘The Role of Robotics Simulators for Unmanned Aerial Vehicles’, so make sure to sign up! Please check out our newly updated event page!

Monthly Developer meetings

We have had some online developer meetings in the past, covering various topics. While these meetings may not have been the most popular, we believe it is crucial to maintain communication with the community, have interesting discussions and exchange ideas. However, we used to plan them ad hoc, with no regularity, which sometimes caused some of us **cough** especially me **cough** to get confused about the timing and location. To remove these factors of templexia (dyslexia for time), we will simply have them on the first Wednesday of every month.

So our first one will be on Wednesday the 5th of April at 15:00 CEST, and the information about each particular developer meeting will be, as usual, on discussions. From 15:00 to 15:30 there will be a general discussion, probably with a short presentation, about a topic to be determined. From 15:30 to 16:00 we will address regular support questions from anybody who needs help with their work on the Crazyflie.

ICRA 2023 London

ICRA will be held in London this year, from May 29th to June 2nd, at the ExCeL venue. We will be located at booth H11 in the exhibitor hall, and as the date approaches, we will share more about what awesome prototypes we will showcase and what we will demonstrate on-site. Rest assured, plenty of Crazyflies will be flown in the cage! To get an idea of what we demoed last year at IROS Kyoto, please check out the IROS 2022 event page. Matej from Flapper Drones will join us at our booth to showcase the Flapper drone.

We are thinking of organizing a meetup for participants on the Wednesday after the Conference Dinner, so we will share the details of that in the near future as well. Also keep an eye on our ICRA 2023 event page for updated information.

ICRA Aerial Robotic Simulation Workshop

I (Kimberly) will also be present at the ‘The Role of Robotics Simulators for Unmanned Aerial Vehicles’ workshop on Friday June 2nd. Together with Giuseppe Silano, Chiara Gabellieri and Wolfgang Hönig, we will be organizing a half-day workshop focused on UAV-specific simulation in robotics. We have invited some great speakers, namely Tomáš Báča, Davide Scaramuzza, Angela Schoellig and Jaeyoung Lim. The topics will range from multi-UAV simulation to realistic vision-based rendering and software-in-the-loop handling for PX4.

Additionally, participants can submit an extended abstract to be invited for a poster presentation during the same workshop. The submission deadline has been extended to April 3rd; for more information about the submission process, schedule and speakers, go to the workshop’s website.

This week’s guest blogpost is from Florian Goralsky from Bok o Bok about their dance piece with multiple Crazyflies. Enjoy!

Flying bodies across the fields is a contemporary dance piece for four performers and a swarm of drones, exploring the phenomenon of the disappearance of bees and the use of pollinating drones to compensate for this loss. The piece attempts to answer this crucial question in a poetical way: can the machine create life and save us from ecological disaster?

Novembre Numérique à l’IFCI © M studio

We’re super excited to talk about a performance that we’ve been working on for the past two years in collaboration with Bitcraze. It premiered at the Environmental Forum, Centre Pompidou Paris, in 2021, and we’ve had the opportunity to showcase it at different venues since then. We are happy to share our thoughts about it!

Choreographic research

Beyond symbolizing current attempts to use drones to pollinate fields, the presence of the Crazyflie drones supports the back and forth between nature and technology. We integrate a swarm performing complex choreographies that refer to the functioning of a beehive, including the famous “bee dance”, discovered by Karl von Frisch, which bees use to transmit information about food sources. Far from having a spectacular performance as its only goal, the synchronization of autonomous drones highlights bio-inspired computing techniques focused on collective intelligence.

© bok o bok

Challenges within a dance performance

Making a dance performance with drones requires high accuracy and adaptability, both before and during the show. Usually, we only have a few hours, sometimes even a few minutes, to set up the system according to the space. We quickly realized we needed pre-recorded choreographies, as well as hybrid choreographies where the pilot has a few degrees of freedom on top of pre-defined behaviors.

GUI Editor + Python Server

Taking this into account, we developed a web GUI editor that can send choreographies created on any device to a WebSocket Python server. The system supports any absolute positioning system (we use the Lighthouse) and converts all setpoints and actions into calls to the HighLevelCommander class of the Crazyflie API. This setup allows us to create, update, and test complex choreographies in a few minutes on various devices.

Preview position of six drones at a certain time.
Early support of the CompressedTrajectories format, with cubic Bézier curves.
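To give an idea of the receiving end, here is a minimal sketch of the server-side principle (not our production code – the JSON message format and the URI are invented for illustration): incoming actions from the editor are translated into HighLevelCommander calls via cflib.

# A minimal sketch of the server-side principle (not our production code).
# The JSON message format and the URI are assumptions for illustration.
import asyncio
import json

import websockets  # pip install websockets (version 11+, single-argument handlers)

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M/E7E7E7E7E7'

async def main():
    cflib.crtp.init_drivers()
    with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
        hlc = scf.cf.high_level_commander

        async def handle(websocket):
            async for message in websocket:
                cmd = json.loads(message)
                if cmd['action'] == 'takeoff':
                    hlc.takeoff(cmd['z'], cmd['duration'])
                elif cmd['action'] == 'goto':
                    # Absolute setpoint coming from the editor timeline
                    hlc.go_to(cmd['x'], cmd['y'], cmd['z'], cmd['yaw'], cmd['duration'])
                elif cmd['action'] == 'land':
                    hlc.land(0.0, cmd['duration'])

        async with websockets.serve(handle, 'localhost', 8765):
            await asyncio.Future()  # serve forever

if __name__ == '__main__':
    asyncio.run(main())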

What is next?

We are looking forward to developing more dancer–drone interactions in the future. In addition to the Lighthouse system, this will involve other sensors, in order to open up new possibilities: realtime path-finding, obstacle avoidance even during a recorded choreography (to allow improvisation), etc.

Novembre Numérique à l’IFCI © M studio

We’re happy to announce that the 2023.02 release is available for download!

The main new features of this release are:

Out of tree controllers

We have made it easier to add a new controller to the firmware in the Crazyflie. Controllers can now be added in an app, the same way as an estimator can be added. The main advantage is that all the code is contained in the app which makes it easy to upgrade the underlying firmware when new releases are available. You can read about how to use this feature in the firmware repository documentation.

Support to configure ESCs with BLHeli Configurator

On brushless Crazyflies, ESCs can now be configured using the BLHeli Configurator. See PR #1170

A UKF (Unscented Kalman Filter) state estimator has been added

An Unscented Kalman Filter (UKF) state estimator has been added, contributed by Klaus Kefferpütz and based on the paper ‘Error-State Unscented Kalman-Filter for UAV Indoor Navigation’. The estimator is still slightly experimental and does not yet support all positioning methods (see this issue). Because of this, it is not enabled by default, but you can try it by enabling it using kbuild! You can read about the UKF estimator in the repository documentation.
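If you want to try it, the idea is to enable the corresponding Kconfig option when building the firmware, for example through make menuconfig or by adding a line to your build config. A sketch of what that line would look like (the symbol name is assumed here – check the repository documentation for the exact option):

CONFIG_ESTIMATOR_UKF=y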

Platform filter in client flash dialog

A filter has been added to the bootloader dialog in the client to make it easier to find the correct release. Releases are now filtered based on platform to avoid the clutter of mixing releases for cf2, tag, bolt and flapper.

Stability and bug fixes

We have fixed several bugs in the firmware and client software; please check the release notes for each of them for further details.

Release details

The following has been released:

Deprecation policy

We have created a simple deprecation policy to clarify future changes to the APIs. The short version is that, from now on, we will mention deprecated functionality in the release notes, and deprecated functionality will remain in the code base for 6 months before it is removed. Please see the development overview for more information.

It’s time for a new compilation video about how the Crazyflie is used in research! The last one already featured a lot of awesome work, but a lot has happened since then, both in research and at Bitcraze.

As usual, the hardest part of making these videos is choosing the work we want to feature – if every cool video of the Crazyflie were in there, it would last for hours! So it’s just a selection of the most videogenic projects we’ve seen. You can find a more extensive list of our products used in research here.

We’ve seen a lot of projects that used the modularity of the Crazyflie to create awesome new features, like a catenary robot, some wall tracking, or having it land upside down. The Crazyflie board was even made into a revolving-wing drone. New sensors were used to sniff out gas leaks (the Sniffy bug, as seen in this blogpost) or to allow autonomous navigation. Swarms are also a research topic where we see a lot of the Crazyflie, this time for collision avoidance or path planning. We also see more and more simulators, which are used for huge swarms or physics tests.

Once again, we were surprised and awed by all the awesome things that the community did with the Crazyflie. Hopefully, this will inspire others to think of new things to do as well. We hope that we can continue with helping you to make your ideas fly, and don’t hesitate to share with us the awesome projects you’re working on!

Here is a list of all the research that has been included in the video:

And, without further ado, here it is:

A common task with the Crazyflie is to add a new controller or estimator. As we get some questions on how to do this, we will outline the process in this post. We will show how to add a custom controller and estimator that runs in the Crazyflie, built as an out-of-tree build.

This post assumes some basic knowledge about the Crazyflie firmware, the C programming language, how to build the firmware and flash it to a Crazyflie. If you need some more information on these topics, please see the “Getting started with development” tutorial. For an overview of how estimators and controllers are used by the stabilizer module, please see the firmware documentation.

Overview

The Crazyflie firmware is designed to make it easy to add custom controllers and estimators; a plugin system keeps the code clean and well separated. We will look at the details later, but the basic principle is to first write your new controller or estimator and then register it in the firmware. When the code has been compiled and flashed to the Crazyflie, the new module is activated by setting a parameter from the client or a Python script.

We will implement the example as an app, which is a great way to make sure you can upgrade the underlying firmware without messing up your code. An app is a piece of code that exists somewhere in your file system, outside of the main firmware source code. This setup minimizes the dependencies, and the main firmware source tree can be upgraded without affecting your app (in most cases). That means there is no need for merges or complex management of source trees.

Registration of modules

Let’s first look at how controllers and estimators are registered and called in the plugin framework. We will use the controllers to show how it works; the estimators are implemented in a similar way, so it should be easy to see how they work as well.

Note that there have been some updates to the Crazyflie firmware source code lately, and any references to the source code will be to the latest version (as of today).

The starting point of the controller implementation can be found in the src/modules/src/controller.c file, where we can find an array called controllerFunctions that holds a list of all the controllers in the system.

static ControllerFcns controllerFunctions[] = {
  {.init = 0, .test = 0, .update = 0, .name = "None"}, // Any
  {.init = controllerPidInit, .test = controllerPidTest, .update = controllerPid, .name = "PID"},
  {.init = controllerMellingerFirmwareInit, .test = controllerMellingerFirmwareTest, .update = controllerMellingerFirmware, .name = "Mellinger"},
  {.init = controllerINDIInit, .test = controllerINDITest, .update = controllerINDI, .name = "INDI"},
  {.init = controllerBrescianiniInit, .test = controllerBrescianiniTest, .update = controllerBrescianini, .name = "Brescianini"},
  #ifdef CONFIG_CONTROLLER_OOT
  {.init = controllerOutOfTreeInit, .test = controllerOutOfTreeTest, .update = controllerOutOfTree, .name = "OutOfTree"},
  #endif
};

We can see that there are currently four controllers in the list: the PID controller, the Mellinger controller, the INDI controller and finally the Brescianini controller. There is also an “empty” controller at the top that is not important in this context, so we will simply ignore it. At the bottom we find the out-of-tree controller, which we will discuss later.

Each controller must implement three functions: an initialization function, a test function and a controller function that performs the actual controller work. Signatures for the three functions are defined in controller.c. The functions are added to the list as function pointers that can be called by the stabilizer when needed.

There is a parameter, stabilizer.controller, that tells the system which controller to use in the stabilizer loop. This parameter simply contains the index in the controllerFunctions list of the controller to use. For example, the default value 1 will make the stabilizer loop call the controllerPid function every iteration. If the value of the stabilizer.controller parameter is changed, the initialization function for the new controller is called, and subsequent calls from the stabilizer loop go to the new controller function.
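For instance, switching to the Mellinger controller from a Python script is a one-liner on top of the usual cflib connection boilerplate (a minimal sketch – the URI is an assumption):

# Minimal cflib sketch: select the Mellinger controller (index 2 in the
# controllerFunctions list) at runtime. The URI is an assumption.
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M/E7E7E7E7E7'

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    # The firmware calls the new controller's init function before the
    # stabilizer loop starts calling its update function.
    scf.cf.param.set_value('stabilizer.controller', '2')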

We will not go into details of how to implement the actual controller here, but the existing controllers can be used as examples.

There is a similar list/implementation for estimators that can be found in src/modules/src/estimator.c.

Adding a new controller

Suppose you want to add a new controller. It would be possible to add a new file in the Crazyflie firmware with your new controller implementation, add the function pointers to the list in controller.c, and that would work just fine. The problem with such an implementation is that it is hard to maintain: your new files would be mixed with the files in the main firmware file tree, and even worse, you would have to modify the controller.c file to add your controller. The next time there is a new awesome feature in the firmware source code and you want to upgrade to the latest version, you will run into problems, as you have to handle the files you modified!

A better solution is to use an app, as apps are built out of tree, that is, not in the main source tree. This removes the problem of merging changes in the main source files; all you have to do is pull in the new file tree and recompile.

But how do you register your new controller in the controller list? This is what the last entry in the list of controllers is for:

// ...
#ifdef CONFIG_CONTROLLER_OOT
  {.init = controllerOutOfTreeInit, .test = controllerOutOfTreeTest, .update = controllerOutOfTree, .name = "OutOfTree"},
#endif
// ...

If CONFIG_CONTROLLER_OOT is defined, we add a controller with the three functions controllerOutOfTreeInit, controllerOutOfTreeTest and controllerOutOfTree. All you have to do in your app is to define CONFIG_CONTROLLER_OOT and make sure the functions in your controller are named as above. That’s it!

Example implementation

Now we will create a new app and add a new controller, step by step. We assume that you have a newly cloned firmware repository in your filesystem to work on.

We will show the Linux flavor of the commands, but it should be easy to convert them to other platforms.

Create a new app

The easiest way to get started is to copy an existing app; let’s use the hello world app. Copy the app and move into the new directory:

cp -r examples/app_hello_world examples/my_controller
cd examples/my_controller/

Let’s rename hello_world.c:

mv src/hello_world.c src/my_controller.c

We have to tell kbuild that we renamed the file. Open src/Kbuild in your favorite editor and update it to:

obj-y += my_controller.o

Now let’s fix the basics in my_controller.c; open it in your editor and change it according to the comments below:

#include <string.h>
#include <stdint.h>
#include <stdbool.h>

#include "app.h"

#include "FreeRTOS.h"
#include "task.h"

// Edit the debug name to get nice debug prints
#define DEBUG_MODULE "MYCONTROLLER"
#include "debug.h"


// We still need an appMain() function, but we will not really use it. Just let it quietly sleep.
void appMain() {
  DEBUG_PRINT("Waiting for activation ...\n");

  while(1) {
    vTaskDelay(M2T(2000));

    // Remove the DEBUG_PRINT.
    // DEBUG_PRINT("Hello World!\n");
  }
}

Now, let’s add our new controller. We will not add a real implementation here, as it would be a bit too large for this post; instead we will just call into the PID controller to make sure the Crazyflie can still fly. Add this code after appMain() in my_controller.c:

// The new controller goes here --------------------------------------------
// Move the includes to the top of the file if you want to
#include "controller.h"

// Call the PID controller in this example to make it possible to fly. When you implement your own
// controller, there is no need to include the PID controller.
#include "controller_pid.h"

void controllerOutOfTreeInit() {
  // Initialize your controller data here...

  // Call the PID controller instead in this example to make it possible to fly
  controllerPidInit();
}

bool controllerOutOfTreeTest() {
  // Always return true
  return true;
}

void controllerOutOfTree(control_t *control, const setpoint_t *setpoint, const sensorData_t *sensors, const state_t *state, const uint32_t tick) {
  // Implement your controller here...

  // Call the PID controller instead in this example to make it possible to fly
  controllerPid(control, setpoint, sensors, state, tick);
}

Finally, we need to tell the firmware that we have implemented the out-of-tree controller and that it should be added to the list. We do this by adding CONFIG_CONTROLLER_OOT to the app-config file. When you are done, it should look like this:

CONFIG_APP_ENABLE=y
CONFIG_APP_PRIORITY=1
CONFIG_APP_STACKSIZE=350
CONFIG_CONTROLLER_OOT=y

Testing it!

Build and flash the firmware to your Crazyflie:

make -j8
make cload

Start your Crazyflie and the Python client. Connect the client to the Crazyflie and open the console log tab. Make sure your app is running by looking for the line:

MYCONTROLLER: Waiting for activation ...

Now let’s activate our new controller! Open the parameter tab, find the stabilizer group and the controller parameter. Set it to 5 and check in the console log that the out-of-tree controller was activated:

CONTROLLER: Using OutOfTree (5) controller

That’s it! Your new controller is activated and the Crazyflie is ready to fly.

Note: in the client, the description of the stabilizer.controller parameter will not mention the out-of-tree controller, and it will look like only values 0-4 are valid, even though 5 also works.

Conclusions

In this post we have shown how to add a new controller to the Crazyflie firmware. The process for adding an estimator is very similar, and hopefully it should be easy to understand how to do it based on the example above.

As you can see, very little code (apart from the actual controller/estimator) is required to add your own controller or estimator, and we hope that it will enable you to put your energy into the actual control problem rather than the nitty-gritty details of the code.

Happy coding!

My name is Hanna, and I just started as an intern at Bitcraze. However, it is not my first time working with a drone or even the Crazyflie, so I’ll tell you a bit about how I ended up here.

The first time I used a drone, actually even a Crazyflie, was in a semester thesis at ETH Zurich in 2017, where my task was to extend a Crazyflie with a Parallel Ultra Low-Power (PULP) System-on-Chip (SoC) connected to a camera and external memory. This was the first prototype of the AI-deck you can buy here nowadays (as used here) :)

My next drone adventure was an internship at a company building tethered drones for firefighters – a much bigger system than the Crazyflie. I was in charge of the update system, so more on the firmware side this time. It was a very interesting experience, but I swore never to build a system with more than three microcontrollers in it again.

This, and a liking for tiny and restricted embedded systems, brought me back to smaller drones again. I did my master’s thesis back at ETH about developing a PULP-based nano-drone (nano-drones are just tiny drones that fit approximately in the palm of your hand and use only around 10 watts of power – the category Crazyflies fit in) and some onboard intelligence for it. As a starting point, we used the Crazyflie, both for the hardware and the software. It turned out to be a very hard task to port the firmware to a processor with only a very basic operating system at that time. Still, eventually I knew almost every last detail of the Crazyflie firmware, and it actually flew.

However, for this to happen, I needed some more time than the master’s thesis allowed – in the meantime, I started to pursue a PhD at ETH Zurich. I am working towards autonomous miniaturized drones – so besides the part with the tiny PULP-based drone I already told you about, I also work on the “autonomous” part. Contrary to many other labs, our focus is not only on novel algorithms; we also work with novel sensors and processors. Two very interesting recent developments for us are a multi-zone Time-of-Flight sensor and the novel GAP9 processor, both of which fit on a Crazyflie in terms of power, size and weight. This enables new possibilities in obstacle avoidance, localization, mapping and much more. Last year my colleagues and I already posted a blog post about our newest advances in obstacle avoidance (here, with videos!). More recently, we worked on onboard localization, using novel multi-zone Time-of-Flight sensors and the very new GAP9 processor to execute Monte Carlo localization onboard a Crazyflie (arxiv).

On the left you see an example of a multi-zone Time-of-Flight image (the background is a picture from the AI-deck), from here. On the right you see our prototype for localization in action – from our DATE23 paper (arxiv).

For me, localizing with a given map is a fascinating topic and one of the reasons I ended up in Sweden. It is one of the most basic skills of robots or even humans to navigate from A to B as fast as possible, and the basis of my favourite sport. The sport is called “orienteering” and is about running as fast as possible to some checkpoints on a map, usually through a forest. It is a very common sport in Sweden, which is the reason I started learning Swedish some years ago. So when the opportunity to go to Malmö for some months to join Bitcraze presented itself, I was happy to take it – not only because I like the company philosophy, but also because I just like to run around in Swedish forests :)

Now I am looking forward to my time here; I hope to learn lots about drones, firmware, new sensors, production, testing and company organization, and to meet a lot of new nice people!

Greetings from Malmö – it can be a bit cold and rainy, but the sea and landscape are beautiful!

Hanna

This week’s guest blogpost is from Frederike Dümbgen, presenting the latest work from her PhD project at the Laboratory of Audiovisual Communications (LCAV), EPFL. She is currently a Postdoc at the University of Toronto. Enjoy!

Bats navigate using sound. As a matter of fact, the ears of a bat are so much better developed than their eyes that bats cope better with being blindfolded than with their ears being covered. It was precisely this experiment that helped the discovery of echolocation, which is the principle bats use to navigate [1]. Broadly speaking, in echolocation, bats emit ultrasonic chirps and listen for their echoes to perceive their surroundings. Since its discovery in the 18th century, astonishing facts about this navigation system have been revealed – for instance, bats vary their chirps depending on the task at hand: a chirp that’s good for locating prey might not be good for detecting obstacles and vice versa [2]. Depending on the characteristics of their reflected echoes, bats can even classify certain objects – this ability helps them find, for instance, water sources [3]. Wouldn’t it be amazing to harness these findings in building novel navigation systems for autonomous agents such as drones or cars?

Figure 1: Meet “Crazybat”: the Crazyflie equipped with our custom audio deck including 4 microphones, a buzzer, and a microcontroller. Together, they can be used for bat-like echolocation. The design files and firmware of the audio extension deck are openly available, as is a ROS2-based software stack for audio-based navigation. We hope that fellow researchers can use this as a starting point for further pushing the limits of audio-based navigation in robotics. More details can be found in [4].

The quest for the answer to this question led us — a group of researchers from the École Polytechnique Fédérale de Lausanne (EPFL) — to design the first audio extension deck for the Crazyflie drone, effectively turning it into a “Crazybat” (Figure 1). The Crazybat has four microphones, a simple piezo buzzer, and an additional microprocessor used to extract relevant information from audio data, to be sent to the main processor. All of these additional capabilities are provided by the audio extension deck, for which both the firmware and hardware design files are openly available.1

Video 1: Proof of concept of distance/angle estimation in a semi-static setup. The drone is moved using a stepper motor. More details can be found in [4].

In our paper on the system [4], we show how to use chirps to detect nearby obstacles such as glass walls. Difficult to detect using lasers or cameras, glass walls are excellent sound reflectors and thus a good candidate for audio-based navigation. We show in a first semi-static feasibility study that we can locate a glass wall with centimeter accuracy, even in the presence of loud propeller noise (Video 1). When moving to a flying drone and different kinds of reflectors, the problem becomes significantly more challenging: motion jitter, varying propeller noise and tight real-time constraints make the problem much harder to solve. Nevertheless, first experiments suggest that sound-based wall detection and avoidance is possible (Figure 2 and Video 2).

Video 2: The “Crazybat” drone actively avoiding obstacles based on sound.
Figure 2: Qualitative results of sound-based wall localization on the flying “Crazybat” drone. More details can be found in [4].

The principle we use to make this work is sound interference: the sound “bounces off” the wall, and the reflected and direct sound interfere either constructively or destructively, depending on the frequency and the distance to the wall. Using this same principle for the four microphones, both the angle and the distance of the closest wall can be estimated. This is however not the only way to navigate using sound; in fact, our software stack, available as an open-source package for ROS2, also allows the Crazybat to extract the phase differences of incoming sound at the four microphones, which can be used to determine the location of an external sound source. We believe that a truly intelligent Crazybat would be able to switch between different operating modes depending on the conditions, just like bats change their chirps depending on the task at hand.
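In simplified form (see [4] for the full treatment), this is the classic two-path interference model. With d the distance to the wall, Δ ≈ 2d the extra path length of the echo, γ the reflection strength, c the speed of sound and f the chirp frequency, the received magnitude behaves as

% Two-path interference sketch: direct sound plus one wall echo.
|H(f)|^2 = \left| 1 + \gamma \, e^{-j 2\pi f \Delta / c} \right|^2 = 1 + \gamma^2 + 2\gamma \cos\!\left( \frac{2\pi f \Delta}{c} \right)

Sweeping the chirp frequency traces out this cosine, and the spacing c/Δ between its maxima reveals Δ and hence the distance to the wall; comparing across the four microphones adds the angle.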

Note that the ROS2 software stack is not limited to the Crazybat only — we have isolated the hardware-dependent components so that the audio-based navigation algorithms can be ported to any platform. As an example, we include results on the small wheeled e-puck2 robot in [4], which shows better performance than the Crazybat thanks to the absence of propeller noise and motion jitter.

This research project has taught us many things, above all an even greater admiration for the abilities of bats! Dealing with sound is pretty hard and very different from other prevalent sensing modalities such as cameras or lasers. Nevertheless, we believe it is an interesting alternative for scenarios with poor eyesight, limited computing power or memory. We hope that other researchers will join us in the quest of exploiting audio for navigation, and we hope that the tools that we make publicly available — both the hardware and software stack — lower the entry barrier for new researchers. 

1 The audio extension deck works in a “plug-and-play” fashion like all other extension decks of the Crazyflie. It has been tested in combination with the Flow deck, for stable flight in the absence of a more advanced localization system. The deck performs frequency analysis on the incoming raw audio data from the 4 microphones and sends the relevant information over to the Crazyflie drone, where it is converted to the CRTP protocol by a custom driver and sent to the base station for further processing in the ROS2 stack.

References

[1] Galambos, Robert. “The Avoidance of Obstacles by Flying Bats: Spallanzani’s Ideas (1794) and Later Theories.” Isis 34, no. 2 (1942): 132–40. https://doi.org/10.1086/347764.

[2] Fenton, M. Brock, Alan D. Grinnell, Arthur N. Popper, and Richard R. Fay, eds. “Bat Bioacoustics.” In Springer Handbook of Auditory Research, 1992. https://doi.org/10.1007/978-1-4939-3527-7.

[3] Greif, Stefan, and Björn M Siemers. “Innate Recognition of Water Bodies in Echolocating Bats.” Nature Communications 1, no. 106 (2010): 1–6. https://doi.org/10.1038/ncomms1110.

[4] F. Dümbgen, A. Hoffet, M. Kolundžija, A. Scholefield and M. Vetterli, “Blind as a Bat: Audible Echolocation on Small Robots,” in IEEE Robotics and Automation Letters (Early Access), 2022. https://doi.org/10.1109/LRA.2022.3194669.

This year, the traditional Christmas video was overtaken by a big project that we had at the end of November: creating a test show with the help of CollMot.

First, a little context: CollMot is a show company based in Hungary that we’ve partnered with on a regular basis, brainstorming about show drones and discussing possibilities for indoor drone shows in general. They developed Skybrush, an open-source software suite for controlling swarms. We have wanted to work with them for a long time.

So, when the opportunity came to rent an old train hall that we visit often (because it’s right next to our office and hosts good street food), we jumped on it. The place itself is huge, with massive pillars, pits for train maintenance, a high ceiling with metal beams and a really funky industrial look. The idea was to do a technology test and try out whether we could scale up the Loco Positioning system to a larger space. This was also the perfect time to invite the guys at CollMot for some exploring and hacking.

The train hall

The Loco system

We added the TDoA3 Long Range mode recently, and we had done experiments in our test lab indicating that the Loco Positioning system should work in a bigger space with up to 20 anchors, but we had not actually tested it in a larger space.

The maximum radio range between anchors is probably up to around 40 meters in the Long Range mode, but we decided to set up a system that was only around 25×25 meters, with 9 anchors in the ceiling and 9 anchors on the floor, placed in 3×3 matrices. The reason we did not go bigger is that the height of the space is around 7-8 meters, and we did not want to end up with a system that is too wide in relation to its height, as this would reduce Z accuracy. This setup gave us 4 cells of 12×12×7 meters, which should be OK.

Finding a solution to get the anchors up to the 8-meter ceiling – and to get them down easily – was also a head-scratcher, but with some ingenuity (and meat hooks!) we managed to create a system. We only had the hall for 2 days before filming at night, and setting up the anchors on the ceiling took a big chunk out of the first day.

Drone hardware

We used 20 Crazyflie 2.1s equipped with the Loco deck, LED rings, the thrust upgrade kit and Tattu 350 mAh batteries. We soldered the pin headers to the Loco decks for better rigidity, but also because it adds a bit more height adjustability for the 350 mAh battery, which is a bit thicker than the stock battery. To make the LED ring more visible from the sides, we created a diffuser that we 3D-printed in white PLA. The full assembly weighed in at 41 grams. With the LED ring lit up almost all of the time, we concluded that the show flight should not be longer than 3-4 minutes (with some flight-time margin).

The show

CollMot, on their end, designed the whole show using Skyscript and Skybrush Studio. The aim was to have relatively simple and easily changeable formations, to be able to test a lot of different things, like the large area, speed and synchronicity. They joined us on the second day to implement the choreography and share their knowledge about drone shows.

We got some time afterwards to discuss a lot of things, and enjoy some nice beers and dinner after a job well done. We even had time on the third day, before dismantling everything, to experiment a lot more in this huge space and got some interesting data.

What did we learn?

Initially we had problems with positioning: we got outliers and sometimes lost tracking. Eventually we managed to trace the problems to the outlier filter. The filter was written a long time ago, and the current implementation was optimized for 8 anchors in a smaller space, which did not really work in this setup. After some tweaking the problem was solved, but we need to improve the filter for generic support of different system setups in the future.

Another problem we observed is that the Z estimate tends to get an offset that “sticks” and is not corrected over time. We do not really understand this yet, and it will require more investigation.

The outlier filter was the only major problem we had to solve; otherwise the Loco system mainly performed as expected, and we are very happy with the result! The changes to the firmware are available in this slightly hackish branch.

We also spent some time testing maximum velocities. For horizontal velocities, the Crazyflies started losing positioning above 3 m/s. They could probably go much faster, but the outlier filter started having problems at higher speeds. Also, the overshoot became larger the faster we flew, which most likely could be solved with better controller tuning. For vertical velocity, 3 m/s was also the maximum, limited by the deceleration when coming down. Some improvements can be made here.

The conclusion is that many things work really well, but there are still some optimizations and improvements that could be made to make the system even more robust and accurate.

The video

But, enough talking: here is the never-seen-before New Year’s Eve video!

And if you’re curious to see behind the scenes:

Thanks to CollMot for their presence and valuable expertise, and InDiscourse for arranging the video!

And with the final blogpost of 2022 and this amazing video, it’s time to wish you a nice New Year’s Eve and a happy beginning of 2023!

Santa is soon to be knocking on the door, hopefully with one or two exciting toys (with blinking LEDs) for us geeky people! There will not be a Christmas video in the Bitcraze gift this year, instead we’re wrapping up a new release that we hope will add to the Christmas fun!

We have been working on a secret project though and there might be a video for next week’s blog post showing what we have been up to…

The 2022.12 release

We are happy to announce that a new official release is out, 2022.12! We have mainly fixed bugs and stability issues, but we have also added some new features; please see the details below.

Crazyflie STM firmware (2022.12)

One of the main events in this release is that the Flapper Nimble+ has gotten official support with the flapper platform; it can now be flashed through the client like any other member of the Crazyflie family. A new controller, based on work by Brescianini, has been added. The Kalman estimator and the Lighthouse system have been tweaked to work better with the increased data volumes generated with 2+ base stations. Some improvements for brushless motors have been added. Finally, there have been some general bug and stability fixes, including improvements to the flashing of the AI-deck.

Please see the release notes for a list of all changes.

Crazyflie NRF firmware (2022.12)

The NRF firmware release mainly contains changes to support the new STM firmware.

Please see the release notes for a list of all changes.

Crazyflie lib python (0.1.21)

A blocking method has been added for uploading trajectories to the high-level commander, so the various Uploader classes in the examples are not needed anymore. There are also stability and bug fixes related to deck flashing.

Please see the release notes for a list of all changes.

Crazyflie python Client (2022.12)

A button has been added in the console log tab to get statistics about the persistent storage in the Crazyflie. The final traces of the Windows and Mac builds have been cleaned out, and some stability and bug fixes have been applied.

Please see the release notes for a list of all changes.

Hey, Victor here!

As some of you may know, I’ve worked at Bitcraze for two summers (2019, 2020), and I did my Bachelor’s thesis here during the spring this year. While we briefly mentioned that I had started working on my thesis (here), I never presented the results of it, so I thought that I’d do that now! Better late than never, right?

So, during my thesis I built a prototype deck for the Crazyflie which contained five multizone lidar sensors (VL53L5CX) and an ESP32-S3. The VL53L5CX sensors can output distances to an 8×8 grid, with a 45-degree FoV, at a rate of 15 Hz. The purpose of the ESP32-S3 was to collect the data from the sensors and send it to a ground control station, either over WiFi or through the nRF radio on the Crazyflie. While the ESP32-S3 is quite overkill for only collecting and forwarding data, we weren’t sure how much data would be gathered from the sensors, so to be on the safe side we rolled with the ESP32-S3. Both the sensors and the microcontroller were very new at the time, so it seemed like a good opportunity to try them out.
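In hindsight, a rough back-of-the-envelope calculation (assuming 16-bit distances and ignoring per-zone status bytes) shows that the raw data rate is actually quite modest:

# Rough estimate of the raw sensor bandwidth the ESP32-S3 has to forward,
# assuming 16-bit distances and ignoring per-zone status bytes.
sensors = 5         # VL53L5CX units on the deck
zones = 8 * 8       # distances per frame
bytes_per_zone = 2  # 16-bit distance
rate_hz = 15        # frames per second

print(sensors * zones * bytes_per_zone * rate_hz, 'bytes/s')  # 9600 bytes/s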

I designed the schematic in KiCad and got a lot of help from everyone here at Bitcraze while doing so, especially Tobias. Once the schematic was done, I designed the PCB, ordered the components and then waited eagerly for the stuff to arrive. Once everything had arrived, I soldered all the components and assembled the deck. I then wrote some firmware for the ESP32-S3 and the STM32 on the Crazyflie, and at last I wrote a simple GUI in PyQt to help visualize the data, both in 2D and 3D.

The deck was quite successful, and while the GUI was very far from perfect, I think it did show that the deck has some nice potential, and it was very cool to see the 3D point cloud in realtime while flying the Crazyflie! I tried sending the data over WiFi, which worked perfectly well, and I also tried sending it through the nRF on the Crazyflie with the help of CPX, which also worked pretty well.

If you’re more curious about the thesis, feel free to check it out here, and the github repository can be found here.

I finished the thesis at the beginning of the summer, and I have been working part time here at Bitcraze since September – I’ve truly been loving it! I think it’s been really cool to become a part of the team and work more on the regular stuff that the rest of the team does. It has been very interesting to see how the team works and cooperates on a daily basis. Something that struck me was just how many products and different features and services we handle here, with only six people!

Fortunately and unfortunately, I will be moving to Gothenburg next week which means that my time at Bitcraze is over, for this time. I have learned a lot from everyone here and truly appreciate all the love and support, which actually started before I even started my Bachelor’s degree.

Cheers and (early) Merry Christmas,
Victor