Category: Research

Drones can perform a wide range of interesting tasks, from crop inspection to search-and-rescue. However, to be practically attractive, drones should also be safe and cheap. Drones can be made safer by reducing their size and weight, so that they cause less damage in a collision with people or the environment. Being cheap means that the drones can take more risk – as losing one is less expensive – or that they can be deployed in larger numbers.

To function autonomously, such a drone should at least have some basic navigation capabilities. External position references such as GPS or UWB beacons can provide these, but such a reference is not always available. GPS is not accurate enough in indoor settings, and beacons require prior access to the area of operation and add cost.

Without these references, navigation becomes tricky. The typical solution is to have the drone construct a map of its local environment, which it can then use to determine its position and plan trajectories to important places. But on tiny drones, the on-board computational resources are often too limited to construct such a map. How, then, can these tiny drones navigate? A subquestion of this – how to follow previously traversed routes – was the topic of my MSc thesis under the supervision of Kimberly McGuire and Guido de Croon at TU Delft, and of my subsequent PhD studies. The solution has recently been published in Science Robotics – “Visual route following for tiny autonomous robots” (TU Delft mirror here).

Route following

In an ideal world, route following could be performed entirely by odometry: the measurement and recording of one’s own movements. If a drone measured the distance and direction it traveled, it could simply perform the same movements in reverse and end up at its starting place. In reality, however, this does not entirely work. While current-day movement sensors such as the Flow deck are certainly accurate, they are not perfect. Every measurement includes a small error, and to traverse longer distances, many measurements are summed, which causes the error to grow impractically large. It is this integration of errors that stops drones from using odometry over longer distances.
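
To make the growth concrete: if each odometry increment carries an independent, zero-mean error with standard deviation σ, then the summed position estimate after N increments has an error standard deviation of σ·√N. The uncertainty therefore grows without bound as the route gets longer – a route a hundred times as long is ten times as uncertain – no matter how accurate the individual measurements are.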

The trick to traveling longer distances is to prevent this buildup of errors. To do so, we propose to let the drone perform ‘visual homing’ maneuvers. Visual homing is a control strategy that lets an agent return to a location where it has previously taken a picture, called a ‘snapshot’. To find its way back, the agent compares its current view of the environment to the snapshot that it took earlier. The key property is that the difference between these two images grows smoothly with distance. Conversely, if the agent can find the direction in which this difference decreases, it can follow this direction to converge back to the snapshot’s original location.

The difference between images smoothly increases with their distance.
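
As a concrete (and heavily simplified) illustration of the signal that visual homing descends, here is the kind of pixel-wise image difference that grows smoothly with distance from the snapshot location. This is only a sketch – the article describes the actual image comparison and homing-vector computation used on the drone:

```c
#include <stddef.h>
#include <stdint.h>

/* Sum of absolute differences between the current (panoramic) view and a
 * stored snapshot, both grayscale pixel arrays of equal size. This scalar
 * grows smoothly as the drone moves away from the snapshot location;
 * homing amounts to moving in the direction that makes it decrease. */
uint32_t image_difference(const uint8_t *current, const uint8_t *snapshot,
                          size_t n_pixels) {
    uint32_t sum = 0;
    for (size_t i = 0; i < n_pixels; i++) {
        int d = (int)current[i] - (int)snapshot[i];
        sum += (uint32_t)(d < 0 ? -d : d);
    }
    return sum;
}
```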

So, to perform long-distance route following, we now command the drone to take snapshots along the way, in addition to odometry measurements. Then, when retracing the route, the drone routinely performs visual homing maneuvers to align itself with these snapshots. Because the error after a homing maneuver is bounded, the deviation from the intended path no longer grows! Long-range route following thus becomes possible without excessive drift.

Implementation

The article mentioned above describes the strategy in more detail. Rather than repeat what is already written, I would like to give a bit more detail on how the strategy was implemented, as this is probably more relevant for other Crazyflie users.

The main difference between our drone and an out-of-the-box one is that our drone needs to carry a camera for navigation. And not just any camera: the method under investigation requires a panoramic camera so that the drone can see in all directions. For this, we bought a Kogeto Dot 360. This is a cheap aftermarket lens for an older iPhone that provides exactly the field of view that we need. After a bit of dremeling and taping, it is also suitable for drones.

ARDrone 2 with panoramic camera lens.

The very first visual homing experiments were performed on an ARDrone 2. The drone already had a bottom camera, to which we fitted the lens. Using this setup, the drone could successfully navigate back to the snapshot’s location. However, the ARDrone 2 hardly qualifies as small: it is approximately 50 cm wide, weighs 400 grams, and carries a full Linux computer.

To prove that the navigation method would indeed work on tiny drones, the setup was downsized to a Crazyflie 2.0. While this drone could take off with the camera assembly, it became unstable as soon as the battery level started to decrease: the camera was just a bit too heavy. Another attempt was made on an Eachine Trashcan, heavily modified to support the camera, a Flow deck, and custom autopilot firmware. While this drone had more than enough lift, the overall reliability of the platform never became good enough to perform full flight experiments.

After discussing the above issues, I was very kindly offered a prototype of the Crazyflie Brushless to see if it would help with my experiments. And it did! The Crazyflie Brushless has more lift than the regular platform and could maintain a stable attitude and height while carrying the camera assembly, all with a reasonable flight time. Software-wise it works pretty much the same as the regular Crazyflie, so it was a pleasure to work with. This drone became the one we used for our final experiments, and it was even featured on the cover of the Science Robotics issue.

With the hardware finished, the next step was to implement the software. Unlike the ARDrone 2, which had a full Linux system with reasonable memory and computing power, the Crazyflie only has an STM32 microcontroller that is also tasked with flying the drone (plus an nRF SoC, but that is out of scope here). The camera board developed for this drone features an additional STM32. This microcontroller performed most of the image processing and visual homing tasks at a framerate of a few Hertz. However, the resulting guidance also had to be followed, and this part is more relevant for other Crazyflie users.

To provide custom behavior on the Crazyflie, I used the app layer of the autopilot. The app layer allows users to add custom code to the autopilot while keeping it mostly decoupled from the underlying firmware. The out-of-tree setup makes it easier to keep just the custom code under version control, and also means the app is not tied to a specific firmware version the way an in-tree modification would be.
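
For reference, a minimal app-layer skeleton looks roughly like the following (based on the standard app-layer examples in the crazyflie-firmware repository; the loop body is where the camera polling and guidance logic would go):

```c
#include "app.h"

#include "FreeRTOS.h"
#include "task.h"

#define DEBUG_MODULE "ROUTEAPP"
#include "debug.h"

/* Entry point of an out-of-tree app: the firmware spawns appMain() as its
 * own FreeRTOS task at boot when the app layer is enabled. */
void appMain(void) {
    DEBUG_PRINT("Route-following app started\n");
    while (1) {
        /* Poll the camera link, run the experiment state machine,
         * send setpoints... */
        vTaskDelay(M2T(100)); /* 10 Hz main loop */
    }
}
```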

The custom app performs a small number of crucial tasks. Firstly, it is responsible for communication with the camera. This was done over UART, as this was already implemented in the camera software and the bus was not used for other purposes on the Crazyflie. Over this bus, the autopilot could receive visual guidance from the camera and send basic commands, such as starting and stopping image capture. Pprzlink was used as the UART protocol, a leftover from the earlier ARDrone 2 and Trashcan prototypes.

The second major task of the app is to make the drone follow the visual guidance. This consisted of two parts. Firstly, the drone should be able to follow visual homing vectors. This was achieved using the Commander Framework, part of the Stabilizer Module. Once the custom app was started, it would enter an infinite loop running at 10 Hertz. After takeoff, the app would repeatedly call commanderSetSetpoint to set absolute position targets, which were found by adding the latest homing vector to the current position estimate. The regular autopilot then took care of the low-level control that steered the drone to these coordinates.
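
In code, one homing step looks roughly like this. This is a hedged sketch, not the published implementation; the homing-vector variables (hx, hy, hz) and the priority value are illustrative:

```c
#include <string.h>

#include "commander.h"
#include "stabilizer_types.h"

/* Command an absolute position setpoint: the current position estimate
 * (px, py, pz) plus the latest visual homing vector (hx, hy, hz). */
static void send_homing_setpoint(float px, float py, float pz,
                                 float hx, float hy, float hz, float yaw) {
    setpoint_t sp;
    memset(&sp, 0, sizeof(sp));
    sp.mode.x = modeAbs;
    sp.mode.y = modeAbs;
    sp.mode.z = modeAbs;
    sp.mode.yaw = modeAbs;
    sp.position.x = px + hx;
    sp.position.y = py + hy;
    sp.position.z = pz + hz;
    sp.attitude.yaw = yaw;
    commanderSetSetpoint(&sp, 3); /* priority above the CRTP client link */
}
```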

The core idea of our navigation strategy is that the drone can correct its position estimate after arriving at a snapshot. So secondly, the drone should be able to overwrite its position estimate with the one provided by the route-following algorithm. To simplify the integration with the existing state estimator, this update was implemented as an additional position sensor – similar to an external positioning system. Once the drone had converged to a snapshot, it would enqueue the snapshot’s remembered coordinates as a position measurement with a very small standard deviation, thereby essentially overwriting the position estimate but without needing to modify the estimator. The same trick was also used to correct heading drift.
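
A sketch of this trick, using the firmware’s measurement queue for external positions (the standard-deviation value is illustrative; heading drift was corrected in the same fashion):

```c
#include "estimator.h"

/* After converging on a snapshot, feed the snapshot's remembered
 * coordinates to the state estimator as a position "measurement" with a
 * very small standard deviation, so the Kalman filter snaps to it
 * without any changes to the estimator itself. */
static void reset_position_to_snapshot(float x, float y, float z) {
    positionMeasurement_t pos = {
        .x = x,
        .y = y,
        .z = z,
        .stdDev = 0.001f, /* near-certain: effectively overwrites the estimate */
    };
    estimatorEnqueuePosition(&pos);
}
```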

The final task of the app was to make the drone controllable from a ground station. After some initial experiments, it was determined that fully autonomous flight during the experiments would be the easiest to implement and use. To this end, the drone needed to be able to follow more complex procedures and to communicate with a ground station.

Because the cfclient provides most of the necessary functions, it was used as the basis for the ground station. However, the experiments required extra controls that were of course not part of a generic client. While it was possible to modify the cfclient, an easier solution was offered by its integrated ZMQ server. This server allows external programs to communicate with the stock cfclient over a TCP connection and, among other possibilities, to send control values and parameters to the drone. Since the drone would be flying autonomously, a low update rate would suffice, so the choice was made to let the ground station set parameters exposed by the custom app. For ease of use, a simple GUI was made in Python using the CFZmq library and Tkinter. The GUI would request foreground priority so that it was shown on top of the regular client, making it easy to use both at the same time.
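
The exchange with the client is simple: the external program sends a small JSON request to the cfclient’s ZMQ socket, and the client forwards the parameter change over the radio. The actual GUI was written in Python with CFZmq, but to keep the code examples in this post in one language, here is the same request sketched in C with libzmq. The port number, message schema, and parameter name are assumptions to be checked against the cfclient ZMQ documentation:

```c
#include <stdio.h>
#include <string.h>
#include <zmq.h>

int main(void) {
    void *ctx = zmq_ctx_new();
    void *sock = zmq_socket(ctx, ZMQ_REQ);

    /* Assumed address of the cfclient's ZMQ parameter service. */
    zmq_connect(sock, "tcp://127.0.0.1:1213");

    /* Set a (hypothetical) parameter exposed by the custom app. */
    const char *req =
        "{\"version\": 1, \"cmd\": \"set\", "
        "\"name\": \"routeapp.state\", \"value\": 1}";
    zmq_send(sock, req, strlen(req), 0);

    char reply[256] = {0};
    zmq_recv(sock, reply, sizeof(reply) - 1, 0);
    printf("cfclient replied: %s\n", reply);

    zmq_close(sock);
    zmq_ctx_term(ctx);
    return 0;
}
```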

Cfclient with experimental overlay (bottom right).

Finally, each experiment was implemented as a state machine in the custom app. Using the (High-level) Commander Framework and the navigation routines described above, the drone was able to perform entire experiments from take-off to landing.
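
The pattern is the classic switch-based state machine; a stripped-down sketch, with states and transition conditions invented for illustration, could look like this:

```c
/* Experiment phases (illustrative; each real experiment defined its own). */
typedef enum {
    ST_TAKEOFF,
    ST_RECORD,   /* fly the route, store odometry and snapshots */
    ST_RETRACE,  /* follow the route in reverse with visual homing */
    ST_LAND,
    ST_DONE,
} expState_t;

static expState_t state = ST_TAKEOFF;

/* Called from the 10 Hz app loop; each case performs its phase and
 * advances when its completion condition is met. */
static void experimentStep(void) {
    switch (state) {
    case ST_TAKEOFF:
        /* command a high-level takeoff; once at altitude: */
        state = ST_RECORD;
        break;
    case ST_RECORD:
        /* record odometry and snapshots; at the end of the route: */
        state = ST_RETRACE;
        break;
    case ST_RETRACE:
        /* retrace with dead reckoning plus homing; back at start: */
        state = ST_LAND;
        break;
    case ST_LAND:
        /* command a high-level landing; on the ground: */
        state = ST_DONE;
        break;
    case ST_DONE:
        break;
    }
}
```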

While the code is very far from production quality, it is open source, so you can see how everything was implemented: https://github.com/tomvand/2020-visualhoming-crazyflie. The PCB used to fit Crazyflie decks to the Eachine Trashcan can be found here: https://github.com/tomvand/cf-deck-carrier.

Outcome

Using the hardware and software described above, we were able to perform the route-following experiments. The drone was commanded to fly a preprogrammed trajectory using the Flow deck, while recording odometry and snapshot images. Then, the drone was commanded to follow the same route in reverse, by traveling short sections using dead reckoning and then using visual homing to correct the incurred drift.

As shown in the article, the error with respect to the recorded route remained bounded. Therefore, we can now travel long routes without having to worry about drift, even under strict hardware limitations. This is a great improvement to the autonomy of tiny robots.

I hope that this post has given a bit more insight into the implementation behind this study, a part that is not often highlighted but very interesting for everyone working in this field.

As 2024 comes to an end, it’s the perfect time to reflect on what we’ve accomplished over the past year. A major highlight has been our work on the Crazyflie 2.1 Brushless. We’re thrilled that it will be available early in the new year! While much of our efforts focused on refining and preparing the platform as a whole, we also introduced some standout features like support for contact charging on a charging pad, perfecting the specially optimized motors, and propeller guards to enhance safety for both users and the drone.

Finalizing the integration of the Crazyflie 2.1 Brushless into our software ecosystem and expanding its documentation were key steps in preparing for its launch. These efforts ensure compatibility, improve the user experience, and make the platform more accessible to the community. We’re looking forward to a smooth launch and to seeing how the community will utilize the new platform!

This year, we introduced updates to the Crazyflie 2.1 kit, making the 47-17 propellers the new default and including an improved battery. These upgrades enhance flight performance and endurance, culminating in the release of the Crazyflie 2.1+—an optimized iteration of our established platform.

The Crazyflie 2.1 Brushless featured on the cover of Science Robotics vol. 9, no. 92

Community

In 2024, Bitcraze had an action-packed year, engaging with the robotics community through numerous conferences, workshops, and live events.

In May, we attended ICRA 2024 in Yokohama. We collected several research posters that now hang proudly at the office. Kimberly presented at the Robotics Developer Day, where she won the Best Speaker Award for her impressive live hardware demos with ROS 2. We co-organized the ‘Aerial Swarm Tools and Applications’ workshop at RSS 2024 in Delft. Arnaud and Kimberly shared insights on demo-driven development in an episode of OpenCV Live!. Additionally, we had a booth at ROSCon ’24 in Odense, connecting with the vibrant ROS community and showcasing our latest developments.

And don’t forget the developer meetings, where we shared some more behind-the-scenes information and collected invaluable feedback from the community.

We also released a new edition of our research compilation video, showcasing some of the coolest projects from 2023 and 2024 that highlight the versatility and impact of the Crazyflie platform in research.

Team

In the past year, Bitcraze saw significant changes within the team. In February, Rik rejoined the team. Tove started at Bitcraze in April. Mandy, with whom we’ve already worked extensively over the years, joined as our production representative in Shenzhen. At the end of the year, we said goodbye to Kimberly, whose contributions will be deeply missed. Additionally, we had Björn with us for a few months, working on his master’s thesis on fault detection, and Joe continued his industrial postdoc at Bitcraze that began in December 2023. Looking ahead, Bitcraze is hiring for two new roles: a Technical Sales Lead and a Technical Success Engineer, to support our ongoing projects and customer collaborations.


As we close the chapter on 2024, we’re proud of the progress we’ve made, the connections we’ve strengthened, and the milestones we’ve reached. With exciting launches, new faces on the team, and continued collaboration with our community, we’re ready to soar to even greater heights in 2025. Thank you for being part of our journey!

Today, we’re excited to share research from Vrije Universiteit Amsterdam, ‘From Shadows to Light,’ which presents an innovative swarm robotics approach where nano-drones autonomously track dynamic sources indoors.

Motivation

In dynamic and unpredictable indoor environments, locating moving sources—such as heat, gas, or light—presents unique challenges. GPS-denied settings, in particular, demand innovative and efficient onboard solutions for both control and sensing. Our research demonstrates how small drones, like Crazyflies, can be organized into a coordinated swarm to autonomously locate and follow these sources indoors, relying solely on onboard sensing and communication capabilities. Without sharing individual measurements, each drone adapts its behavior in response to its own sensor readings, allowing the swarm to collectively converge on the center of a light source through modified interactions with nearby agents.

Tugay Alperen (right) and Victor Retamal (left) during ICRA 2024 poster session

Method

Our approach enables each Crazyflie to function autonomously, using onboard sensing combined with continuous inter-agent communication at a frequency of 20 Hz. This methodology is structured around three core components:

Proximal Control and Collective Motion

Each drone broadcasts its position to nearby agents, enabling the calculation of relative positions to maintain safe distances. This proximal control ensures cohesive group movement by computing virtual force vectors for velocity commands, which are sent to onboard controllers operating at 20 Hz.
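
As a rough sketch of what such a proximal-control step can look like, here is a generic spring-like rule toward a desired inter-agent distance. The paper uses its own interaction potential, so treat the names and the force law as illustrative:

```c
#include <math.h>

typedef struct { float x, y; } vec2;

/* One proximal-control step: sum a virtual force over all neighbors that
 * attracts when further than dDes and repels when closer, and use the
 * result as a velocity command (sent to the onboard controller at 20 Hz). */
vec2 proximalVelocity(const vec2 *neighbors, int n, vec2 self,
                      float dDes, float gain) {
    vec2 v = {0.0f, 0.0f};
    for (int i = 0; i < n; i++) {
        float dx = neighbors[i].x - self.x;
        float dy = neighbors[i].y - self.y;
        float d = sqrtf(dx * dx + dy * dy);
        if (d < 1e-6f) continue;        /* ignore coincident readings */
        float mag = gain * (d - dDes);  /* >0: attract, <0: repel */
        v.x += mag * dx / d;
        v.y += mag * dy / d;
    }
    return v;
}
```

The source-seeking behavior described next can then be obtained by modulating the desired distance dDes with the locally sensed light intensity.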

Source Seeking Through Adaptive Social Proximity

Drones use custom light sensors to detect local light intensity. Instead of directly adjusting positions based on this measurement, each drone modifies its social proximity to neighbors according to the sensed intensity without broadcasting this information. This adaptation allows the swarm to collectively follow the light gradient toward the source in a decentralized manner.

Obstacle Avoidance

Equipped with time-of-flight sensors, each drone independently detects obstacles and adjusts its trajectory to maintain safety. This ensures the swarm remains intact while navigating toward the source.

By combining continuous relative positioning, virtual force-based control, individual sensing, and adaptive social behavior, our methodology provides a robust framework for efficient source seeking in GPS-denied indoor environments.

Experimental Setup

Crazyflie equipped with Flow Deck v2, UWB Deck, Multi-Ranger Deck, and a custom-made deck that produces an analog voltage reading from an LDR for light intensity measurements.

The system architecture allowing us to achieve autonomous flocking and source localization with a swarm of Crazyflies

Our experiments take place in a 7×4.75-meter indoor arena with remotely controlled overhead light bulbs. These bulbs, activated individually or in pairs, create a moving light gradient. We tested our flocking swarm by initially positioning it at the edge of an illuminated area. As the light source shifted, we assessed the swarm’s performance by comparing its trajectories with the known centers of the illuminated areas, without waiting for full convergence at each step. We also mapped the environment’s light intensity by flying a single Crazyflie randomly around the arena and recording its measurements, which were later merged into a single light-intensity heatmap.

The brightness values around the test environment, measured for each light source while only it was active.

Results

The flock flies as an ordered swarm, successfully localizing around the source with the swarm’s centroid positioned at the source center. (The centroid appears as a point without an arrow in the video.)

Even with an obstacle present within or between the illuminated regions, the flock successfully localizes around the center, avoiding the obstacle and maintaining order and cohesion within the swarm. The Multi-Ranger deck provides distance measurements for obstacle detection.

Future Directions

As the next step, we plan to apply our highly generalizable algorithm to various source types, including gas sources, radio signals, and similar sources that provide only scalar strength measurements rather than directional cues. Additionally, we have demonstrated that our flocking and source localization algorithms work effectively in 3D. We aim to showcase a fully functional application with a 3D-localized source and a flocking swarm operating in 3D space. Finally, we are working toward achieving fully onboard relative localization, which would eliminate the need for any indoor positioning system. This advancement would allow our swarm to operate autonomously in any environment, replicating the same behavior wherever it is deployed.

Links

The authors were with the Vrije Universiteit Amsterdam.

Please feel free to contact us with any questions or ideas: t.a.karaguzel@vu.nl

Please cite this as:

@ARTICLE{10314746,
  author={Karagüzel, Tugay Alperen and Retamal, Victor and Cambier, Nicolas and Ferrante, Eliseo},
  journal={IEEE Robotics and Automation Letters}, 
  title={From Shadows to Light: A Swarm Robotics Approach With Onboard Control for Seeking Dynamic Sources in Constrained Environments}, 
  year={2024},
  volume={9},
  number={1},
  pages={127-134},
  keywords={Robot sensing systems;Autonomous aerial vehicles;Position measurement;Vehicle dynamics;Sensors;Location awareness;Drones;Swarm robotics;aerial systems: perception and autonomy;multi-robot systems},
  doi={10.1109/LRA.2023.3331897}}

It’s now become a tradition to create a video compilation showcasing the most visually stunning research projects that feature the Crazyflie. Since our last update, so many incredible things have happened that we felt it was high time to share a fresh collection.

As always, the toughest part of creating these videos is selecting which projects to highlight. There are so many fantastic Crazyflie videos out there that if we included them all, the final compilation would last for hours! If you’re interested, you can find a more extensive list of our products used in research here.

The video covers 2023 and 2024 so far. We were once again amazed by the incredible things the community has accomplished with the Crazyflie. The selection shows the broad range of research subjects the Crazyflie can be a part of. It has been used for mapping and in swarms – even in heterogeneous swarms! With its small size, it has also been picked for human-robot interaction projects (including our very own Joseph La Delfa showcasing his work). And it has even been turned into a hopping quadcopter!

Here is a list of all the research that has been included in the video:

But enough talking, the best way to show you everything is to actually watch the video:

A huge thank you to all the researchers we reached out to and who agreed to showcase their work! We’re especially grateful for the incredible footage you shared with us—some of it was new to us, and it truly adds to the richness of the compilation. Your contributions help highlight the fantastic innovations happening within the Crazyflie community. Let’s hope the next compilation also shows projects with the Brushless!

A few weeks ago, the prestigious Robotics: Science and Systems (RSS) conference was held at Delft University of Technology. We helped with the co-organization of a half-day tutorial and workshop called “Aerial Swarm Tools and Applications”, so Kimberly (I) was there on behalf of both Bitcraze and Crazyswarm2. In this blog post, we will tell you a bit about the conference itself and the workshop (and perhaps also a tiny bit about RoboCup).

The Robotics: Science and Systems conference

The Robotics: Science and Systems conference, also known as RSS, is considered one of the most important robotics conferences to attend, alongside ICRA and IROS. It distinguishes itself by having only a single track of presented papers, which makes it possible for all attendees to listen to and learn about all the cool robotics work done in a wide range of fields. It also makes it more difficult to get a paper accepted, due to the fixed number of slots, so you know that whatever gets presented is of high quality.

This year the topics leaned heavily toward large language models (LLMs) and their application in robotics, most commonly to manipulators. Many researchers are exploring the ways that LLMs could be used for robotics, which meant that small and embedded systems were not represented in many of these papers. We did find one paper where Crazyflies were presented, namely the awesome work by Darrick et al. (2024) called ‘Stein Variational Ergodic Search’, which used optimal control for path planning to achieve the best coverage.

The single track also gave us the chance to experience many of the other works presented at RSS. One in particular was about the robotic design of the cute little biped from Disney Imagineering: “Design and Control of a Bipedal Robotic Character” by Grandia et al. (2024). Also very impressive was the agile flight demo by the group of Davide Scaramuzza, and we enjoyed listening to the keynote by Dieter Fox, senior director at Nvidia, talking about ‘Where is RobotGPT?’. The banquet location was also very special, as it was right in the old church of Delft.

You can find all the talks, demos, and papers on the website of RSS 2024.

Photos of day 3 of RSS

Aerial Swarm Workshop

The main reason we joined RSS was that we were co-organizing the workshop ‘Aerial Swarm Tools and Applications’. This was done in collaboration with Wolfgang Hönig from Crazyswarm2/TU Berlin, Miguel Fernandez Cortizas and Rafael Perez Segui from Aerostack2/Polytechnic University of Madrid (UPM), and Andrea Testa, Lorenzo Pichierri, and Giuseppe Notarstefano from CrazyChoir/University of Bologna. The workshop was a bit of a hybrid, as it contained both talks on various aerial swarm applications and tutorials on the aerial swarm tools that the committee members represent.

Photos of the Aerial Swarm Tools and Applications workshop

Sabine Hauert from the University of Bristol started off the workshop by talking about “Trustworthy swarms for large-scale environmental monitoring.” Gábor Vásárhelyi from Collmot Robotics and Eötvös University gave a talk/tutorial about Skybrush, showing its suitability not only for drone shows but also for research (Skybrush was used for the Big Loco Test show demo we did 1.5 years ago). The third speaker was SiQi Zhou, speaking on behalf of Angela Schöllig from TU Munich, discussing “Safe Decision-Making for Aerial Swarms – From Reliable Localization to Efficient Coordination.” Martin Saska concluded the workshop with his talk “Onboard relative localization for agile aerial swarming in the wild” about their work at the Czech TU in Prague. They also organize the Multi-robot systems summer school every year, so if you missed it this year, make sure to mark it in your calendar for next summer!

We had four tutorials in the middle of the workshop as well. Gábor also showed Skybrush in simulation after his talk for participants to try out. Additionally, we had tutorials that included real, flying Crazyflies live inside the workshop room! It was a bit of a challenge to set up due to the size of the room we were given, but with the Lighthouse system it all worked out! Miguel and Rafael from Aerostack2 were first up, showing a leader-follower demo. Next up were Wolfgang and Kimberly (Crazyswarm2), who showed three Crazyflies collaboratively mapping the room, and finally, Andrea and Lorenzo from CrazyChoir demoed formation control in flight.

You can see the Crazyflie demos flown during the tutorials in the video below. The recording of each of the talks can be found on the workshop’s website: https://imrclab.github.io/workshop-aerial-swarms-rss2024/

RoboCup 2024 Eindhoven

Luckily, there was also a bit of time to visit Eindhoven for a field trip to the 2024 edition of the world championship competitions of RoboCup! This is a very large robotics competition held in several different divisions, namely Soccer (with many subdivisions), Industrial, Rescue, @Home, and Junior. Each country usually has its own national championships, and those that win there can compete in the big leagues at events like these. RoboCup was extremely fun to attend, so if any robotics enthusiasts happen to live close to one of these, go! It’s awesome.

Photos of the field trip to RoboCup

Maybe drone competitions might be one of RoboCup’s divisions in the future :)

Today we welcome Sam Schoedel and Khai Nguyen from Carnegie Mellon University. Enjoy!

We’re excited to share the research we’ve been doing on model-predictive control (MPC) for tiny robots! Our goal was to find a way to compress an MPC solver to a size that would fit on common microcontrollers like the Crazyflie’s STM32F405 while being fast enough to control the higher frequency dynamics of smaller robots. We came up with a few tricks to make that happen and dubbed the resulting solver TinyMPC. When it came time for hardware experiments, using the Crazyflie just made sense. A tiny solver deserves a tiny robot.

Motivation

Model predictive control is a powerful tool for controlling complex systems, but it is computationally expensive and thus often limited to use cases where the robot can either carry enough computational power or when offboard computing is available. The problem becomes challenging to solve for small robots, especially when we want to perform all of the computation onboard. Smaller robots have inherently faster dynamics which require higher frequency controllers to stabilize, and because of their size they don’t have the capacity to haul around as much computational power as their larger robot counterparts. The computers they can carry are often highly memory-constrained as well. Our question was “how can we shrink the computational complexity and memory costs of MPC down to the scale of tiny robots?”

What We Did

We settled on developing a convex model-predictive control solver based on the alternating direction method of multipliers (ADMM). Convex MPC solvers are limited to reasoning about linear dynamics (on top of any other convex constraints), but they have structure that TinyMPC exploits to solve problems efficiently. The tricks we used to achieve this efficiency are described in the paper, but it boils down to rewriting the problem as a constrained linear-quadratic regulator to reduce the memory footprint and then precomputing as many matrices as possible offline so that online calculations are cheaper. These tricks allowed us to fit long-horizon MPC problems on the Crazyflie and solve them fast enough for real-time use.
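
Schematically, each ADMM iteration alternates three cheap steps. Here J is the MPC cost, ρ a penalty weight, z a slack copy of the constrained variables, y the scaled dual variable, and C the constraint set; the updates are shown for input constraints only, with state constraints handled the same way:

```latex
\begin{aligned}
(x, u) &\leftarrow \arg\min_{x,u}\; J(x,u) + \tfrac{\rho}{2}\lVert u - z + y \rVert^2
  && \text{(constrained-LQR solve via a Riccati backward pass)}\\
z &\leftarrow \Pi_{\mathcal{C}}(u + y)
  && \text{(projection onto the constraint set)}\\
y &\leftarrow y + u - z
  && \text{(dual update)}
\end{aligned}
```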

What TinyMPC Can Do

We decided to demonstrate the constraint-handling capabilities of TinyMPC by having the Crazyflie avoid a dynamic obstacle. We achieved this by re-computing hyperplane constraints (green planes in the first video) about a spherical obstacle (transparent white ball) for each knot point in the trajectory at every time step, and then by solving the problem with the new constraints assuming they stayed fixed for the duration of the solve.
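
Computing such a hyperplane for one knot point is cheap: take the outward unit normal from the obstacle center toward the knot, and shift the plane out by the (inflated) radius. A minimal sketch, with names that are illustrative rather than from the TinyMPC codebase:

```c
#include <math.h>

typedef struct { float x, y, z; } vec3;

/* Build the keep-out halfspace a.p >= b for one trajectory knot point:
 * the plane tangent to the sphere (center c, radius r) on the side of
 * the sphere facing the knot. Assumes the knot is not at the center. */
void obstacleHalfspace(vec3 knot, vec3 c, float r, vec3 *a, float *b) {
    vec3 d = { knot.x - c.x, knot.y - c.y, knot.z - c.z };
    float n = sqrtf(d.x * d.x + d.y * d.y + d.z * d.z);
    a->x = d.x / n;  a->y = d.y / n;  a->z = d.z / n;  /* outward unit normal */
    *b = a->x * c.x + a->y * c.y + a->z * c.z + r;     /* offset r beyond center */
}
```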

In the two videos below, the reference trajectory used by the solver is just a hover position at the origin for every time step. Also, the path the robot takes in the real world will never be exactly the same as the trajectory computed by the solver, which can easily result in collisions. To avoid this, we inflated the end of the stick (and the simulated obstacle) to act as a keep-out region.

TinyMPC is restricted to reasoning about linear dynamics because of its convex formulation. However, a simple linearization can be taken pretty far. We experimented with recovering from different starting conditions to push the limits of our linear Crazyflie model, and were able to successfully recover from a 90-degree angle while respecting the thrust limits of each motor.

We recently added support for second-order cone constraints as well. These types of constraints allow TinyMPC to reason about friction and thrust cones, for example, which means it can now intelligently control quadrupeds on slippery surfaces and land rockets. To clearly demonstrate the cone constraint, we took long exposure photos of the Crazyflie tracking a cylindrical landing trajectory without any cone constraints (red) and then with a spatial cone constraint that restricts the landing maneuver to a glide slope (blue).
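
For intuition, a second-order cone constraint has the simple form ||(p_x, p_y)|| ≤ μ·p_z: the horizontal component of a vector (thrust, friction force, or position relative to the landing point) must stay within a factor μ of its vertical component, which is exactly the geometry of a cone with its tip at the origin. The glide-slope constraint in the blue trajectories is this kind of set, with μ determined by the allowed slope angle.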

How To Use TinyMPC

All of the information regarding the solver can be found on our website and GitHub org (which is where you can also find the main GitHub repository). TinyMPC currently has a Python wrapper that allows for validating the solver and generating C++ code to run on a robot, and we have a few examples in C++ if you don’t want to use Python. Our website also explains how to linearize your robot and has some examples for setting up the problem with a linear model, solving it in an MPC loop, and then generating and running C++ code.

Most importantly to the Crazyflie community, our TinyMPC-integrated firmware is available and should work out of the box. Let us know if you use it and run into issues!

Our accompanying research papers:

Khai Nguyen, Sam Schoedel, Anoushka Alavilli, Brian Plancher, and Zachary Manchester. “TinyMPC: Model-Predictive Control on Resource-Constrained Microcontrollers.” arXiv preprint arXiv:2310.16985 (2023). https://arxiv.org/pdf/2310.16985

Sam Schoedel, Khai Nguyen, Elakhya Nedumaran, Brian Plancher, and Zachary Manchester. “Code Generation for Conic Model-Predictive Control on Microcontrollers with TinyMPC.” arXiv preprint arXiv:2403.18149 (2024). https://arxiv.org/pdf/2403.18149

We would love your feedback and suggestions, and let us know if you use TinyMPC for your tiny platforms!

This week we have a guest blogpost by Kamil Masalimov (MSc) and Tagir Muslimov (PhD) of the Ufa University of Science and Technology. Enjoy!

As researchers passionate about UAV technology, we are excited to share our recent findings on how structural defects affect the performance of nano-quadcopters. Our study, titled “CrazyPAD: A Dataset for Assessing the Impact of Structural Defects on Nano-Quadcopter Performance,” offers comprehensive insights that could greatly benefit the Crazyflie community and the broader UAV industry.

The Motivation Behind Our Research

Understanding the nuances of how structural defects impact UAV performance is crucial for advancing the design, testing, and maintenance of these devices. Even minor imperfections can lead to significant changes in flight behavior, affecting stability, power consumption, and control responsiveness. Our goal was to create a robust dataset (CrazyPAD) that documents these effects and can be used for further research and development.

Key Findings from Our Study

We conducted a series of experiments by introducing various defects, such as added weights and propeller cuts (Figure 1), to nano-quadcopters. For the experiments, we used the Lighthouse Positioning System with two SteamVR 2.0 virtual reality stations (Figure 2).

Figure 1. Propeller with two side defects
Figure 2. Schematic of the experimental setup with Lighthouse Positioning System

Here are some of the pivotal findings from our research:

  1. Stability Impact: We observed that both added weights and propeller cuts lead to noticeable changes in the stability of the quadcopter. Larger defects caused greater instability, emphasizing the importance of precise manufacturing and regular maintenance.
  2. Increased Power Consumption: Our experiments showed that structural defects result in higher power consumption. This insight is vital for optimizing battery life and enhancing energy efficiency during flights.
  3. Variable Control Responsiveness: We used the standard deviation of thrust commands as a measure of control responsiveness (see the sketch after this list). The results indicated that defects increased the variability of control inputs, which could affect maneuverability and flight precision.
  4. Changes in Roll and Pitch Rates: The study also highlighted variations in roll and pitch rates due to structural defects, providing a deeper understanding of how these imperfections impact flight dynamics.
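
For readers who want to reproduce the responsiveness metric from item 3 on their own logs, it is simply the standard deviation of the logged thrust commands over a flight. A small sketch (the dataset’s actual field names and log format may differ):

```c
#include <math.h>
#include <stddef.h>

/* Standard deviation of a logged thrust-command sequence: the study's
 * proxy for control responsiveness (higher = more corrective activity). */
double thrustStd(const double *thrust, size_t n) {
    double mean = 0.0;
    for (size_t i = 0; i < n; i++) mean += thrust[i];
    mean /= (double)n;

    double var = 0.0;
    for (size_t i = 0; i < n; i++) {
        double d = thrust[i] - mean;
        var += d * d;
    }
    return sqrt(var / (double)n);
}
```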

We show Figure 3 as an example of a graph obtained from our dataset. In this figure, you can see the altitude and thrust command over time for different flight conditions. The blue line represents the normal flight, while the orange line represents the flight with additional weight near the M3 propeller. In Figure 4, you can see the 3D flight trajectory of the Crazyflie 2.1 quadcopter under the cut_propeller_M3_2mm condition with the corrected ideal path. The blue line represents the actual flight trajectory, while the red dashed line with markers represents the ideal trajectory. Figure 5 shows the Motor PWM values over time for the add_weight_W1_near_M3 condition. The plot shows the PWM values of each motor (M1, M2, M3, and M4) as they respond to the added weight near the M3 propeller.

More examples of graphs obtained from the CrazyPAD dataset can be found in our research paper specifically describing this dataset: https://doi.org/10.3390/data9060079

Figure 3. Altitude and thrust command over time for different flight conditions
Figure 4. 3D flight trajectory of the Crazyflie 2.1
Figure 5. Motor PWM values over time

Leveraging Research for Diagnostic and Predictive Models

One of the most exciting aspects of our research is its potential application in developing diagnostic and predictive models. The CrazyPAD dataset can be utilized to train machine learning algorithms that detect and predict structural defects in real-time. By analyzing flight data, these models can identify early signs of wear and tear, allowing for proactive maintenance and reducing the risk of in-flight failures.

Diagnostic models can continuously monitor the performance of a UAV, identifying anomalies and pinpointing potential defects. This real-time monitoring can significantly enhance the reliability and safety of UAV operations.

Predictive models can forecast future defects based on historical flight data. By anticipating when and where defects are likely to occur, these models can inform maintenance schedules, ensuring UAVs are serviced before issues become critical.

Why This Matters for the Crazyflie Community

The CrazyPAD dataset and our findings offer valuable resources for the Crazyflie community. By understanding how different defects affect flight performance, developers and enthusiasts can improve design protocols, enhance testing procedures, and ensure higher safety and performance standards for their UAVs.

We believe that sharing our research with the Crazyflie community can lead to significant advancements in UAV technology. The dataset we created is open under the MIT License for further exploration and can serve as a foundation for new innovations and improvements.

Get Involved and Explore Further

We invite community members to explore our full research article and the CrazyPAD dataset. Together, we can drive forward the standards of UAV technology, ensuring that Crazyflie remains at the forefront of innovation and excellence.

Our research paper with a detailed description of this dataset:

Masalimov, K.; Muslimov, T.; Kozlov, E.; Munasypov, R. CrazyPAD: A Dataset for Assessing the Impact of Structural Defects on Nano-Quadcopter Performance. Data 2024, 9, 79. https://doi.org/10.3390/data9060079

Dataset: https://github.com/AerialRoboticsUUST/CrazyPAD

We are eager to collaborate with the Crazyflie community and welcome any feedback or questions regarding our research. Let’s work together to push the boundaries of what’s possible in UAV technology.

As we mentioned earlier, ICRA Yokohama was full of exciting encounters – we really enjoyed meeting researchers, tech companies, and enthusiastic roboticists during those 4 days.

One challenge was to bring back as many research posters featuring the Crazyflies as possible. The goal was to decorate the walls of the office with them, as a “hall of fame”. And I’m really, really proud to show you how it turned out!

This was before
And this is now!

In total, we received 6 new posters. Here they are:


Optimal Collaborative Transportation for Under-Capacitated Vehicle Routing Problems using Aerial Drone swarms
Akash Kopparam Sreedhara, Deepesh Padala, Shashank Mahesh, Kai Cui, Mengguang Li, Heinz Koeppl

This paper presents a strategy for optimizing the collaborative transportation of payloads in an under-capacitated vehicle routing scenario. The Crazyflies work together to dynamically adjust routes based on real-time data and transport capacities, and collaborate to lift and transport heavier payloads.


From Shadows to Light: A Swarm Robotics Approach With Onboard Control for Seeking Dynamic Sources in Constrained Environments
T. A. Karagüzel, V. Retamal, N. Cambier and E. Ferrante

The paper describes a method for enabling a swarm of Crazyflies to dynamically seek and locate a moving target or source in constrained, GNSS-denied environments. Using a simple rule-based approach, the drones track dynamic source gradients and navigate obstacles autonomously with fully onboard systems.


CrazySim: A Software-in-the-Loop Simulator for the Crazyflie Nano Quadrotor
Christian Llanes, Zahi Kakish, Kyle Williams, and Samuel Coogan

We actually already have a blogpost presenting this paper, and we’re so happy to have it represented in our office now!


TinyMPC: Model-Predictive Control on Resource-Constrained Microcontrollers
Khai Nguyen, Sam Schoedel, Anoushka Alavilli, Brian Plancher, Zachary Manchester

The paper presents TinyMPC, a high-speed model-predictive control (MPC) solver designed for resource-constrained microcontrollers on small robots like the Crazyflie. TinyMPC efficiently handles real-time trajectory tracking and dynamic obstacle avoidance, outperforming traditional solvers.


Robust and Efficient Depth-Based Obstacle Avoidance for Autonomous Miniaturized UAVs
H. Müller, V. Niculescu, T. Polonelli, M. Magno and L. Benini

The paper introduces a lightweight obstacle avoidance system for nano quadcopters, leveraging a novel 64-pixel multizone time-of-flight (ToF) sensor to safely and effectively navigate complex indoor environments. Tested on the Crazyflie 2.1, the system achieves 100% reliability at a speed of 0.5 m/s, all while using only 0.3% of the onboard processing power, demonstrating its suitability for autonomous operations in unexplored settings.


Fully onboard Low-power localization with semantic sensor fusion on a Nano-UAV using floor plans
Nicky Zimmerman, Hanna Müller, Michele Magno, Luca Benini

This paper introduces a method for autonomous localization in nano-sized UAVs like the Crazyflie by fusing geometric data from time-of-flight sensors with semantic information extracted from images. The approach leverages annotated floor plans to improve navigation accuracy without adding extra deployment costs. The system operates efficiently with limited onboard computational resources, achieving a 90% success rate in real-world office environments.


A big thanks, once again, to all of those who gave us their posters!

“What? You are in Japan? Again!?” Yup, that is right! We loved IROS Kyoto 2022 so much that we just couldn’t wait to come back again. Barbara, Arnaud, and Rik are setting up the booth as we speak to show some Bitcraze awesomeness to you! Come and say hi at booth IC085.

The gang before the rush starts!

Crazyflie Brushless and Camera expansion

Of all the prototypes, we are most excited to show you the Crazyflie Brushless and the ‘forward-facing expansion connector prototype’, aka the Camera deck. Here you can see them both in action during a tryout of our demo. We have also written blogposts about both, so make sure to read them as well (Brushless blogpost, Camera expansion blogpost).

The Crazyflie Brushless flying with a Camera deck.

We will also explain the contact charging prototype (see the blogpost here) and will be showing all of our decks at the booth. And of course, our fully autonomous swarm demo – decentralized and onboard, with peer-to-peer communication and collision avoidance – will be on display as always. Make sure to read this blogpost about when we showed the demo at IROS 2022 to fully understand what is going on!

Also take a look at our event page of the ICRA 2024 demo.

Hand in your Crazyflie posters at our booth!

We will be providing a ‘special disposal service’ for your conference poster! We would love to see what you are working on and get your poster – our updated office/flight space is big, but its walls are still mostly empty.

If you hand in your poster at the booth, you’ll get a special, one-of-a-kind, button badge that you can wear proudly during the conference! So we will see you at booth IC085!

The ‘Bitcraze took my poster’ button!

ICRA Yokohama

From the beginning of the company, we’ve always loved to join in at conferences. Only at a conference do you get the opportunity to show our products, meet our users or other tech-oriented people, learn about what others are doing, and let’s not forget the chance to discover a new place!

This year, we’ll be present at ICRA Yokohama – it’s in just 3 weeks. We’ll have a booth there (IC085 if you’re looking for us). We’ll be showing our autonomous demo with a twist, just like we did last time, so please check the event page. This demo is extremely impressive, and we’ve been improving on it each time we’ve shown it – first on our latest Japan trip and most recently at the last ICRA. What’s new?

We’re really excited to be showing that and receive feedback, but also in hearing about what our users have been doing. ICRA is always a perfect place to catch up on all the amazing papers and publications featuring our hardware, and we couldn’t be prouder of all the cool stuff we’ve seen so far. We’re so proud, in fact, that we want to be able to show off! So, if you have a paper or a publication featured at ICRA, let us know – you can write us an email at contact@bitcraze.io, leave a comment below this post, or pass by our booth.

In fact, we’re prepared to make a deal. If you have a nice poster featuring our products and don’t know what to do with it once you’ve presented it, pass by our booth! We’re ready to swap them for something extra special. We plan to have a “hall of fame” at the office featuring your awesome work – in fact, it’s an idea we had last ICRA when someone just offered us their posters. Now, we’d like to cover our walls with them!

The corridor leading to the kitchen – we have space to show off the awesomeness!

So, whether you’re a seasoned conference-goer or a first-time attendee, don’t hesitate to swing by our booth, say hello, and discover our newest demo! We hope to see you there.

Dev meeting

The next developer meeting is going to be on the 8th of May – we traditionally have a dev meeting on the first Wednesday of the month, but this time that falls on the 1st of May, which is a holiday here in Sweden. So mark your calendar for the 8th of May at 15.00 CET, and stay tuned for more info on the topic we’ll talk about!

Crazyflies back in stock!

You may have noticed that the Crazyflies have been out of stock for some time now. After some adventures, we are now fully back in stock with most of our bundles and products available in the shop!