Category: Community

We’ve got an exciting month ahead – in just a few weeks, we’re heading off to not one, but two amazing conferences! It’s going to be a whirlwind, but we couldn’t be more thrilled to be part of these events, meet fellow robotics enthusiasts, and show off some cool demos. Here’s where you’ll find us:

First stop: ICUAS

We’re kicking things off with ICUAS (International Conference on Unmanned Aircraft Systems), where we’re proud to be an official sponsor of the competition. We’ll be there to help and support the contestants, who will be flying Crazyflies both in simulation and in real life. The teams will need to deploy a team of UAVs in an urban environment to locate and identify threats.

It’s our first time attending ICUAS, so this is a brand new adventure for us – and we can’t wait to dive in and see what it’s all about!

Next up: ICRA

Just two days after ICUAS wraps up, we’re heading straight to ICRA – this year taking place in Atlanta. You’ll be able to find us at booth 131, right in front of the Tech Talk stage. If you’re attending, definitely come say hi!

We had the honour of being invited to take part in the workshop “25 years of aerial robotics: challenges and opportunities“. Rik will speak there on the 23rd of May at 16:10, covering Bitcraze’s history and the challenges we’ve faced in positioning a nanocopter – all in just 10 minutes. We’ll also take part in the forum on Undergraduate Robotics Education Programs on the 22nd of May, where we’ll have a poster presenting the Crazyflie as an educational platform.

These are all fantastic opportunities to highlight what makes our platform special and to exchange ideas with you! If you’ve got a paper or publication featured at ICRA, we’d love to hear about it – email us at contact@bitcraze.io, leave a comment below this post, or drop by our booth.

Demo

We’re bringing back our trusted demo setup – but this time, with more Brushless units and charging docks! It will be a version somewhere between what we presented at the last ICRA and what we call “the fish tank demo” that we currently have at the office.

We’ll also be bringing along some prototypes and new decks we’re currently working on – so if you’re curious about what’s coming next for Crazyflie, this is your chance to get a sneak peek and chat with us about it!

Give us your posters!

Last year, we collected posters from proud participants to decorate the office, and it turned out amazing – so we’re doing it again! If you’ve got a cool poster featuring our products and aren’t sure what to do with it after your presentation, come by our booth. We’d love to swap it for something a little extra special.

All in all, it’s shaping up to be a busy and exciting couple of weeks. Whether you’re at ICUAS or ICRA, stop by, chat with us, and see the Crazyflies in action. We’re looking forward to reconnecting with old friends and meeting new ones – see you there!

Human Robot Interaction (HRI) is a conference that brings together academics and industry partners to explore how humans are interacting with the latest developments in robotics. The conference is held yearly and brings together the many relevant disciplines concerned with the “H” part (cognitive science, neuroscience), the “R” part (computer science, engineering) and the “I” part (social psychology, education, anthropology and most recently, design).

This year it was in Melbourne (my home city) and I was so grateful to be given the chance to demonstrate a system from my PhD studies called “How To Train Your Drone” in what was its final hurrah, a retirement party! Running the demo was a pleasure, especially with the supportive and curious HRI crowd at such a well organised event.

The take home message from this demonstration was this:

If you let the end user shape, with their hands, the sensory field of the drone, they then end up with an in-depth understanding of it. This allows the user to creatively explore how the drone relates to themselves and their surrounding environment.

What do we mean by sensory field? It’s the area around the drone where it can “feel” the presence of those hand-mounted sensors, represented by the grey and red spheres in the figure below. Initially, the drone has no spheres and therefore cannot respond at all to the user’s movement. But by holding their hands still for a few seconds, the user can create a spherical space, relative to the drone, where the drone can sense their hands and follow them.

These spheres are “part of the drone’s body”, and so they move with the drone. So in a way you are kind of deciding where the drone can “feel” whilst also piloting it. Should it be sensitive in the space immediately in front of it? Or either side of it?
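To make the idea concrete, here is a minimal sketch of such a body-fixed sensory sphere. This is purely illustrative (not code from the actual “How To Train Your Drone” system, and all names are hypothetical): a hand position in world coordinates is rotated into the drone’s yaw-aligned body frame, and “can the drone feel the hand” becomes a simple distance check against a sphere that moves and rotates with the drone.

```python
import numpy as np

def hand_in_sensory_sphere(hand_world, drone_pos, drone_yaw,
                           sphere_center_body, sphere_radius):
    """Check whether a hand-mounted sensor lies inside a sphere fixed to
    the drone's body frame (yaw only, for simplicity)."""
    # Rotate the world-frame offset from drone to hand into the body frame.
    c, s = np.cos(-drone_yaw), np.sin(-drone_yaw)
    rot = np.array([[c, -s, 0.0],
                    [s,  c, 0.0],
                    [0.0, 0.0, 1.0]])
    offset_body = rot @ (np.asarray(hand_world, dtype=float)
                         - np.asarray(drone_pos, dtype=float))
    # Inside the sphere means the drone can "feel" the hand there.
    return bool(np.linalg.norm(offset_body - np.asarray(sphere_center_body))
                <= sphere_radius)
```

Because the sphere is defined in the body frame, a sphere placed “in front of” the drone stays in front of it as the drone turns, which is exactly the “part of the drone’s body” behaviour described above.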

But shouldn’t it just be everywhere?

Good question! We think the answer is no, and for two reasons:

  1. What we can and cannot sense as humans is part of what makes us human. It also allows us to understand other humans. For example, we don’t deliver verbal information directly into other people’s ears at max volume, because we have ears ourselves and we know that sucks. Nor do we demonstrate how to perform a task to someone whose back is turned to us. So by the same token, knowing how a machine senses the world also teaches us how to communicate with it. Furthermore, shaping how a machine can sense the world allows us to creatively explore what we can do with it.
  2. To quote science writer Ed Yong, “Nothing can sense everything and nothing needs to”. Meaning we can get the job done without having to ingest insane amounts of data and even more insane amounts of compute. By cumulatively building an agent’s capacity, in context, with end users, we could actually end up with agents that are hyper specialised and resource efficient. A big plus for resource constrained systems like the Crazyflie and our planet at large.

If you are interested in reading more about this research then please check out this paper (if you like to read) or this pictorial (if you like to look at pictures). Or just reach out in the comments below!

This week in Germany

This week, some of us are on an adventure!
Marcus and Tobias will be exploring both the RIG and Embedded World fairs.

RIG showcases the latest innovations in robotics and intelligent systems, while Embedded World is the place to be for cutting-edge embedded technologies. Both events promise amazing demos, insightful talks, and a chance to catch up with some of our collaborators.

Planning to attend either fair? Let’s meet up! We’d love to explore the exhibitions together, chat about cool technologies, or just geek out about the innovations on display. We’ll be wandering through Embedded World on Thursday and hitting RIG on Friday. Send us an email if you’d like to connect – we’re always up for grabbing coffee!

Next May in Atlanta

After our adventures as visitors, we’re thrilled to announce that we’ll be exhibiting at the International Conference on Robotics and Automation (ICRA) 2025! Stop by our booth where we’ll be showcasing our latest demo. We’ll be, as always, available to discuss our newest products, answer your technical questions, and provide insights into how our solutions can transform your robotics applications. We’re also eager to hear your thoughts on what you’d like to see in our upcoming products. Mark your calendars and make sure to find us at Booth #131 – we may even have some presentations in the works, but nothing is confirmed yet.

Today in the shop

And, last but not least, the Brushless is now available in a Swarm configuration! Both the Lighthouse Swarm bundle and Loco Swarm bundle have been added to our shop. These new bundles feature all the same components as our standard Swarm packages, but come equipped with the Crazyflie 2.1 Brushless instead of the Crazyflie 2.1+ model.

Marcus and I are going to visit FOSDEM 2025 at the end of the week. This is a great open-source conference that I visit every year, but this year there is a twist: I am part of the organisation of the Robotics and Automation devroom! I am going to give the welcome talk there.

FOSDEM is a conference with many tracks, the main track and devrooms. Devrooms are like mini-conferences: they are handled by a committee that produces a call for participation and handles the schedule for the room. FOSDEM allocates a time slot, a physical room, and video recording for the devroom so that all talks are broadcasted in real-time and recorded.

Since my first visit to FOSDEM in 2015, we have been thinking about the lack of a dedicated devroom for robotics: a lot of robotics, at least in research, is open source. This is in part thanks to ROS, which allows for easily sharing modules and algorithms between projects, but it also applies to things like flight stacks that are often open-source. So we took it upon ourselves to organize what we wanted, a robotics-dedicated devroom.

We started last year, at FOSDEM 2024, by organizing a robotics Birds of a Feather session with Kimberly. These are impromptu meetups that can be organized by booking a time on the spot in one of a couple of dedicated rooms. There, we had some really nice discussions with fellow robotics enthusiasts and figured out that there was indeed quite some interest in robotics at FOSDEM, and that there were enough interested parties to organize a devroom.

If you’re interested in open source and/or robotics and you can be in Brussels, Belgium, on the weekend of the 1st and 2nd of February 2025, please join us! The Robotics and Simulation devroom is on Sunday afternoon. I will also be monitoring our Mastodon channel more carefully, so do not hesitate to poke me if you want to meet either me or Marcus, as we will be at the conference both days.

As 2024 comes to an end, it’s the perfect time to reflect on what we’ve accomplished over the past year. A major highlight has been our work on the Crazyflie 2.1 Brushless. We’re thrilled that it will be available early in the new year! While much of our efforts focused on refining and preparing the platform as a whole, we also introduced some standout features like support for contact charging on a charging pad, perfecting the specially optimized motors, and propeller guards to enhance safety for both users and the drone.

Finalizing the integration of the Crazyflie 2.1 Brushless into our software ecosystem and expanding its documentation were key steps in preparing for its launch. These efforts ensure compatibility, improve the user experience, and make the platform more accessible to the community. We’re looking forward to a smooth launch and to seeing how the community will utilize the new platform!

This year, we introduced updates to the Crazyflie 2.1 kit, making the 47-17 propellers the new default and including an improved battery. These upgrades enhance flight performance and endurance, culminating in the release of the Crazyflie 2.1+—an optimized iteration of our established platform.

The Crazyflie 2.1 Brushless featured on the cover of Science Robotics vol. 9, no. 92

Community

In 2024, Bitcraze had an action-packed year, engaging with the robotics community through numerous conferences, workshops, and live events.

In May, we attended ICRA 2024 in Yokohama. We collected several research posters that now proudly feature at the office. Kimberly presented at the Robotics Developer Day, where she won Best Speaker Award for her impressive live hardware demos with ROS2. We co-organized the ‘Aerial Swarm Tools and Applications’ workshop at RSS 2024 in Delft. Arnaud and Kimberly shared insights on demo-driven development on an episode of OpenCV Live!. Additionally, we had a booth at ROSCon ’24 in Odense, connecting with the vibrant ROS community and showcasing our latest developments.

And don’t forget the developer meetings, where we shared some more behind the scenes information and collected invaluable feedback from the community.

We also released a new edition of our research compilation video, showcasing some of the coolest projects from 2023 and 2024 that highlight the versatility and impact of the Crazyflie platform in research.

Team

In the past year, Bitcraze saw significant changes within the team. In February, Rik rejoined the team. Tove started at Bitcraze in April. Mandy, with whom we’ve already worked extensively over the years, joined as our production representative in Shenzhen. At the end of the year, we said goodbye to Kimberly, whose contributions will be deeply missed. Additionally, we had Björn with us for a few months, working on his master’s thesis on fault detection, and Joe continued his industrial postdoc at Bitcraze that began in December 2023. Looking ahead, Bitcraze is hiring for two new roles: a Technical Sales Lead and a Technical Success Engineer, to support our ongoing projects and customer collaborations.


As we close the chapter on 2024, we’re proud of the progress we’ve made, the connections we’ve strengthened, and the milestones we’ve reached. With exciting launches, new faces on the team, and continued collaboration with our community, we’re ready to soar to even greater heights in 2025. Thank you for being part of our journey!

We are excited to announce that we are working on several new link performance metrics for the Crazyflie that will simplify the troubleshooting of communication issues. Until now, users have had access to very limited information about communication links, relying primarily on a “link quality” statistic based on packet retries (when we have to re-send data) and an RSSI channel scan. Our nightly tests have been limited to basic bandwidth and latency testing. With this update, we aim to expose richer data that not only enables users to make more informed decisions regarding communication links but also enhances the effectiveness of our nightly testing process. In this blog post, we will explore the new metrics, the rationale behind their introduction, and how they will improve your interaction with the Crazyflie. Additionally, we will be holding a developer meeting on Wednesday November 13th to discuss these updates in more detail, and we encourage you to join us!

“Link Quality”—All or Nothing

Until now, users of the Crazyflie have had access to a single link quality metric. Implemented in the Python library, this metric is based on packet retries—instances when data packets need to be re-sent due to communication issues. For every retry, the link quality drops by 10%, with a maximum of 3 retries allowed. As a result, the score usually sits between 70% and 100% while packets are being acknowledged, then plummets straight to 0% when communication is completely lost. In practice, the metric is nearly binary: users see either a healthy link or no link at all, with little warning in between.
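The retry-based rule boils down to a few lines. The sketch below mirrors the behaviour described above (10% per retry, 0% past the retry limit); it is an illustration, not the actual cflib implementation:

```python
def link_quality_from_retries(retries, max_retries=3):
    """Retry-based link quality: each retry costs 10 percentage points,
    and exceeding the retry limit counts as a lost link (0%).
    Illustrative sketch, not the cflib source."""
    if retries > max_retries:
        return 0
    return 100 - 10 * retries
```

Note the narrow output range this produces: anything from 70% to 100% while the link works, then a hard 0% — which is exactly why a richer set of metrics is needed.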

Client representation of link quality; no link, yes link

The current link quality metric has served as a basic indicator but provides limited insight, often making it difficult to gauge communication reliability accurately. Recognizing these limitations, we’re introducing several new link performance metrics to the Crazyflie Python library, designed to provide a far more detailed and actionable view of communication performance.

What’s Coming in the Upcoming Update

The first metric we are adding is latency. We measure the full link latency, capturing the round-trip time through the library, to the Crazyflie, and back. This latency measurement is link-independent, meaning it applies to both radio and USB connections. The latency metric exposed to users will reflect the 95th percentile—a commonly used measure for capturing typical latency under normal conditions.
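As a rough illustration of the statistic, a 95th-percentile latency can be computed from round-trip samples as follows (nearest-rank method; the library may use a different percentile estimator):

```python
import math

def latency_p95(samples_ms):
    """95th percentile of round-trip latency samples, in milliseconds,
    using the nearest-rank method. Illustrative sketch only."""
    if not samples_ms:
        raise ValueError("need at least one latency sample")
    ordered = sorted(samples_ms)
    # Nearest-rank: the smallest value such that 95% of samples are <= it.
    rank = max(0, math.ceil(0.95 * len(ordered)) - 1)
    return ordered[rank]
```

The 95th percentile is preferred over the mean here because it captures typical latency while ignoring the rare worst-case outliers.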

Next are several metrics that (currently) only support the radio link. For these, we distinguish between uplink (from the radio to the Crazyflie) and downlink (from the Crazyflie to the radio).

The first is packet rate, which simply measures the number of packets sent and received per second.

More interestingly, we are introducing a link congestion metric. Whenever there is no data to send, both the radio and the Crazyflie send “null” packets. By calculating the ratio of null packets to the total packets sent or received, we can estimate congestion. This is particularly useful for users who rely heavily on logging parameters or, for example, stream mocap positioning data to the Crazyflie.
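A sketch of how such a congestion estimate could be derived from packet counts (the exact formula in the library may differ):

```python
def link_congestion(null_packets, total_packets):
    """Estimate link congestion from the ratio of 'null' keep-alive packets
    to total packets: 0.0 means the link is idle (all nulls), 1.0 means
    every packet carried real data. Illustrative sketch only."""
    if total_packets == 0:
        return 0.0
    return 1.0 - null_packets / total_packets
```

A value creeping toward 1.0 would tell a user that heavy logging or streamed mocap data is saturating the radio link.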

The Received Signal Strength Indicator (RSSI) measures the quality of signal reception. Unlike our current “link quality” metric, we hope that a poor RSSI will serve as an early warning signal for potential communication loss. While RSSI tracking has been possible before with the channel scan example, this update will monitor RSSI in the library by default, and expose it to the user. The nRF firmware will also be updated to report RSSI by default. Currently, we only receive uplink RSSI, that is, RSSI measured on the Crazyflie side.
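As an illustration of how such an early warning could look on the user side, here is a small, hypothetical monitor that averages recent uplink RSSI samples. The window size and the -80 dBm threshold are example values chosen for the sketch, not numbers from the Crazyflie documentation:

```python
from collections import deque

class RssiMonitor:
    """Track recent uplink RSSI samples and flag a weakening signal
    before the link drops entirely. Hypothetical sketch; window size
    and threshold are illustrative."""

    def __init__(self, window=20, threshold_dbm=-80.0):
        self.samples = deque(maxlen=window)  # keeps only the latest samples
        self.threshold_dbm = threshold_dbm

    def add(self, rssi_dbm):
        self.samples.append(rssi_dbm)

    def weak_link(self):
        # Warn when the recent average falls below the threshold.
        if not self.samples:
            return False
        return sum(self.samples) / len(self.samples) < self.threshold_dbm
```

Unlike the all-or-nothing retry metric, a trend like this degrades gradually, giving the user time to react before packets start dropping.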

Work in progress client representation of new link performance metrics

We’ve already found these new metrics invaluable at Bitcraze. While we have, of course, measured various parameters throughout development, it was easy to lose track of the precise status of the communication stack. In the past, we relied more on general impressions of performance, but with these new metrics, we’ve gained a clearer picture. They’ve already shed light on areas like swarm latency, helping us fine-tune and understand performance far better than before.

You can follow progress on GitHub, and we invite you to try out these metrics for yourself. If there’s anything you feel is missing, or if you have feedback on what would make these tools even more helpful, we’d love to hear from you. Hit us up over on GitHub or join the developer meeting on Wednesday the 13th of November (see the join information on discussions).

We are happy to announce that release 2024.10 is now available! Special thanks to our community contributors for their valuable input and code contributions in this release!

Release overview

crazyflie-firmware release 2024.10 GitHub

crazyflie2-nrf-firmware release 2024.10 GitHub

crazyflie2-nrf-bootloader release 2024.10 GitHub

cfclient (crazyflie-clients-python) release 2024.10 GitHub, PyPI

cflib (crazyflie-lib-python) release 0.1.27 on GitHub, PyPI

User upgrade notice

While older versions may still function, users are encouraged to upgrade:

  • Minimum supported Python version changed to 3.10
  • Supported Ubuntu versions changed to 22.04 and 24.04

Major changes

  • Enhanced out-of-tree (OOT) kbuild configuration, allowing users to perform full Kconfig configuration for app layer applications.
  • Introduced recovery functionality, allowing users or scripts to safely re-enable the system after a crash without reboot.
  • Added a timeout for auto-disarming, allowing the system to remain armed during brief landings in manual arming mode.
  • Introduced a workaround for PID derivative kick, improving the performance of the PID controller during large setpoint changes (#1337, #1403).
  • Spiral and constant velocity high-level commander segments (#1410).
  • Changed BLE name format to include part of the nRF MAC address, allowing users to easily differentiate between Crazyflies.

For detailed release notes, check out the individual releases on GitHub. Links can be found in the release overview above.

We have some very busy weeks behind us and ahead! While working hard on releasing the new Crazyflie Brushless, we have been preparing for the upcoming ROSCon in Odense, Denmark, next week (see this previous blogpost), and we were also featured on the latest OpenCV Live episode! More about both in this blogpost.

OpenCV Live! Demo Driven Development

We were featured as guests on the latest OpenCV Live! episode, hosted by Phil Nelson and Satya Mallick, where we went through a bit of the history of the start of Bitcraze and all of the (crazy) demos done with the Crazyflie over the last decade. We covered a similar topic in our latest developer meeting, but for this episode we put the focus more on vision-based demos, since OpenCV has definitely been used at Bitcraze in the past for various reasons! Just type OpenCV into the top-right search bar to check out any of the blogs we have written.

During the OpenCV Live! episode of the 10th of October, Arnaud and Kimberly told the backstories of these demos, which went from a manual flight fail where Arnaud flew the Crazyflie 1.0 into Marcus’ hair, to using OpenCV and ArUco markers for positioning, to flying a swarm in your kitchen. It was really fun to do, and one lucky listener managed to answer the two questions the host Phil asked at the end, namely “Where does the name Crazyflie come from?” and “Why is the last part (‘-flie’) spelled this way?”, and won a STEM ranging bundle. If you’d like to know the answers, go and watch the latest OpenCV Live! episode ;) Enjoy!

ROSCon – What to expect?

So next week we will be present as a Silver Sponsor at ROSCon Odense, namely on Monday the 21st and Wednesday the 23rd of October. The Bitcraze booth is number 21, which should be near the coffee break area! We will have our trusty old cage, with some upgrades, running a nice ROS demo similar to the one explained in the Crazyflie ROS tutorial we wrote a while ago, but in its swarming variant. We also hope to show a Brushless Crazyflie prototype and a new camera deck prototype, along with anything else we can find lying around at our office :D.

Moreover, Arnaud will be giving a presentation on the lighthouse positioning system on Wednesday the 23rd of October at 14:40 (2:40 pm), called ‘The Lighthouse project: from Virtual Reality to Onboard Positioning for Robotics’. The lighthouse positioning system will also be the one we demo at our booth, so if you’d like to see it for yourself, or perhaps (during downtime) hack around together with us, you are more than welcome to do so! Check out the Bitcraze ROSCon Eventpage for more details about our demo and the hardware we will show.

It’s now become a tradition to create a video compilation showcasing the most visually stunning research projects that feature the Crazyflie. Since our last update, so many incredible things have happened that we felt it was high time to share a fresh collection.

As always, the toughest part of creating these videos is selecting which projects to highlight. There are so many fantastic Crazyflie videos out there that if we included them all, the final compilation would last for hours! If you’re interested, you can find a more extensive list of our products used in research here.

The video covers 2023 and 2024 so far. We were once again amazed by the incredible things the community has accomplished with the Crazyflie. The selection shows the broad range of research subjects the Crazyflie can be a part of. It has been used in mapping and in swarms – even in heterogeneous swarms! With its small size, it has also been picked for human-robot interaction projects (including our very own Joseph La Delfa showcasing his work). And it’s even been turned into a hopping quadcopter!

Here is a list of all the research that has been included in the video:

But enough talking, the best way to show you everything is to actually watch the video:

A huge thank you to all the researchers we reached out to and who agreed to showcase their work! We’re especially grateful for the incredible footage you shared with us—some of it was new to us, and it truly adds to the richness of the compilation. Your contributions help highlight the fantastic innovations happening within the Crazyflie community. Let’s hope the next compilation also shows projects with the Brushless!

ROSCon is a developer’s conference that focuses entirely on the Robot Operating System (ROS), bringing together developers from around the globe to learn, discuss, and network. It serves as a space for ROS developers to explore the latest advancements, share their experiences, and collaborate on cutting-edge robotics projects. We attended ROSCon 2022 in Japan, and it was a fantastic experience. So when the opportunity came to participate again this year, we couldn’t pass it up! Not only is this a conference that’s been close to our hearts, this year it’s also close to the office: it’s merely a 3-hour train ride away.

The 2024 edition is already full of promise, and we’re excited to be a part of it in several ways. We already talked about how we helped out the diversity committee, contributing to efforts that promote a more inclusive and diverse community within the robotics field. Moreover, we will have a booth there. We’ll be located in Room 2, at booth 21. If you have trouble finding us, just listen closely for the sound of drones buzzing! We’ll be showcasing a live demo that’s still under construction. If you’re curious and want to know more about it, just keep an eye on our weekly blogposts for an update once we finalize our plans.

In addition to being an exhibitor, we also have the honour of presenting a talk. Arnaud will be speaking on October 23 at 14:40 in Large Hall 4. His talk, titled “The Lighthouse Project: From Virtual Reality to Onboard Positioning for Robotics”, will dive into the Lighthouse system, as the title implies. He’ll explain how this technology, originally developed for virtual reality, is being adapted for onboard positioning in various types of robots.

We’re really looking forward to connecting with fellow developers, learning from the presentations, and sharing our own work with the community. If you’re attending ROSCon 2024, be sure to stop by Booth 21 and catch Arnaud’s talk—we can’t wait to see you there!