
As 2024 comes to an end, it’s the perfect time to reflect on what we’ve accomplished over the past year. A major highlight has been our work on the Crazyflie 2.1 Brushless. We’re thrilled that it will be available early in the new year! While much of our effort focused on refining and preparing the platform as a whole, we also introduced some standout features, like support for contact charging on a charging pad, specially optimized motors, and propeller guards to enhance safety for both users and the drone.

Finalizing the integration of the Crazyflie 2.1 Brushless into our software ecosystem and expanding its documentation were key steps in preparing for its launch. These efforts ensure compatibility, improve the user experience, and make the platform more accessible to the community. We’re looking forward to a smooth launch and to seeing how the community will utilize the new platform!

This year, we introduced updates to the Crazyflie 2.1 kit, making the 47-17 propellers the new default and including an improved battery. These upgrades enhance flight performance and endurance, culminating in the release of the Crazyflie 2.1+—an optimized iteration of our established platform.

The Crazyflie 2.1 Brushless featured on the cover of Science Robotics vol. 9, no. 92

Community

In 2024, Bitcraze had an action-packed year, engaging with the robotics community through numerous conferences, workshops, and live events.

In May, we attended ICRA 2024 in Yokohama. We collected several research posters that are now proudly displayed at the office. Kimberly presented at the Robotics Developer Day, where she won the Best Speaker Award for her impressive live hardware demos with ROS2. We co-organized the ‘Aerial Swarm Tools and Applications’ workshop at RSS 2024 in Delft. Arnaud and Kimberly shared insights on demo-driven development on an episode of OpenCV Live!. Additionally, we had a booth at ROSCon ’24 in Odense, connecting with the vibrant ROS community and showcasing our latest developments.

And don’t forget the developer meetings, where we shared some more behind the scenes information and collected invaluable feedback from the community.

We also released a new edition of our research compilation video, showcasing some of the coolest projects from 2023 and 2024 that highlight the versatility and impact of the Crazyflie platform in research.

Team

In the past year, Bitcraze saw significant changes within the team. In February, Rik rejoined the team. Tove started at Bitcraze in April. Mandy, with whom we’ve already worked extensively over the years, joined as our production representative in Shenzhen. At the end of the year, we said goodbye to Kimberly, whose contributions will be deeply missed. Additionally, we had Björn with us for a few months, working on his master’s thesis on fault detection, and Joe continued his industrial postdoc at Bitcraze that began in December 2023. Looking ahead, Bitcraze is hiring for two new roles: a Technical Sales Lead and a Technical Success Engineer, to support our ongoing projects and customer collaborations.


As we close the chapter on 2024, we’re proud of the progress we’ve made, the connections we’ve strengthened, and the milestones we’ve reached. With exciting launches, new faces on the team, and continued collaboration with our community, we’re ready to soar to even greater heights in 2025. Thank you for being part of our journey!

We are excited to announce that we are working on several new link performance metrics for the Crazyflie that will simplify the troubleshooting of communication issues. Until now, users have had access to very limited information about communication links, relying primarily on a “link quality” statistic based on packet retries (when we have to re-send data) and an RSSI channel scan. Our nightly tests have been limited to basic bandwidth and latency testing. With this update, we aim to expose richer data that not only enables users to make more informed decisions regarding communication links but also enhances the effectiveness of our nightly testing process. In this blog post, we will explore the new metrics, the rationale behind their introduction, and how they will improve your interaction with the Crazyflie. Additionally, we will be holding a developer meeting on Wednesday November 13th to discuss these updates in more detail, and we encourage you to join us!

“Link Quality”—All or Nothing

Until now, users of the Crazyflie have had access to a single link quality metric. Implemented in the Python library, this metric is based on packet retries (instances when data packets need to be re-sent due to communication issues). For every retry, the link quality drops by 10%, with a maximum of 3 retries allowed. As a result, the link quality score usually ranges from 70% to 100%. In practice this makes the metric close to binary: users commonly see 100% while packets are successfully acknowledged, followed by an abrupt drop to 0% when communication is completely lost.
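
To make the behavior described above concrete, here is a minimal sketch of how a retry-based score like this can be computed. The names are illustrative and this is not the actual cflib implementation.

# Illustrative retry-based link quality score: each retry costs 10%,
# at most 3 retries are allowed, and a lost packet scores 0%.
def link_quality(retries: int, lost: bool, max_retries: int = 3) -> float:
    """Return a link quality percentage for a single packet exchange."""
    if lost:
        return 0.0
    return 100.0 - 10.0 * min(retries, max_retries)

print(link_quality(0, False))  # 100.0 -> clean link
print(link_quality(3, False))  # 70.0  -> heavy retrying, still reported as good
print(link_quality(0, True))   # 0.0   -> link lost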

Client representation of link quality; no link, yes link

The current link quality metric has served as a basic indicator but provides limited insight, often making it difficult to gauge communication reliability accurately. Recognizing these limitations, we’re introducing several new link performance metrics to the Crazyflie Python library, designed to provide a far more detailed and actionable view of communication performance.

What’s Coming in the Upcoming Update

The first metric we are adding is latency. We measure the full link latency, capturing the round-trip time through the library, to the Crazyflie, and back. This latency measurement is link-independent, meaning it applies to both radio and USB connections. The latency metric exposed to users will reflect the 95th percentile—a commonly used measure for capturing typical latency under normal conditions.
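
As a rough illustration of how a 95th-percentile latency can be tracked over a window of round-trip measurements, here is a small Python sketch. It mirrors the idea described above rather than the exact library code; the class and method names are made up.

from collections import deque

class LatencyTracker:
    """Keep the most recent round-trip times and report the 95th percentile."""
    def __init__(self, window: int = 100):
        self._samples = deque(maxlen=window)

    def add(self, rtt_seconds: float) -> None:
        self._samples.append(rtt_seconds)

    def p95(self) -> float:
        ordered = sorted(self._samples)
        return ordered[int(0.95 * (len(ordered) - 1))]

tracker = LatencyTracker()
for rtt in (0.004, 0.005, 0.006, 0.004, 0.030):  # example round trips in seconds
    tracker.add(rtt)
print(f"p95 latency: {tracker.p95() * 1000:.1f} ms")  # 6.0 ms for this window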

Next are several metrics that (currently) only support the radio link. For these, we distinguish between uplink (from the radio to the Crazyflie) and downlink (from the Crazyflie to the radio).

The first is packet rate, which simply measures the number of packets sent and received per second.

More interestingly, we are introducing a link congestion metric. Whenever there is no data to send, both the radio and the Crazyflie send “null” packets. By calculating the ratio of null packets to the total packets sent or received, we can estimate congestion. This is particularly useful for users who rely heavily on logging parameters or, for example, stream mocap positioning data to the Crazyflie.
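
A sketch of the congestion idea, assuming we count packets per second per direction; the function and counter names here are made up for illustration and are not the real cflib fields.

def congestion(total_packets: int, null_packets: int) -> float:
    """Fraction of the link occupied by real (non-null) traffic, between 0.0 and 1.0."""
    if total_packets == 0:
        return 0.0
    return 1.0 - null_packets / total_packets

# An uplink that carried 500 packets in the last second, 120 of them null:
print(f"uplink congestion: {congestion(500, 120):.0%}")  # 76%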

The Received Signal Strength Indicator (RSSI) measures the quality of signal reception. Unlike our current “link quality” metric, we hope that a poor RSSI will serve as an early warning signal for potential communication loss. While RSSI tracking has been possible before with the channel scan example, this update will monitor RSSI in the library by default, and expose it to the user. The nRF firmware will also be updated to report RSSI by default. Currently, we only receive uplink RSSI, that is, RSSI measured on the Crazyflie side.

Work in progress client representation of new link performance metrics

We’ve already found these new metrics invaluable at Bitcraze. While we have, of course, measured various parameters throughout development, it was easy to lose track of the precise status of the communication stack. In the past, we relied more on general impressions of performance, but with these new metrics, we’ve gained a clearer picture. They’ve already shed light on areas like swarm latency, helping us fine-tune and understand performance far better than before.

You can follow progress on GitHub, and we invite you to try out these metrics for yourself. If there’s anything you feel is missing, or if you have feedback on what would make these tools even more helpful, we’d love to hear from you. Hit us up over on GitHub or join the developer meeting on Wednesday the 13th of November (see the join information on discussions).

We are happy to announce that release 2024.10 is now available! Special thanks to our community contributors for their valuable input and code contributions in this release!

Release overview

crazyflie-firmware release 2024.10 GitHub

crazyflie2-nrf-firmware release 2024.10 GitHub

crazyflie2-nrf-bootloader release 2024.10 GitHub

cfclient (crazyflie-clients-python) release 2024.10 GitHub, PyPI

cflib (crazyflie-lib-python) release 0.1.27 on GitHub, PyPI

User upgrade notice

While older versions may still function, users are encouraged to upgrade:

  • Minimum supported Python version changed to 3.10
  • Supported Ubuntu versions changed to 22.04 and 24.04

Major changes

  • Enhanced out-of-tree (OOT) kbuild configuration, allowing users to perform full Kconfig configuration for app layer applications.
  • Introduced recovery functionality, allowing users or scripts to safely re-enable the system after a crash without reboot.
  • Added a timeout for auto-disarming, allowing the system to remain armed during brief landings in manual arming mode.
  • Introduced a workaround for PID derivative kick, improving the performance of the PID controller during large setpoint changes (#1337, #1403); a short sketch of the idea is shown below this list.
  • Added spiral and constant-velocity high-level commander segments (#1410).
  • Changed BLE name format to include part of the NRF MAC address, allowing users to easily differentiate between Crazyflies.
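
As a rough illustration of the derivative kick item above (a generic sketch of the concept, not the Crazyflie firmware code): the usual remedy is to differentiate the measurement instead of the error, so a sudden setpoint change no longer produces a one-sample spike in the D term.

def d_term_on_error(kd, error, prev_error, dt):
    return kd * (error - prev_error) / dt       # spikes when the setpoint jumps

def d_term_on_measurement(kd, meas, prev_meas, dt):
    return kd * (prev_meas - meas) / dt         # unaffected by setpoint steps

dt, kd = 0.002, 0.5
meas = prev_meas = 1.0
prev_error = 2.0 - prev_meas   # setpoint was 2.0
error = 5.0 - meas             # setpoint steps to 5.0 this cycle
print(d_term_on_error(kd, error, prev_error, dt))      # 750.0 -> the "kick"
print(d_term_on_measurement(kd, meas, prev_meas, dt))  # 0.0   -> no kick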

For detailed release notes, check out the individual releases on GitHub. Links can be found in the release overview above.

We have some very busy weeks behind us and ahead! While working hard on releasing the new CF Brushless, we have been preparing for the upcoming ROSCon in Odense, Denmark next week (see this previous blogpost), and we were also featured on the latest OpenCV Live episode! So more about both in this blogpost.

OpenCV Live! Demo Driven Development

We were featured as guests on the latest OpenCV Live! episode hosted by Phil Nelson and Satya Mallick, where we went through a bit of the history of the start of Bitcraze and all of the (crazy) demos done with the Crazyflie in the last decade. We covered a similar topic at our latest developer meeting, but for this episode we put the focus more on vision-based demos, since OpenCV has definitely been used at Bitcraze in the past for various purposes! Just type OpenCV in the top right search bar to check out any of the blogs we have written.

During the OpenCV Live episode on the 10th of October, Arnaud and Kimberly told the backstories of these demos, which went from a manual flight fail where Arnaud flew the Crazyflie 1.0 into Marcus’ hair, to using OpenCV and ArUco markers for positioning, to flying a swarm in your kitchen. It was really fun to do, and one lucky listener managed to answer the two questions the host Phil asked at the end, namely “Where does the name Crazyflie come from?” and “Why is the last part (‘-flie’) spelled this way?”, and won a STEM ranging bundle. If you’d like to know the answers, go and watch the latest OpenCV Live! episode ;) Enjoy!

ROSCon – What to expect?

So next week we will be present as a Silver Sponsor at ROSCon Odense, namely on Monday the 21st and Wednesday the 23rd of October. The Bitcraze booth will be number 21, so that should be near the coffee break area! We will have our old trusty cage, with some upgrades, and a nice ROS demo similar to the one explained in the Crazyflie ROS tutorial we wrote a while ago, but in its swarming variant. We also hope to show a Brushless Crazyflie prototype and a new camera deck prototype, along with anything else we can find lying around at our office :D.

Moreover, Arnaud will be giving a presentation on the lighthouse positioning system on Wednesday the 23rd of October at 14:40 (2:40 pm), called ‘The Lighthouse project: from Virtual Reality to Onboard Positioning for Robotics’. The lighthouse positioning system will also be the system we demo at our booth, so if you’d like to see it for yourself, or perhaps (during downtime) hack around together with us, you are more than welcome to do so! Check out the Bitcraze ROSCon event page for more details about our demo or the hardware we will show.

It’s now become a tradition to create a video compilation showcasing the most visually stunning research projects that feature the Crazyflie. Since our last update, so many incredible things have happened that we felt it was high time to share a fresh collection.

As always, the toughest part of creating these videos is selecting which projects to highlight. There are so many fantastic Crazyflie videos out there that if we included them all, the final compilation would last for hours! If you’re interested, you can find a more extensive list of our products used in research here.

The video covers 2023 and 2024 so far. We were once again amazed by the incredible things the community has accomplished with the Crazyflie. In the selection, you can see the broad range of research subjects the Crazyflie can be a part of. It has been used for mapping and in swarms, even heterogeneous swarms! With its small size, it has also been picked for human-robot interaction projects (including our very own Joseph La Delfa showcasing his work). And it’s even been turned into a hopping quadcopter!

Here is a list of all the research that has been included in the video:

But enough talking, the best way to show you everything is to actually watch the video:

A huge thank you to all the researchers we reached out to and who agreed to showcase their work! We’re especially grateful for the incredible footage you shared with us—some of it was new to us, and it truly adds to the richness of the compilation. Your contributions help highlight the fantastic innovations happening within the Crazyflie community. Let’s hope the next compilation also shows projects with the Brushless!

ROSCon is a developer’s conference that focuses entirely on the Robot Operating System (ROS), bringing together developers from around the globe to learn, discuss, and network. It serves as a space for ROS developers to explore the latest advancements, share their experiences, and collaborate on cutting-edge robotics projects. We attended ROSCon 2022 in Japan, and it was a fantastic experience. So when the opportunity came to participate again this year, we couldn’t pass it up! Not only is this a conference that’s been close to our hearts, this year it’s also close to the office: it’s merely a 3-hour train ride away.

The 2024 edition is full of promise already, and we’re excited to be a part of it in several ways. We already talked about how we helped out the diversity committee, contributing to efforts that promote a more inclusive and diverse community within the robotics field. Moreover, we will have a booth there. We’ll be located in Room 2, at booth 21. If you have trouble finding us, just listen closely to the sound of drones buzzing! We’ll be showcasing a live demo that’s still under construction. If you’re curious and want to know more about it, just keep an eye on our weekly blogposts to get an update once we finalize our plans.

In addition to being an exhibitor, we also have the honour of presenting a talk. Arnaud will be speaking on October 23 at 14:40 in Large Hall 4. His talk, titled “The Lighthouse Project: From Virtual Reality to Onboard Positioning for Robotics”, will dive into the Lighthouse system, as the title implies. He’ll explain how this technology, originally developed for virtual reality, is being adapted for onboard positioning in various types of robots.

We’re really looking forward to connecting with fellow developers, learning from the presentations, and sharing our own work with the community. If you’re attending ROSCon 2024, be sure to stop by Booth 21 and catch Arnaud’s talk—we can’t wait to see you there!

You might remember that at the beginning of this summer, we were invited to do a skill-learning session with the Crazyflie at the Robotics Developer Day 2024 (see this blog post) organized by The Construct. We showed the Crazyflie flying with the multi-ranger deck, capable of mapping the room in both simulation and the real world. Moreover, we demonstrated this with both manual control and autonomous wall-following. Since then, we wanted to make some improvements to the simulation. We now present an updated tutorial on how to do all of this yourself on your own machine.

This tutorial will focus on using the multi-ranger ROS 2 nodes for both mapping and wall-following in simulation first, before trying it out on the real thing. You will be able to tune settings to your specific environment in simulation first and then use exactly the same nodes in the real world. That is one of the main strengths of ROS, providing you with that flexibility.

We have made a video of what to expect from the tutorial; use this blogpost for the more detailed instructions.

Watch this video first and then again with the instructions below

What do you need first?

You’ll need to set up some things on the PC first and acquire hardware to follow this tutorial in full:

PC preparation

You’ll need to install ROS 2 and the Gazebo simulator, both maintained by Open Robotics, on an Ubuntu machine.

  • Ubuntu 22.04 on a 64-bit x86 device (no ARM)
  • ROS 2 Humble – Install it via these instructions
  • Gazebo Harmonic – Install via these instructions. This is not the recommended Gazebo version for Humble, but we will install the specific ROS bridge for it later. Just make sure that you don’t have Gazebo Classic installed on your machine.

Hardware

You’ll need at least the components of the STEM ranging bundle.

If your computer or positioning system setup is different, that is okay, as the demos should be simple enough to work, but be prepared to handle some warnings or errors that this tutorial might not cover.

Time to complete:

How much time you need to complete this tutorial depends on your skill level, but if you already have experience with both ROS 2/Gazebo and the Crazyflie it should take about 1 hour.

If this is your first time using the Crazyflie, it would probably be a good idea to go through the getting started tutorial and connect to it with the CFclient with the Flowdeck and Multi-ranger deck attached, as a sanity check that everything is working before jumping into ROS 2 and Gazebo.

The same goes for ROS 2: it would be handy to go through the ROS 2 Humble beginner tutorials before starting.

1. Installation

This section will install 4 packages: crazyflie-simulation, crazyflie_ros2_multiranger, ros_gz_crazyflie and Crazyswarm2.

Make the workspaces for both simulation and ROS. You can use a different directory for this if you prefer.

mkdir ~/crazyflie_mapping_demo
cd ~/crazyflie_mapping_demo
mkdir simulation_ws
mkdir ros2_ws
cd ros2_ws
mkdir src

Let’s clone the repositories in their right location, starting with simulation

cd ~/crazyflie_mapping_demo/simulation_ws
git clone https://github.com/bitcraze/crazyflie-simulation.git

Then navigate to the ROS2 workspace source folder and clone 3 projects:

cd ~/crazyflie_mapping_demo/ros2_ws/src
git clone https://github.com/knmcguire/crazyflie_ros2_multiranger.git
git clone https://github.com/knmcguire/ros_gz_crazyflie
git clone https://github.com/IMRCLab/crazyswarm2 --recursive

First install certain requirements as apt-get packages and pip libraries (you might want to make a Python environment for the latter):

sudo apt-get install libboost-program-options-dev libusb-1.0-0-dev python3-colcon-common-extensions
sudo apt-get install ros-humble-motion-capture-tracking ros-humble-tf-transformations
sudo apt-get install ros-humble-ros-gzharmonic ros-humble-teleop-twist-keyboard
pip3 install cflib transforms3d

Also follow the instructions to give the proper rights to the Crazyradio 2.0 in this guide, but if this is your first time working with the Crazyradio 2.0, follow this tutorial first.

Go to the ros2_ws workspace and build the packages

cd  ~/crazyflie_mapping_demo/ros2_ws/
source /opt/ros/humble/setup.bash
colcon build --cmake-args -DBUILD_TESTING=ON

Building will take a few minutes. Crazyswarm2 in particular will show a lot of warnings and stderr output, but unless the package build has actually ‘failed’, just ignore it for now until we have proposed a fix to that repository.

If all the packages build successfully and none have failed, please continue to the next step!

2. Simple mapping simulation

This section will explain how to create a simple 2D map of your environment using the multi-ranger. The ROS 2 package designed for this is specifically made for the multi-ranger, but it should be compatible with NAV2 if you’d like. However, for now, we’ll focus on a simple version without any localization inferred from the map.
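
To give a feel for what the simple mapper does conceptually, here is a heavily simplified Python sketch (not the actual crazyflie_ros2_multiranger code): with a pose that is already known from the positioning system, each of the four horizontal range measurements marks the cells along its ray as free and the cell at the hit point as occupied.

import math

RESOLUTION = 0.1                 # meters per grid cell
FREE, OCCUPIED = 0, 100
grid = {}                        # sparse occupancy map: (ix, iy) -> cell state

def to_cell(x, y):
    return (int(round(x / RESOLUTION)), int(round(y / RESOLUTION)))

def insert_ray(x, y, yaw, angle_offset, range_m, max_range=3.5):
    """Update the grid with one ranger measurement taken from pose (x, y, yaw)."""
    angle = yaw + angle_offset
    hit = range_m < max_range                    # a max-range reading means no obstacle seen
    distance = min(range_m, max_range)
    for i in range(int(distance / RESOLUTION)):  # free space along the ray
        px = x + i * RESOLUTION * math.cos(angle)
        py = y + i * RESOLUTION * math.sin(angle)
        grid.setdefault(to_cell(px, py), FREE)
    if hit:                                      # obstacle at the end of the ray
        ex = x + distance * math.cos(angle)
        ey = y + distance * math.sin(angle)
        grid[to_cell(ex, ey)] = OCCUPIED

# Four rays: front, left, back and right relative to the Crazyflie's yaw.
measurements = {0.0: 1.2, math.pi / 2: 0.8, math.pi: 3.5, -math.pi / 2: 2.0}
for offset, r in measurements.items():
    insert_ray(0.0, 0.0, 0.0, offset, r)
print(f"{len(grid)} cells updated")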

Open up a terminal and source it for both the Gazebo model and the newly built ROS 2 packages:

source ~/crazyflie_mapping_demo/ros2_ws/install/setup.bash
export GZ_SIM_RESOURCE_PATH="/home/$USER/crazyflie_mapping_demo/simulation_ws/crazyflie-simulation/simulator_files/gazebo/"

First let’s be safe and start with the simulation. Start up the ROS 2 launch file with:

ros2 launch crazyflie_ros2_multiranger_bringup simple_mapper_simulation.launch.py

If you get a ‘No such file or directory’ error on the model, try entering the full path in the GZ_SIM_RESOURCE_PATH export.

Gazebo will start with the Crazyflie in the center. You can get a close-up of the Crazyflie by right-clicking it in the Entity tree and pressing ‘Move to’. You can also choose to follow it, but the camera tracking feature of Gazebo needs some tuning to track something as small as the Crazyflie. Additionally, you will see RVIZ starting with the map view and transforms preconfigured.

Open up another terminal, source the installed ROS 2 distro and open up the ROS 2 teleop keyboard node:

source /opt/ros/humble/setup.bash
ros2 run teleop_twist_keyboard teleop_twist_keyboard

Have the Crazyflie take off with ‘t’ on your keyboard, and rotate it around with the teleop instructions. In RVIZ you should see the map being created and the transform of the Crazyflie moving. You should be able to see this picture, and in this part of the video.

Screenshot of the Crazyflie in Gazebo generating a map with Teleop (video)

3. Simple mapping real world

Now that you got the gist of it, let’s move to the real Crazyflie!

First, if your Crazyflie has a different URI to connect to, change the config file ‘crazyflie_real_crazyswarm2.yaml’ in the crazyflie_ros2_multiranger repository. This is the file that Crazyswarm2 uses to know which Crazyflie to connect to.

Open up the config file in gedit or your favorite IDE, like Visual Studio Code:

gedit ~/crazyflie_mapping_demo/ros2_ws/src/crazyflie_ros2_multiranger/crazyflie_ros2_multiranger_bringup/config/crazyflie_real_crazyswarm2.yaml

and change the URI on that line to the URI of your Crazyflie if necessary. Mind that you need to rebuild ros2_ws again for this change to take effect.

Now source the terminal with the installed ROS 2 packages and the Gazebo model, and launch the ROS launch of the simple mapper example for the real world Crazyflie.

source ~/crazyflie_mapping_demo/ros2_ws/install/setup.bash
export GZ_SIM_RESOURCE_PATH="/home/$USER/crazyflie_mapping_demo/simulation_ws/crazyflie-simulation/simulator_files/gazebo/"
ros2 launch crazyflie_ros2_multiranger_bringup simple_mapper_real.launch.py

Now open up another terminal, source ROS 2 and open up teleop:

source /opt/ros/humble/setup.bash
ros2 run teleop_twist_keyboard teleop_twist_keyboard

Same thing, have the Crazyflie take off with ‘t’, and control it with the instructions.

You should be able to see this on your screen, which you can also check with this part of the video.

Screenshot of the real Crazyflie mapping while being controlled with ROS 2 teleop (video)

Make the Crazyflie land again with ‘b’, and now you can close the ROS 2 node in the launch terminal with ctrl + c.

4. Wall following simulation

Previously, you needed to control the Crazyflie yourself to create the map, but what if you could let the Crazyflie do it on its own? The `crazyflie_ros2_multiranger` package includes a `crazyflie_ros2_multiranger_wall_following` node that uses laser ranges from the multi-ranger to perform autonomous wall-following. Then, you can just sit back and relax while the map is created for you!
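
As a very rough sketch of the left-hand wall-following idea (the real node uses a more elaborate state machine, and the gains and thresholds here are made up): fly forward until the front ranger sees a wall, then turn right and regulate the distance measured by the left ranger.

def wall_follow_step(front_range, left_range, wall_distance=0.5, front_threshold=0.5):
    """Return (forward velocity in m/s, yaw rate in rad/s) for one control step."""
    if front_range < front_threshold:
        return 0.0, -0.8                   # wall ahead: stop and turn right
    error = left_range - wall_distance     # positive means too far from the left wall
    yaw_rate = max(min(2.0 * error, 0.5), -0.5)
    return 0.3, yaw_rate                   # cruise forward while holding the wall distance

print(wall_follow_step(front_range=2.0, left_range=0.7))  # (0.3, 0.4): steer toward the left wall
print(wall_follow_step(front_range=0.3, left_range=0.5))  # (0.0, -0.8): turn away from the wall ahead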

Let’s first try it in simulation, so open up a terminal and source it if you haven’t already (see the simple mapping simulation section). Then launch the wall-follower ROS 2 launch file:

ros2 launch crazyflie_ros2_multiranger_bringup wall_follower_mapper_simulation.launch.py

Takeoff and wall following are fully automatic. The simulated Crazyflie in Gazebo will fly forward, stop when it sees a wall with its forward range sensor, and follow the wall on its left-hand side.

You’ll see in RViz2 when the full map is created, as shown below and in this part of the tutorial video.

Screenshot of the simulated Crazyflie in Gazebo mapping while autonomously wall following (video)

You can stop the simulated Crazyflie with the following service call in another terminal that is sourced with ROS 2 Humble.

ros2 service call /crazyflie/stop_wall_following std_srvs/srv/Trigger

The simulated Crazyflie will stop wall following and land. You can also just close the simulation, since nothing can happen here.

5. Wall following real world

Now that we have demonstrated that the wall-following works in simulation, we feel confident enough to try it in the real world this time! Make sure you have a fully charged battery, place the Crazyflie on the floor facing the direction you’d like the positive x-axis to be (which is also where it will fly first), and turn it on.

Make sure that you are flying in a room with clearly defined walls and corners, or build something out of cardboard such as a mini maze, but note that the current algorithm is optimized to fly in a squarish room.

Source the ROS 2 workspace as before and start up the wall-follower launch file for the real Crazyflie:

ros2 launch crazyflie_ros2_multiranger_bringup wall_follower_mapper_real.launch.py

Like the simulated one, the real Crazyflie will take off and start wall following automatically, so it is important that it flies towards a wall. It should look like this screenshot, or you can check this part of the video.

The real crazyflie wall following autonomously while mapping the room (video).

Be careful not to accidentally run this script with the Crazyflie sitting on your desk!

If you’d like the Crazyflie to stop, don’t stop the ROS 2 nodes with ctrl-c, since it will continue flying until it crashes. Unfortunately it’s not like the simulation, where you can just close the environment and nothing happens. Instead, use the ROS 2 service made for this in a different terminal:

ros2 service call /crazyflie_real/stop_wall_following std_srvs/srv/Trigger

Similarly, the real Crazyflie will stop wall following and land. Now you can close the ROS 2 terminals and turn off the Crazyflie.

Next steps?

We don’t have any more demos to show, but we can give you some suggestions for what to try next! You could, for instance, have multiple Crazyflies mapping together like in the video shown here:

This uses the mapMergeForMultiRobotMapping-ROS2 external project, which is combined with Crazyswarm2 with this launch file gist. Just keep in mind that, currently, it would be better to use a global positioning system here, such as the Lighthouse positioning system used in the video. Also, if you’d like to try this out in simulation, you’ll need to ensure different namespaces for the Crazyflies, which the current simulation setup may not fully support.

Another idea is to connect the NAV2 stack instead of the simple mapper. There are a couple of instructions in the Crazyswarm2 ROS 2 tutorials that you can use as a reference. Check out the video below.

Moreover, if you are having difficulties setting up your computer, I’d like to remind you that the skill-learning session we conducted for Robotics Developer Day was entirely done using a ROSject provided by The Construct, which also allows direct connection with the Crazyflie. The only requirement is that you can run Crazyswarm2 on your local machine, but that should be feasible. See the video of the original Robotics Developer Day skill-learning session here:

The last thing to know is that the ROS 2 nodes in this tutorial are running ‘offboard,’ so not on the Crazyflies themselves. However, do check out the Micro-ROS examples for the Crazyflie by eProsima whenever you have the time and would like to challenge yourself with embedded development.

That’s it, folks! If you are running into any issues with this tutorial or want to bounce some cool ideas to try yourself, start a discussion thread on https://discussions.bitcraze.io/.

Happy hacking!

A few weeks ago, the prestigious Robotics: Science and Systems (RSS) conference was held at Delft University of Technology. We helped with the co-organization of a half-day tutorial and workshop called “Aerial Swarm Tools and Applications”, so Kimberly (I) was there on behalf of both Bitcraze and Crazyswarm2. In this blog post, we will tell you a bit about the conference itself and the workshop (and perhaps also a tiny bit about RoboCup).

The Robotics: Science and Systems conference

The Robotics: Science and Systems conference, also known as RSS, is considered one of the most important robotics conferences to attend, alongside ICRA and IROS. It distinguishes itself by having only a single track of presented papers, which makes it possible for all attendees to listen to and learn about all the cool robotics work done in a wide range of fields. It also makes it more difficult to get a paper accepted due to the fixed number of papers they can accept, so you know that whatever gets presented is of high quality.

This year the topic was very much on large language models (LLMs) and their application in robotics, most commonly manipulators. Many researchers are exploring the ways that LLMs could be used for robotics, but that means not a lot of small and embedded systems were represented in these papers. We did find one paper where Crazyflies were presented, namely the awesome work by Darrick et al. (2024) called ‘Stein Variational Ergodic Search’ which used optimal control for path planning to achieve the best coverage.

It gave us the chance to experience many of the other works that could be found at RSS. One in particular was about the robotic design of the cute little biped from Disney Imagineering named “Design and Control of a Bipedal Robotic Character” by Grandia et al. (2024). Also very impressive was the Agile flight demo by the group of Davide Scaramuzza, and we enjoyed listening to the keynote by Dieter Fox, senior director at Nvidia, talking about ‘Where is RobotGPT?’. The banquet location was also very special, as it was located right in the old church of Delft.

You can find all the talks, demos, and papers on the website of RSS 2024

Photos of day 3 of RSS

Aerial Swarm Workshop

The main reason we joined RSS was that we were co-organizing the workshop ‘Aerial Swarm Tools and Applications’. This was done in collaboration with Wolfgang Hönig from Crazyswarm2/TU Berlin, Miguel Fernandez Cortizas and Rafel Perez Segui from Aerostack2/Polytechnic University of Madrid (UPM), and Andrea Testa, Lorenzo Pichierri, and Giuseppe Notarstefano from CrazyChoir/University of Bologna. The workshop was a bit of a hybrid as it contained both talks on various aerial swarm applications and tutorials on the different aerial swarm tools that the committee members were representatives of.

Photos of the Aerial Swarm Tools and Applications workshop

Sabine Hauert from the University of Bristol started off the workshop by talking about “Trustworthy swarms for large-scale environmental monitoring.” Gábor Vásárhelyi from Collmot Robotics and Eötvös University gave a talk/tutorial about Skybrush, showing its suitability not only for drone shows but also for research (Skybrush was used for the Big Loco Test show demo we did 1.5 years ago). The third speaker was SiQi Zhou, speaking on behalf of Angela Schöllig from TU Munich, discussing “Safe Decision-Making for Aerial Swarms – From Reliable Localization to Efficient Coordination.” Martin Saska concluded the workshop with his talk “Onboard relative localization for agile aerial swarming in the wild” about their work at the Czech TU in Prague. They also organize the Multi-robot systems summer school every year, so if you missed it this year, make sure to mark it in your calendar for next summer!

We had four tutorials in the middle of the workshop as well. Gábor also showed Skybrush in simulation after his talk for participants to try out. Additionally, we had tutorials that included real, flying Crazyflies live inside the workshop room! It was a bit of a challenge to set up due to the size of the room we were given, but with the lighthouse system it all worked out! Miguel and Rafael from Aerostack2 were first up, showing a leader-follower demo. Next up were Wolfgang and Kimberly (Crazyswarm2) who showed three Crazyflies collaboratively mapping the room, and finally, Andrea and Lorenzo from CrazyChoir demoed formation control in flight.

You can see the Crazyflie demos flying during the tutorials in the video below. The recording of each of the talks can be found on the workshop’s website: https://imrclab.github.io/workshop-aerial-swarms-rss2024/

RoboCup 2024 Eindhoven

Luckily, there was also a bit of time to visit Eindhoven for a field trip to the 2024 edition of the world championship competitions of RoboCup! This is a very large robotics competition held in several different divisions, namely Soccer (with many subdivisions), Industrial, Rescue, @Home, and Junior. Each country usually has its own national championships, and those that win there can compete in the big leagues at events like these. RoboCup was extremely fun to attend, so if any robotics enthusiasts happen to live close to one of these, go! It’s awesome.

Photos of the field trip to RoboCup

Maybe drone competitions might be one of RoboCup’s divisions in the future :)

Whenever we show the Crazyflie at our booth at various robotics conferences (like the recent ICRA Yokohama), we sometimes get comments like ‘ahh that’s cute’ or ‘that’s a fun toy!’. Those who have been working with it for their research know differently, but it seems that the general robotics crowd needs a little bit more… convincing! Disregarding its size, the Crazyflie is a great tool that enables users to do many awesome things in various areas of robotics, such as swarm robotics and autonomy, for both research and education.

We will be showing that off by giving a live tutorial and demonstration at the Robotics Developer Day 2024, which is organized by The Construct and will take place this Friday, 5th of July. We have a discount code for you to use if you want to get a ticket; scroll down for details. The code can be used until 12 am midnight (CEST) on the 2nd of July.

The Construct and Robotics Developer Day 2024

So a bit of background information: The Construct is an online platform that offers various courses and curriculums to teach robotics and ROS to their users. Along with that, they also organize all kinds of live training sessions and events like the Robotics Developer Day and the ROS Awards. Unfortunately, the deadline for voting in the latter has passed, but hopefully in the future, the Crazyflie might get an award of its own!

What stands out about the platform is its implementation of web-based virtual machines, called ‘ROSJects,’ where ROS and everything needed for it is already set up from the start. Anyone who has worked with ROS(2) before knows that it can be a pain to switch between different versions of ROS and Gazebo, so this feature allows users to keep those projects separate. For the ROS Developer Day, there will be about five live skill-learning sessions where a ROSject is already preconfigured and set up for the attendees, enabling them to try the tutorial simultaneously as the teacher or speaker explains the framework.

Skill learning session with the Crazyflie

One of the earlier mentioned skill learning sessions is, of course, one with the Crazyflie! The title is “ROS 2 with a Tiny Quadcopter,” and it is currently planned to be the first skill learning session of the event, scheduled at 15:15 (3:15 pm) CEST. The talk will emphasize the use of simulation in the development process with aerial robotics and iterating between the real platform and the simulated one. We will demonstrate this with a Crazyflie 2.1 equipped with a Lighthouse deck and a Multi-ranger deck. Moreover, it will also use a Qi-charging deck on a charging platform while it patiently waits for its turn :D

What we will be showing is a simple implementation of a mapping algorithm made specifically for the Crazyflie’s Multiranger deck, which we have demonstrated before at ROSCon Kyoto and in the Crazyswarm2 tutorials. What is especially different this time is that we are using Gazebo for the simulation parts, which required some skill learning on our side as we have been used to Webots over the last couple of years (see our tutorial for that). You can find the files for the simulation part in this repository, but we do advise you to follow the session first.

You can, if you want, follow along with the tutorial using a Crazyflie yourself. If you have a Crazyflie, Crazyradio, and a positioning deck (preferably Lighthouse positioning, but a Flowdeck would work as well), you can try out the real-platform part of this tutorial. You will need to install Crazyswarm2 on a separate Ubuntu machine and add a robot in your ROSject as preparation. However, this is entirely optional, and it might distract you from the cool demos we are planning to show, so perhaps you can try this as a recap after the actual skill learning session ;).

Here is a teaser of what the final stage of the tutorial will look like:

Win a lighthouse explorer bundle and a Hands-On Pass discount

We are also sponsors of the event and have agreed with The Construct to award one of the participants a Crazyflie if they win any contest. Specifically, we will be awarding a Lighthouse Explorer bundle, with a Qi deck and a custom-made charging pad similar to the ones we show at fairs like ICRA this year. So make sure to participate in the contests during the day for a chance to win this or any of the other prizes they have!

It is possible to follow the event for free, but if you’d like to participate with the ROSjects, you’ll need to get a hands-on pass. If you haven’t yet gotten a hands-on ticket for the Robotics Developer Day, please use our 50% off discount code:

19ACC2C9

This code is valid until the 2nd of July, 12 am (midnight) Central European Time! Buy your ticket on the event’s website: https://www.theconstruct.ai/robotics-developers-day/

RSS 2024 aerial swarm workshop

On a side note, we will be at the Robotics: Science and Systems Conference in Delft from July 15th to 19th, 2024—just about two weeks from now. We won’t have a booth as we usually do, but we will be co-organizing a half-day workshop titled Aerial Swarm Tools and Applications (more details on this website).

We will be organizing this workshop together with our collaborators at Crazyswarm2, as well as the developers of CrazyChoir and Aerostack2. We’re excited to showcase demos of these frameworks with a bunch of actual Crazyflies during the workshop, if the demo gods are on our side :D. We will also have great speakers, including: SiQi Zhou (TU Munich), Martin Saska (Czech Technical University), Sabine Hauert (University of Bristol), and Gábor Vásárhelyi (Collmot/Eötvös University).

Hope to see you there!

It’s been a little over a year since we started the ROS Aerial Robotics community group together with the Drone Code Foundation, and it is still going strong (blogpost 1, blogpost 2). Since there is a nice mix of people joining the meetings from different backgrounds and drone operating systems, we have had quite a few discussions and overviews of various topics. For instance, we’ve explored courses in Aerial Robotics and other subjects in previous meetings. An important goal of the group has been to make it easier for people to get started with flying robotics, which we’ve achieved by collecting essential information in the ‘Aerial Robotic Landscape’.

Starting out in Aerial Robotics

Let’s cut to the chase: Aerial Robotics is a very challenging field to get started in. Not only do you need a comprehensive understanding of which hardware to acquire, but users also face multiple choices. These decisions include selecting the right autopilot, a simulator for testing ideas, and the sensors necessary to achieve autonomy. Unlike the well-established Turtlebot in other robotics domains, there isn’t a universally accepted and field-tested getting-started development drone in the aerial robotics world. While we at Bitcraze would love everyone to go for the Crazyflie, we recognize its limitations. For instance, it may not handle outdoor flights with GPS or carry heavy cameras effectively. Our goal, as the ‘Aerial Robotics Community group,’ is to make it easier for beginners by providing users with information about the hardware and software they truly need.

Drone Code Foundation and Bitcraze AB had a keynote speech together at ROSCon 2023 about getting started in Aerial Robotics called ‘Up, Up, and Away: Adventures in Aerial Robotics’. Please take a look at the talk here on Vimeo.

The Aerial Robotics Landscape website

The Aerial Robotics Landscape serves as a repository of information related to all things Aerial Robotics. It started out in the GitHub repository, and it grew due to the discussions held at the aerial robotics community group meetings. Additionally, contributions from both group members and external contributors have played an important role (you can explore the merged PRs).

As the pages and tables expanded, it became clear that a better representation was necessary than the mere README documentation on the GitHub repository. The group therefore experimented with MkDocs, creating a website in the ‘Read the Docs’ theme. This is a theme similar to the one used by important packages within the ROS ecosystem, such as the ROS documentation, as well as ROS 2 packages like Nav2 and Crazyswarm2.

Please take a look at the rendered website here: https://ros-aerial.github.io/aerial_robotic_landscape/

Please contribute!

The Aerial Robotics Landscape is a dynamic resource: development kits emerge while others are discontinued, new simulators rise while some remain unsupported, and autopilot and autonomy features evolve monthly. This ever-changing landscape demands constant updates and additions. We try to do this to the best of our ability, but we can’t do it alone; we need your help.

If you believe that your favorite hardware platform is missing from the landscape, or if you’ve recently developed a new planning algorithm for fixed-wing vehicles or created a YouTube course on optical-based flight, please contribute by means of a pull request to the GitHub repository. We’ve put together a guide on how to contribute to the Aerial Robotics Landscape here. Let’s make the website useful together!

If you’d like to join the ROS Aerial Robotics meetings, please take a look at our community GitHub repository for joining information. The next meeting is on the 5th of June at 4 PM UTC, as announced on ROS Discourse.