This week, some of us are on an adventure! Marcus and Tobias will be exploring both the RIG and Embedded World fairs.
RIG showcases the latest innovations in robotics and intelligent systems, while Embedded World is the place to be for cutting-edge embedded technologies. Both events promise amazing demos, insightful talks, and a chance to catch up with some of our collaborators.
Planning to attend either fair? Let’s meet up! We’d love to explore the exhibitions together, chat about cool technologies, or just geek out about the innovations on display. We’ll be wandering through Embedded World on Thursday and hitting RIG on Friday. Send us an email if you’d like to connect – we’re always up for grabbing coffee!
Next May in Atlanta
After our adventures as visitors, we’re thrilled to announce that we’ll be exhibiting at the International Conference on Robotics and Automation (ICRA) 2025! Stop by our booth where we’ll be showcasing our latest demo. We’ll be, as always, available to discuss our newest products, answer your technical questions, and provide insights into how our solutions can transform your robotics applications. We’re also eager to hear your thoughts on what you’d like to see in our upcoming products. Mark your calendars and make sure to find us at Booth #131 – we may even have some presentations in the works, but nothing is confirmed yet.
Today in the shop
And, last but not least, the Brushless is now available in a Swarm configuration! Both the Lighthouse Swarm bundle and Loco Swarm bundle have been added to our shop. These new bundles feature all the same components as our standard Swarm packages, but come equipped with the Crazyflie 2.1 Brushless instead of the Crazyflie 2.1+ model.
Many of you may be familiar with the “Decentralized Swarm Demo” we have been showcasing at fairs. Today, we’re excited to introduce its upgraded version, the “Decentralized Brushless Swarm Demo”, that utilizes our latest products! Get ready for an even more impressive experience that takes swarm autonomy to the next level!
This demo utilizes the Lighthouse Positioning System for onboard positioning and peer-to-peer communication between the drones for collision avoidance. The fully autonomous takeoff/landing is achieved with the new Crazyflie 2.1 Brushless charging dock, while the bright LEDs under each Crazyflie are prototypes for a new high-power LED deck that is cooking in the Bitcraze pots at the moment.
You can appreciate the stability in this short video:
The Crazyflie 2.1 Brushless Swarm is much more robust and stable than the normal Crazyflie Swarm, as the new powerful motors allow each drone to perform quick maneuvers to avoid its neighbors. We also observed a much longer flight time than we had with Crazyflie 2.1+ – a full swarm of 9 copters could stay up for around 9 minutes before running out of batteries.
Using our own products—especially those in early access—is a crucial part of development. It allows us to encounter real-world issues that our users might face. In this case, we discovered that Lighthouse decks could be damaged during charging if pins protrude from the bottom of the Brushless. This can cause wiring issues with the dock.
If you’re using these components, please ensure that no pins are sticking out beneath the Brushless, or cover the pins at the bottom. We used the battery holder deck to avoid further issues.
The release of the Crazyflie 2.1 Brushless charging dock makes it possible for everyone to recreate demos like this, so make sure to check it out in our store. The source code of the demo can be found on GitHub in the crazyflie-firmware-experimental repository, under the arena-demo branch.
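If you are curious about the peer-to-peer part specifically, here is a minimal sketch of the firmware building block the collision avoidance relies on, based on the public app-layer P2P API. The payload layout below is made up for illustration; the demo's actual protocol lives in the repository above.

```c
#include <stdint.h>
#include <string.h>

#include "app.h"
#include "FreeRTOS.h"
#include "task.h"

#include "radiolink.h"
#include "configblock.h"

// Hypothetical payload: this drone's id and its current position estimate.
typedef struct {
  uint8_t id;
  float x;
  float y;
  float z;
} __attribute__((packed)) positionBroadcast_t;

static void p2pReceivedHandler(P2PPacket *p) {
  positionBroadcast_t other;
  memcpy(&other, p->data, sizeof(other));
  // Feed the neighbor's position into the collision-avoidance logic here.
}

void appMain() {
  p2pRegisterCB(p2pReceivedHandler);

  // Use the last byte of the radio address as a simple drone id.
  uint8_t myId = (uint8_t)(configblockGetRadioAddress() & 0xff);

  while (1) {
    // Fill in x/y/z from the state estimate before broadcasting.
    positionBroadcast_t self = { .id = myId, .x = 0.0f, .y = 0.0f, .z = 0.0f };

    P2PPacket packet;
    packet.port = 0x00;
    packet.size = sizeof(self);
    memcpy(packet.data, &self, sizeof(self));
    radiolinkSendP2PPacketBroadcast(&packet);

    vTaskDelay(M2T(100)); // broadcast at roughly 10 Hz
  }
}
```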
We’re happy to announce that release 2025.02 is now available. This update includes fixes and improvements for the Crazyflie 2.1 Brushless, along with stability enhancements for the AI-deck.
As written in my previous blog post, Marcus and I visited FOSDEM 25 at the beginning of February 2025 in Brussels, Belgium. This year we helped organize the first FOSDEM Robotics and Simulation devroom!
Attending FOSDEM
I have been attending FOSDEM every year since 2015 (except 2020 and 2021 unfortunately …) and while it continues to grow year after year, it is still an awesome experience. It is a great place to talk to the people that make the software we use every day and to get new ideas.
This year we had very good talks at different booths, most notably at KiCad and Zephyr, since they are key to the work we do at Bitcraze. We could also attend a couple of really nice talks, even though all the devrooms were full most of the time. And as usual we saw plenty of other booths and talks that sparked enough ideas to last a couple of years :).
Robotics and Simulation devroom
Speaking of full devrooms: we helped organize the Robotics and Simulation devroom this year. This is the first time there has been a robotics devroom at FOSDEM.
The room was a success: it was full all along and we had very nice talks. It was great to have a meeting point for robotics at FOSDEM and we will definitely do it again next year!
If you missed it, most of the talk videos are now available on the FOSDEM website, on the Robotics and Simulation devroom page.
While planning for the Crazyflie 2.1 Brushless release we also decided to make our charging dock available to our users. We wanted our users to be able to make the same kind of demos we were making in our lab and showing off at fairs. To make this happen our 3D printer has been working around the clock the last couple of weeks, churning out as many charging docks as possible. And now we’re finally ready to put some in stock 🎉 So make sure to check out the Charging dock in the E-store if you want to keep your Crazyflie 2.1 Brushless ready to fly at all times!
The charging dock is the same version we use in our flight lab, you might have spotted it in previous videos (like this one from last week). It’s also the dock we will be using for our swarming demos at fairs (like this one) in the future. Compared to the Qi deck, this solution saves a lot of weight and keeps the possibility of mounting downward-facing decks (like the Flow deck).
Although the main usage is for swarming (with autonomous takeoff/landing), you can also use it as just a charging dock, placing your Crazyflie on it each time you’ve done some flying in the lab. This ensures that your Crazyflie is prepared for the next round of experiments.
If you’re interested in seeing a bit of history, have a look at some of our older blog posts about the charger: from the first prototypes, past a fancy version with LEDs and WiFi, to the sleeker version we have today.
It’s hard to believe it’s already been almost a month since the Crazyflie 2.1 Brushless was released. We know some of you have already had the chance to take it for a spin, and we’re really excited to hear what you think.
Here at the office, we have started using them a lot – to discover gaps in the documentation, to test our new features, or simply to make nice trajectories during a Fun Friday as shown here:
We’re constantly amazed by it and the new capabilities it brings… But, interestingly, we haven’t received many support questions so far, which has us wondering: did we accidentally make it too good? Jokes aside, we’d love to get your thoughts! Whether you have feedback, questions, or just want to share your experience, we’re all ears.
We have a quick form for you here to fill out – it takes a couple of minutes and would help us a lot:
Marcus and I are going to visit FOSDEM 2025 at the end of the week. This is a great open-source conference that I visit every year, but this year there is a twist: I am part of the organisation of the Robotics and Simulation devroom! I am going to give the welcome talk there:
FOSDEM is a conference with many tracks: the main track and devrooms. Devrooms are like mini-conferences: they are handled by a committee that produces a call for participation and handles the schedule for the room. FOSDEM allocates a time slot, a physical room, and video recording for the devroom so that all talks are broadcast in real time and recorded.
Since my first visit to FOSDEM in 2015, we have been thinking about the lack of a dedicated devroom for robotics: a lot of robotics, at least in research, is open source. This is in part thanks to ROS, which allows for easily sharing modules and algorithms between projects, but it also applies to things like flight stacks that are often open-source. So we took it upon ourselves to organize what we wanted, a robotics-dedicated devroom.
We started last year, at FOSDEM 2024, by organizing a robotics Birds of a Feather session with Kimberly. These are impromptu meetups that can be organized by booking a time slot on the spot in one of a couple of dedicated rooms. There, we had some really nice discussions with fellow robotics enthusiasts and figured out that there was indeed quite some interest in robotics at FOSDEM, and that there were enough interested parties to organize a devroom.
If you’re interested in open source and/or robotics and you can be in Brussels, Belgium, on the weekend of the 1st and 2nd of February 2025, please join us! The Robotics and Simulation devroom is on Sunday afternoon. I will also be monitoring our Mastodon channel more carefully, so do not hesitate to poke me if you want to meet either me or Marcus, as we will be at the conference both days.
Hello there! I am Aris, a recent graduate from the Mechanical Engineering and Aeronautics Department at the University of Patras, Greece.
For the last couple of years, as a member of the Robotics Group at the University of Patras, I have been working with Bitcraze products, including the Crazyflie 2.1 and the Lighthouse Positioning System, exploring swarming scenarios. In my diploma thesis I investigated the autonomous transportation of an object by a swarm of two rope-tethered quadcopters, where the introduced path planning algorithm enables the swarm to approach, grasp, lift and transport the load.
The swarm grasping the load; simulated trajectories.
I’m excited to join Bitcraze as part of the team, where I can further develop my skills and gain valuable work experience. My principal objective is to improve the Lighthouse Positioning System and contribute to the development of a new solution for tracking in larger volumes. During this time, I look forward to deepening my knowledge of Bitcraze’s products and learning how they apply to real-world applications.
After a busy fall of testing and fine-tuning, we’re thrilled to announce that the Brushless is now available! Our team has put in a lot of effort to ensure it meets our high standards, and we can’t wait for you to experience it.
If you’re curious to see it in action, we’ve featured the Brushless in our recent Christmas video, where it showcases its capabilities by navigating through Christmas obstacles with precision.
For those interested in its application in research, our latest blog post demonstrates how the Brushless can be used in academic settings. It’s exciting to see the potential it holds for various fields!
If you need anything to keep your Brushless flying, all spare parts are already stocked in our store. Additionally, many of our bundles now offer Brushless versions, providing more options to suit your needs.
We’re eager to hear your thoughts and feedback as you explore the capabilities of our latest drone. Your insights are invaluable to us and help drive our continuous improvement.
We look forward to seeing what you’ll achieve with the Brushless!
Robotics and Simulation at FOSDEM 25
Arnaud will be at FOSDEM on the 1st and 2nd of February 2025 in Brussels, Belgium. He’s actually hosting the Robotics and Simulation devroom together with Kimberly! If you’re in Brussels, we’ll be happy to meet you.
Drones can perform a wide range of interesting tasks, from crop inspection to search-and-rescue. However, to make drones practically attractive they should be safe and cheap. Drones can be made safer by reducing their size and weight. This causes less damage in a collision with people or the environment. Additionally, being cheap means that the drones can take more risk – as it is less expensive to lose one – or that they can be deployed in larger numbers.
To function autonomously, such a drone should at least have some basic navigation capabilities. External position references such as GPS or UWB beacons can provide these, but such a reference is not always available. GPS is not accurate enough in indoor settings, and beacons require prior access to the area of operation and also add cost.
Without these references, navigation becomes tricky. The typical solution is to have the drone construct a map of its local environment, which it can then use to determine its position and trajectories towards important places. But on tiny drones, the on-board computational resources are often too limited to construct such a map. How, then, can these tiny drones navigate? A subquestion of this – how to follow previously traversed routes – was the topic of my MSc thesis under supervision of Kimberly McGuire and Guido de Croon at TU Delft, and my PhD studies. The solution has recently been published in Science Robotics – “Visual route following for tiny autonomous robots” (TU Delft mirror here).
Route following
In an ideal world, route following can be performed entirely by odometry: the measurement and recording of one’s own movements. If a drone would measure the distance and direction it traveled, it could just perform the same movements in reverse and end up at its starting place. In reality, however, this does not entirely work. While current-day movement sensors such as the Flow deck are certainly accurate, they are not perfect. Every time a measurement is taken, this includes a small error. And in order to traverse longer distances, multiple measurements are summed, which causes the error to grow impractically large. It is this integration of errors that stops drones from using odometry over longer distances.
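A rough way to see why (a simplified one-dimensional illustration, not taken from the paper): if every odometry step $d_k$ comes with an independent, zero-mean error $e_k$ of standard deviation $\sigma$, the position estimate after $N$ steps is

$$\hat{x}_N = \sum_{k=1}^{N} (d_k + e_k), \qquad \operatorname{std}\Big(\sum_{k=1}^{N} e_k\Big) = \sigma\sqrt{N},$$

so the uncertainty keeps growing with the distance traveled, and any systematic sensor bias grows even faster, linearly in $N$.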
The trick to traveling longer distances is to prevent this buildup of errors. To do so, we propose to let the drone perform ‘visual homing’ maneuvers. Visual homing is a control strategy that lets an agent return to a location where it has previously taken a picture, called a ‘snapshot’. In order to find its way back, the agent compares its current view of the environment to the snapshot that it took earlier. The trick here is that the difference between these two images grows smoothly with distance. Conversely, if the agent can find the direction in which this difference decreases, it can follow this direction to converge back to the snapshot’s original location.
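One schematic way to write this idea down (not the exact update rule from the paper) is as a descent on the image difference: with $D(\mathbf{p})$ the difference between the image taken at position $\mathbf{p}$ and the snapshot $I_{\mathrm{snap}}$, the homing maneuver repeatedly steps in the direction that decreases $D$,

$$\mathbf{p}_{t+1} = \mathbf{p}_t - \alpha\, \nabla_{\mathbf{p}} D\big(I(\mathbf{p}_t),\, I_{\mathrm{snap}}\big),$$

which converges to the snapshot location where the difference is (locally) smallest, independently of how much odometry drift was accumulated on the way there.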
The difference between images smoothly increases with their distance.
So, to perform long-distance route following, we now command the drone to take snapshots along the way, in addition to odometry measurements. Then, when retracing the route, the drone will routinely perform visual homing maneuvers to align itself with these snapshots. Because the error after a homing maneuver is bounded, there is now no longer a growing deviation from the intended path! This means that long-range route following is now possible without excessive drift.
Implementation
The above-mentioned article describes the strategy in more detail. Rather than repeat what is already written, I would like to focus on how the strategy was implemented, as this is probably more relevant for other Crazyflie users.
The main difference between our drone and an out-of-the-box one, is that our drone needs to carry a camera for navigation. Not just any camera, but the method under investigation requires a panoramic camera so that the drone can see in all directions. For this, we bought a Kogeto Dot 360. This is a cheap aftermarket lens for an older iPhone that provides exactly the field-of-view that we need. After a bit of dremeling and taping, it is also suitable for drones.
ARDrone 2 with panoramic camera lens.
The very first visual homing experiments were performed on an ARDrone 2. The drone already had a bottom camera, to which we fitted the lens. Using this setup, the drone could successfully navigate back to the snapshot’s location. However, the ARDrone 2 hardly qualifies as small: it is approximately 50 cm wide, weighs 400 grams and carries a Linux computer.
Eachine Trashcan with panoramic camera and Flow deck.
To prove that the navigation method would indeed work on tiny drones, the setup was downsized to a Crazyflie 2.0. While this drone could take off with the camera assembly, it would become unstable very soon as the battery level decreased. The camera was just a bit too heavy. Another attempt was made on an Eachine Trashcan, heavily modified to support the camera, a Flow deck and custom autopilot firmware. While this drone had more than enough lift, the overall reliability of the platform never became good enough to perform full flight experiments.
After discussing the above issues, I was very kindly offered a prototype of the Crazyflie Brushless to see if it would help with my experiments. And it did! The Crazyflie Brushless has more lift than the regular platform and could maintain a stable attitude and height while carrying the camera assembly, all this with a reasonable flight time. Software-wise it works pretty much the same as the regular Crazyflie, so it was a pleasure to work with. This drone became the one we used for our final experiments, and was even featured on the cover of the Science Robotics issue.
Crazyflie Brushless prototype with panoramic camera.
With the hardware finished, the next step was to implement the software. Unlike the ARDrone 2, which had a full Linux system with reasonable memory and computing power, the Crazyflie only has an STM32 microcontroller that is also tasked with flying the drone (plus an nRF SoC, but that is out of scope here). The camera board developed for this drone features an additional STM32. This microcontroller performed most of the image processing and visual homing tasks at a frame rate of a few Hertz. However, the resulting guidance also has to be followed, and this part is more relevant for other Crazyflie users.
To provide custom behavior on the Crazyflie, I used the app layer of the autopilot. The app layer allows users to create custom code for the autopilot, while keeping it mostly decoupled from the underlying firmware. The out-of-tree setup makes it easier to use a version control system for only the custom code, and also means that it is not as tied to a specific firmware version as an in-tree process.
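For readers who have not used the app layer before, an app essentially boils down to a single appMain() entry point that the firmware starts as its own task. A minimal skeleton, in the spirit of the hello-world example that ships with the firmware (the module name here is arbitrary), looks like this:

```c
#include "app.h"

#include "FreeRTOS.h"
#include "task.h"

#include "debug.h"

#define DEBUG_MODULE "ROUTEFOLLOW"

// appMain() runs as a separate FreeRTOS task, next to the regular
// stabilizer and communication tasks of the autopilot.
void appMain() {
  DEBUG_PRINT("Route-following app started\n");

  while (1) {
    // Custom navigation logic goes here.
    vTaskDelay(M2T(100)); // 10 Hz loop
  }
}
```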
The custom app performs a small number of crucial tasks. Firstly, it is responsible for communication with the camera. This communication was performed over UART, as this was already implemented in the camera software and this bus was not used for other purposes on the Crazyflie. Over this bus, the autopilot could receive visual guidance from the camera and send it basic commands, such as starting and stopping image captures. Pprzlink was used as the UART protocol, which was a leftover from the earlier ARDrone 2 and Trashcan prototypes.
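On the firmware side the camera link goes through the uart1 driver. A hedged sketch of the sending half is shown below; the baud rate and command bytes are placeholders, and the real messages are Pprzlink frames rather than raw bytes:

```c
#include <stdint.h>

#include "uart1.h"

#define CAMERA_BAUDRATE 115200  // placeholder, must match the camera firmware

// Placeholder one-byte commands; the real setup wraps these in Pprzlink messages.
#define CMD_START_CAPTURE 0x01
#define CMD_STOP_CAPTURE  0x02

static void cameraInit(void) {
  uart1Init(CAMERA_BAUDRATE);
}

static void cameraSendCommand(uint8_t cmd) {
  uart1SendData(1, &cmd);
}

// Receiving the guidance messages goes the other way: a dedicated task reads
// bytes from uart1 and parses them into homing vectors for the control loop.
```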
The second major task of the app is to make the drone follow the visual guidance. This consisted of two parts. Firstly, the drone should be able to follow visual homing vectors. This was achieved using the Commander Framework, part of the Stabilizer Module. Once the custom app was started, it would enter an infinite loop which ran at a rate of 10 Hertz. After takeoff, the app repeatedly calls commanderSetSetpoint to set absolute position targets, which are found by adding the latest homing vector to the current position estimate. The regular autopilot then takes care of the low-level control that steers the drone to these coordinates.
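As a rough sketch of what that looks like in app code (the homing-vector helper below is hypothetical and the fixed flight height is an assumption; the setpoint and commander calls are the standard firmware ones):

```c
#include <stdbool.h>
#include <string.h>

#include "commander.h"
#include "log.h"
#include "stabilizer_types.h"

// Hypothetical helper that returns the latest homing vector from the camera task.
bool getLatestHomingVector(float *dx, float *dy);

// Called from the app's 10 Hz loop: turn the latest homing vector into an
// absolute position setpoint that the regular position controller follows.
static void followHomingVector(void) {
  float dx;
  float dy;
  if (!getLatestHomingVector(&dx, &dy)) {
    return;
  }

  logVarId_t idX = logGetVarId("stateEstimate", "x");
  logVarId_t idY = logGetVarId("stateEstimate", "y");

  setpoint_t setpoint;
  memset(&setpoint, 0, sizeof(setpoint));

  setpoint.mode.x = modeAbs;
  setpoint.mode.y = modeAbs;
  setpoint.mode.z = modeAbs;
  setpoint.position.x = logGetFloat(idX) + dx;
  setpoint.position.y = logGetFloat(idY) + dy;
  setpoint.position.z = 1.0f; // assumed fixed flight height

  // Priority 3 is what the firmware's app-layer examples use.
  commanderSetSetpoint(&setpoint, 3);
}
```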
The core idea of our navigation strategy is that the drone can correct its position estimate after arriving at a snapshot. So secondly, the drone should be able to overwrite its position estimate with the one provided by the route-following algorithm. To simplify the integration with the existing state estimator, this update was implemented as an additional position sensor – similar to an external positioning system. Once the drone had converged to a snapshot, it would enqueue the snapshot’s remembered coordinates as a position measurement with a very small standard deviation, thereby essentially overwriting the position estimate but without needing to modify the estimator. The same trick was also used to correct heading drift.
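In firmware terms this boils down to something like the following sketch, which uses the estimator's external-position interface; the snapshot coordinates come from the route-following bookkeeping, and the standard deviation value is just an example of "very small":

```c
#include "estimator.h"

// Called once the drone has converged onto a snapshot: push the coordinates
// remembered for that snapshot into the estimator as a near-certain position
// measurement, which effectively overwrites the drifted estimate.
static void resetPositionToSnapshot(float snapX, float snapY, float snapZ) {
  positionMeasurement_t measurement = {
    .x = snapX,
    .y = snapY,
    .z = snapZ,
    .stdDev = 0.001f, // example of a "very small" standard deviation
  };
  estimatorEnqueuePosition(&measurement);
}
```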
The final task of the app was to make the drone controllable from a ground station. After some initial experiments, it was determined that fully autonomous flight during the experiments would be the easiest to implement and use. To this end, the drone needed to be able to follow more complex procedures and to communicate with a ground station.
Because the cfclient provides most of the necessary functions, it was used as the basis for the ground station. However, the experiments required extra controls that were of course not part of a generic client. While it would have been possible to modify the cfclient, an easier solution was offered by the integrated ZMQ server. This server allows external programs to communicate with the stock cfclient over a TCP connection. Among other possibilities, this allows external programs to send control values and parameters to the drone. Since the drone would be flying autonomously and a low update rate would therefore suffice, the choice was made to let the ground station set parameters provided by the custom app. To simplify usability, a simple GUI was made in Python using the CFZmq library and Tkinter. The GUI would request foreground priority such that it would be shown on top of the regular client, making it easy to use both at the same time.
Cfclient with experimental overlay (bottom right).
To perform more complex experiments, each experiment was implemented as a state machine in the custom app. Using the (High-level) Commander Framework and the navigation routines described above, the drone was able to perform entire experiments from take-off to landing.
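A trimmed-down sketch of such a state machine, using the high-level commander for takeoff and landing, is shown below; the states and the startRequested()/routeFinished() checks are hypothetical placeholders for the parameter and guidance logic of the real app:

```c
#include <stdbool.h>

#include "crtp_commander_high_level.h"

// Simplified experiment states; the real experiments had more of them.
typedef enum {
  STATE_IDLE,
  STATE_TAKING_OFF,
  STATE_FOLLOWING_ROUTE,
  STATE_LANDING,
  STATE_DONE,
} experimentState_t;

static experimentState_t state = STATE_IDLE;
static int ticksInState = 0;

// Placeholders for the parameter- and guidance-based checks of the real app.
bool startRequested(void);
bool routeFinished(void);

// Called from the app's 10 Hz loop.
static void experimentStep(void) {
  ticksInState++;

  switch (state) {
    case STATE_IDLE:
      if (startRequested()) {
        crtpCommanderHighLevelTakeoff(1.0f, 2.0f); // to 1 m over 2 s
        state = STATE_TAKING_OFF;
        ticksInState = 0;
      }
      break;
    case STATE_TAKING_OFF:
      if (ticksInState > 25) { // give the takeoff ~2.5 s to finish
        state = STATE_FOLLOWING_ROUTE;
      }
      break;
    case STATE_FOLLOWING_ROUTE:
      // Route following (setpoints and homing) runs here, see the snippets above.
      if (routeFinished()) {
        crtpCommanderHighLevelLand(0.0f, 2.0f); // land over 2 s
        state = STATE_LANDING;
        ticksInState = 0;
      }
      break;
    case STATE_LANDING:
      if (ticksInState > 25) {
        state = STATE_DONE;
      }
      break;
    case STATE_DONE:
      break;
  }
}
```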
Using the hardware and software described above, we were able to perform the route-following experiments. The drone was commanded to fly a preprogrammed trajectory using the Flow deck, while recording odometry and snapshot images. Then, the drone was commanded to follow the same route in reverse, by traveling short sections using dead reckoning and then using visual homing to correct the incurred drift.
As shown in the article, the error with respect to the recorded route remained bounded. Therefore, we can now travel long routes without having to worry about drift, even under strict hardware limitations. This is a great improvement to the autonomy of tiny robots.
I hope that this post has given a bit more insight into the implementation behind this study, a part that is not often highlighted but very interesting for everyone working in this field.