Finalizing the integration of the Crazyflie 2.1 Brushless into our software ecosystem and expanding its documentation were key steps in preparing for its launch. These efforts ensure compatibility, improve the user experience, and make the platform more accessible to the community. We’re looking forward to a smooth launch and to seeing how the community will utilize the new platform!
This year, we introduced updates to the Crazyflie 2.1 kit, making the 47-17 propellers the new default and including an improved battery. These upgrades enhance flight performance and endurance, culminating in the release of the Crazyflie 2.1+—an optimized iteration of our established platform.
And don’t forget the developer meetings, where we shared some more behind-the-scenes information and collected invaluable feedback from the community.
We also released a new edition of our research compilation video, showcasing some of the coolest projects from 2023 and 2024 that highlight the versatility and impact of the Crazyflie platform in research.
Team
In the past year, Bitcraze saw significant changes within the team. In February, Rik rejoined the team. Tove started at Bitcraze in April. Mandy, with whom we’ve already worked extensively over the years, joined as our production representative in Shenzhen. At the end of the year, we said goodbye to Kimberly, whose contributions will be deeply missed. Additionally, we had Björn with us for a few months, working on his master’s thesis on fault detection, and Joe continued his industrial postdoc at Bitcraze that began in December 2023. Looking ahead, Bitcraze is hiring for two new roles, a Technical Sales Lead and a Technical Success Engineer, to support our ongoing projects and customer collaborations.
Midsummer lunch with the team
Christmas-themed Bitcraze office
As we close the chapter on 2024, we’re proud of the progress we’ve made, the connections we’ve strengthened, and the milestones we’ve reached. With exciting launches, new faces on the team, and continued collaboration with our community, we’re ready to soar to even greater heights in 2025. Thank you for being part of our journey!
Hi everyone! I have a bit of news to share… I’ve decided to leave Bitcraze at the end of 2024. But not before I share with you my latest Fun Friday project, which I’ve tried my best to finish up before my Christmas holiday in December.
Frankensteining the Pololu Robot with the Crazyflie Bolt
During the ROSCon talk about the lighthouse system (see the recording here), I already showed a small example of how the lighthouse system could be used on other robots as well. Here you see a Pololu 3pi+ 2040 (the hyper edition of course), with a slimmed-down Crazyflie Bolt and a Lighthouse deck. The UART2 port on the Bolt (the pinout is the same as the Crazyflie’s) interfaces with the UART0 connection on the Pololu (pinout). The Pololu’s 3v3 is then connected to the Bolt’s vUSB, and GND to GND (obviously), so 4 wires in total. On paper, the 3v3 port does not supply enough power for the Crazyflie, but in practice it seemed to be enough as long as the Crazyflie Bolt has no motors connected. If anyone would like to build a driving-flying hybrid with this combo, though, you might need to check the specifications more closely. For now, just ignore the red low-battery LED on the Bolt, but if you see it restarting, then perhaps give the Pololu a fresh set of batteries.
Since the Pololu 3pi+ 2040 doesn’t have any wireless communication, this can be done through the Crazyflie Bolt and the Crazyradio. I’ve made an app-layer variant for the Bolt to forward state estimates and velocity commands; it did require an extra logging variable in the firmware itself, but this allows me to control the Pololu through the CFclient! Since it uses velocity commands, the mobile app is out, though; perhaps if anyone is interested in getting this rolling, let me know. Also, the screen shows the current X, Y, Z, and yaw estimates of the Bolt, transferred to the Pololu, together with the commands that I’ve given it.
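For anyone who would rather script this than use the CFclient, here is a minimal, hypothetical cflib sketch of the PC side: it logs the Bolt’s state estimates (the same values my app layer forwards over UART2) and sends velocity setpoints. The URI is a placeholder, and what the Pololu does with the setpoints depends on my app-layer branch, not standard firmware behavior.

import time
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.log import LogConfig
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M/E7E7E7E7E7'  # placeholder, use your Bolt's URI

def on_state(timestamp, data, logconf):
    # The same estimates that are forwarded to the Pololu over UART2
    print('x=%.2f y=%.2f yaw=%.1f' % (data['stateEstimate.x'],
          data['stateEstimate.y'], data['stateEstimate.yaw']))

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    log = LogConfig(name='state', period_in_ms=100)
    log.add_variable('stateEstimate.x', 'float')
    log.add_variable('stateEstimate.y', 'float')
    log.add_variable('stateEstimate.yaw', 'float')
    scf.cf.log.add_config(log)
    log.data_received_cb.add_callback(on_state)
    log.start()

    # Drive 'forward' at 0.2 m/s for 2 s, then stop. Safe on a motorless
    # Bolt; the app layer passes these setpoints on to the robot.
    for _ in range(20):
        scf.cf.commander.send_velocity_world_setpoint(0.2, 0.0, 0.0, 0.0)
        time.sleep(0.1)
    scf.cf.commander.send_stop_setpoint()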
I would have liked to connect this to a differential-drive controller to make use of the position setpoints, but unfortunately the AA batteries ran out at the office and I was unable to complete this by my last day. It would have been great to use the lighthouse positioning for it. Perhaps in the coming months I can continue with it and have my cats chase an autonomous robot around the house, who knows! If anyone is interested in playing around with this, these are the repositories/branches for both the Bolt and the Pololu:
First of all, I’ll take a long holiday in the US, first visiting New York (first time) before I hop over to Tulsa and Santa Barbara to visit family. Early 2025 I’ll be taking a long break, or a mini sabbatical of sorts, where I plan to work on some personal projects but mostly have a breather. I haven’t had a break like this in over 15 years, and given a tough 2023, I can definitely say that I’ve deserved some time off. What will happen after, I will hopefully figure out then, but for sure I will be continuing to co-lead the Aerial Robotics Interest Group at ROS and helping out in support of the Crazyswarm2 project.
I’d like to thank my colleagues at Bitcraze for an amazing 5 years here in Malmö, Sweden, and everyone that I was able to meet through them. I’ve learned a lot in terms of joint software development, code maintenance, community interaction, and, most importantly, having fun during work. I also will never forget the support I received while I was going through cancer treatment, and for that I’m very grateful. I wish you all the best and I hope the Crazyflie continues to thrive, saving more PhD projects as it did mine. Thank you.
It’s been a while since I last talked about hiring! We successfully onboarded our most recent recruit, and now it’s time to start planning for the future.
One of our challenges as a team is that we’re very heavy on engineers and developers. While that’s fantastic for building products, it means we lack expertise in other important areas. That’s why we’re now shifting our focus to bringing in talent to help fill those gaps. We’ve partnered with a recruitment agency once again to help us find the right people for the job. We’re currently hiring for two distinct roles—here’s what we’re looking for!
Technical Sales Lead
You will be responsible for developing and implementing sales strategies while exploring both new and existing markets. You’ll take the lead in driving sales and acquiring new customers, becoming the company’s go-to expert on marketing and sales tactics. Your day-to-day tasks will include supporting business development, optimizing sales processes, and proposing effective marketing strategies. This role is perfect for someone with a background in technical sales, a strong strategic mindset, and a sense of responsibility.
Technical Success Engineer
We’re looking for a Technical Success Engineer to provide our customers with technical guidance and product expertise. This role involves offering first-line support, creating documentation and tutorials, and assisting with tech-focused sales efforts. The goal is to ensure a smooth and seamless customer experience while building strong client relationships. It’s an ideal position for a “social developer”—someone with a solid technical background who also excels in communication and enjoys engaging with others.
Both positions are full-time and based at our office in Malmö, Sweden. If you’re curious about why you should join our team, I’ve already shared some of the many reasons why I love being part of Bitcraze.
If you’re interested or have any questions, please send an email to fredric.vernqvist@techtalents.se or contact us at contact@bitcraze.se.
Today, we’re excited to share research from Vrije Universiteit Amsterdam, ‘From Shadows to Light,’ which presents an innovative swarm robotics approach where nano-drones autonomously track dynamic sources indoors.
Motivation
In dynamic and unpredictable indoor environments, locating moving sources—such as heat, gas, or light—presents unique challenges. GPS-denied settings, in particular, demand innovative and efficient onboard solutions for both control and sensing. Our research demonstrates how small drones, like Crazyflies, can be organized into a coordinated swarm to autonomously locate and follow these sources indoors, relying solely on onboard sensing and communication capabilities. Without sharing individual measurements, each drone adapts its behavior in response to its own sensor readings, allowing the swarm to collectively converge on the center of a light source through modified interactions with nearby agents.
Tugay Alperen (right) and Victor Retamal (left) during ICRA 2024 poster session
Method
Our approach enables each Crazyflie to function autonomously, using onboard sensing combined with continuous inter-agent communication at a frequency of 20 Hz. This methodology is structured around three core components:
Proximal Control and Collective Motion
Each drone broadcasts its position to nearby agents, enabling the calculation of relative positions to maintain safe distances. This proximal control ensures cohesive group movement by computing virtual force vectors for velocity commands, which are sent to onboard controllers operating at 20 Hz.
Source Seeking Through Adaptive Social Proximity
Drones use custom light sensors to detect local light intensity. Instead of directly adjusting positions based on this measurement, each drone modifies its social proximity to neighbors according to the sensed intensity without broadcasting this information. This adaptation allows the swarm to collectively follow the light gradient toward the source in a decentralized manner.
Obstacle Avoidance
Equipped with time-of-flight sensors, each drone independently detects obstacles and adjusts its trajectory to maintain safety. This ensures the swarm remains intact while navigating toward the source.
By combining continuous relative positioning, virtual force-based control, individual sensing, and adaptive social behavior, our methodology provides a robust framework for efficient source seeking in GPS-denied indoor environments.
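To make the interplay of these components concrete, here is a minimal Python sketch of one control step. The names and gains are illustrative, not the parameters from the paper: a spring-like proximal-control force keeps neighbors at a desired distance, and that desired distance is modulated by the drone’s own light reading, so brighter readings pull an agent closer to its neighbors without ever broadcasting the measurement.

import numpy as np

def control_step(own_pos, neighbor_positions, light_intensity,
                 d_base=1.0, k=0.5, gain=0.2):
    # Desired inter-agent distance shrinks as the sensed light increases
    # (illustrative rule; the paper defines its own adaptation law)
    d_des = d_base * (1.0 - k * np.clip(light_intensity, 0.0, 1.0))

    force = np.zeros(2)
    for p in neighbor_positions:
        rel = p - own_pos
        dist = max(np.linalg.norm(rel), 1e-6)
        # Spring-like virtual force: attract beyond d_des, repel inside it
        force += gain * (dist - d_des) * (rel / dist)

    # The summed virtual force is used as the velocity command (20 Hz loop)
    return force

# Example: two neighbors, moderately bright light reading
v_cmd = control_step(np.array([0.0, 0.0]),
                     [np.array([1.2, 0.0]), np.array([0.0, 0.8])],
                     light_intensity=0.6)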
Experimental Setup
Crazyflie equipped with Flow Deck v2, UWB Deck, Multi-Ranger Deck, and a custom-made deck that produces an analog voltage reading from an LDR for light intensity measurements.
The system architecture that allows us to achieve autonomous flocking and source localization with a swarm of Crazyflies
Our experiments take place in a 7×4.75-meter indoor arena with remotely controlled overhead light bulbs. These bulbs, activated individually or in pairs, create a moving light gradient. We tested our flocking swarm by initially positioning it at the edge of an illuminated area. As the light source shifted, we assessed the swarm’s performance by comparing its trajectories with the known centers of the illuminated areas, without waiting for full convergence at each step. We also mapped our environment’s light intensity by moving a single Crazyflie randomly around the flight arena and recording the measurements, which were later merged into a single map to generate the light intensity heatmap below.
The brightness values around the test environment, measured for each light source when only it was active.
Results
The flock flies as an ordered swarm, successfully localizing around the source with the swarm’s centroid positioned at the source center. (The centroid appears as a point without an arrow in the video.)
Even with an obstacle present within or between the illuminated regions, the flock successfully localizes around the center, avoiding the obstacle and maintaining order and cohesion within the swarm. The Multi-Ranger deck provides distance measurements for obstacle detection.
Future Directions
As the next step, we plan to apply our highly generalizable algorithm to various source types, including gas sources, radio signals, and similar sources that provide only scalar strength measurements rather than directional cues. Additionally, we have demonstrated that our flocking and source localization algorithms work effectively in 3D. We aim to showcase a fully functional application with a 3D-localized source and a flocking swarm operating in 3D space. Finally, we are working toward achieving fully onboard relative localization, which would eliminate the need for any indoor positioning system. This advancement would allow our swarm to operate autonomously in any environment, replicating the same behavior wherever it is deployed.
The authors were with the Vrije Universiteit Amsterdam.
Please feel free to contact us with any questions or ideas: t.a.karaguzel@vu.nl
Please cite this as:
@ARTICLE{10314746,
author={Karagüzel, Tugay Alperen and Retamal, Victor and Cambier, Nicolas and Ferrante, Eliseo},
journal={IEEE Robotics and Automation Letters},
title={From Shadows to Light: A Swarm Robotics Approach With Onboard Control for Seeking Dynamic Sources in Constrained Environments},
year={2024},
volume={9},
number={1},
pages={127-134},
keywords={Robot sensing systems;Autonomous aerial vehicles;Position measurement;Vehicle dynamics;Sensors;Location awareness;Drones;Swarm robotics;aerial systems: perception and autonomy;multi-robot systems},
doi={10.1109/LRA.2023.3331897}}
It’s been two weeks since we went to ROSCon ’24 in Odense, Denmark, as an exhibitor and silver sponsor! Since it was only a 2-hour train ride for us, it made a lot of sense to attend as a company, and we are very happy we did. In this blog post we share our experiences of the event.
The Booth Build-up and Demo
We made some changes to our well-known cage, which is a must at every conference we exhibit at. Usually it would take us a good few hours just to set up the cage alone, but we have improved the corners, which made the build-up quite a lot smoother: we were done within an hour! Just in time to join the tours and birds-of-a-feather sessions with no stress!
All done before 11 am!
For ROSCon we prepared a more ROS-flavored demo that enabled full demo control from ROS, based on the swarm mapping demo shown in this tutorial and at the Robotics Developer Day (see this video). Here we hit a couple of issues that all had to do with the differences between exhibition demos and one-time talk demos (see the OpenCV Live! episode where we talked about demo-driven development), so we switched back to our usual fully decentralized autonomous swarm demo (see this blog post). Luckily, the Crazyflies could still communicate their Multi-ranger values at the same time, so that the computer could still generate the swarm-merging map while they were flying around avoiding each other.
Exhibition Booth
Tuesday and Wednesday were the actual exhibition days, so that is when we talked with most of the people. It was a bit slow in the beginning, as we were located at the end of the hall, but luckily the ROSCon passport game motivated people to go by each of the booths to get a stamp. We went a bit rogue and made our own, much bigger stamp ;) but luckily it still fit as long as we aligned it properly. We donated a STEM ranging bundle as one of the prizes; congratulations to whoever won it! Now they can try out this ROS tutorial ;)
Talking to people outside and inside the cage
We noticed that the Crazyflie Brushless got a lot of attention. Its ability to carry more than a regular Crazyflie seemed of great interest to many of the ROSCon attendees. Moreover, the prototype of the forward-facing expansion connector (a.k.a. the camera deck) was also a much-requested feature, and it has solidified our belief that the community needs something like this as well. In general, the lighthouse positioning system and the stand-alone lighthouse node were also quite well received. Luckily, we were able to point people to our accepted talk about the lighthouse positioning system on Thursday.
Lighthouse Positioning Talk
One of the reasons we were present at ROSCon 2024 was to gauge the interest of the general robotics community in the lighthouse positioning system. We have been using it for years with the Crazyflies, but we’d like to also evangelize its submillimeter, cost-effective awesomeness for any other platform. And there seems to be quite some interest in it! We gave a short presentation on Thursday afternoon during the ‘ROS Tooling & Testing’ session (we will share the recording once it becomes available).
Talk about Lighthouse Positioning – Taken by Dharini Dutia from Women in Robotics
We also sent out some polls to see what kinds of positioning systems are used and for what purpose. It was evident that many outdoor roboticists also use onboard-sensing-based state estimation like SLAM, but a significant portion of people still use indoor positioning systems as their actual positioning source and/or for ground truth. We also got some valuable feedback, such as whether it would still work alongside a lidar or Kinect, and whether it is suitable for a 12-meter robot (wow). We will take all of this into account for any upgrades to the Lighthouse deck and the stand-alone nodes. Thanks to you all for the feedback and the interest!
Side-events
We also attended a couple of events related to ROSCon 2024. Marcus and Kimberly both attended tours of the Odense Robotics, Universal Robots, and Teradyne facilities. The tour of the SDU Drone Center was particularly impressive. Moreover, we attended the Aerial Robotics Meetup, which attracted around 90-100 people at its peak, with drinks and snacks provided by the Dronecode Foundation. It was great to see such a big aerial presence at ROSCon. There was also the karaoke meetup, the ROSCon afterparty by Odense Robotics with a beer-serving robot arm, the Women in Robotics lunch… there was just too much to attend, but it all was a great success!
Check out the ROSCon 2024 event page on our website for more information about the demos and products we showed there.
We are excited to announce that we are working on several new link performance metrics for the Crazyflie that will simplify the troubleshooting of communication issues. Until now, users have had access to very limited information about communication links, relying primarily on a “link quality” statistic based on packet retries (when we have to re-send data) and an RSSI channel scan. Our nightly tests have been limited to basic bandwidth and latency testing. With this update, we aim to expose richer data that not only enables users to make more informed decisions regarding communication links but also enhances the effectiveness of our nightly testing process. In this blog post, we will explore the new metrics, the rationale behind their introduction, and how they will improve your interaction with the Crazyflie. Additionally, we will be holding a developer meeting on Wednesday November 13th to discuss these updates in more detail, and we encourage you to join us!
“Link Quality”—All or Nothing
Until now, users of the Crazyflie have had access to a single link quality metric. Implemented in the Python library, it is based on packet retries: instances when data packets need to be re-sent due to communication issues. Each retry drops the link quality by 10%, and at most 3 retries are allowed, so the score ranges from 70% to 100%, dropping to 0% when communication is completely lost. In practice this makes it an all-or-nothing indicator: users typically see close to 100% while packets are being acknowledged, then a sudden fall to 0% when the link is lost.
Client representation of link quality; no link, yes link
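As a rough sketch of the retry-based rule described above (not the library’s exact code):

def link_quality(retries, connected=True):
    # Each retry costs 10%, capped at 3 retries; a lost link reports 0%
    if not connected:
        return 0.0
    return 100.0 - 10.0 * min(retries, 3)

# In practice the score hugs the 70-100% band, then falls straight to 0%
print([link_quality(r) for r in range(4)])  # [100.0, 90.0, 80.0, 70.0]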
The current link quality metric has served as a basic indicator but provides limited insight, often making it difficult to gauge communication reliability accurately. Recognizing these limitations, we’re introducing several new link performance metrics to the Crazyflie Python library, designed to provide a far more detailed and actionable view of communication performance.
What’s Coming in the Upcoming Update
The first metric we are adding is latency. We measure the full link latency, capturing the round-trip time through the library, to the Crazyflie, and back. This latency measurement is link-independent, meaning it applies to both radio and USB connections. The latency metric exposed to users will reflect the 95th percentile—a commonly used measure for capturing typical latency under normal conditions.
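For illustration, here is a minimal nearest-rank sketch of that percentile over a window of round-trip samples; the library’s actual implementation may differ:

def p95_latency(rtt_samples_ms):
    # 95th percentile: the value below which 95% of round trips fall,
    # so a single outlier does not dominate the reported latency
    s = sorted(rtt_samples_ms)
    return s[max(0, int(0.95 * len(s)) - 1)]

print(p95_latency([4.1, 4.3, 4.2, 5.0, 4.4, 30.2, 4.6, 4.5, 4.2, 4.8]))  # 5.0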
Next are several metrics that (currently) only support the radio link. For these, we distinguish between uplink (from the radio to the Crazyflie) and downlink (from the Crazyflie to the radio).
The first is packet rate, which simply measures the number of packets sent and received per second.
More interestingly, we are introducing a link congestion metric. Whenever there is no data to send, both the radio and the Crazyflie send “null” packets. By calculating the ratio of null packets to the total packets sent or received, we can estimate congestion. This is particularly useful for users who rely heavily on logging parameters or, for example, stream mocap positioning data to the Crazyflie.
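The congestion estimate itself is just a ratio; a minimal sketch with illustrative names:

def link_utilization(null_packets, total_packets):
    # Fraction of packets carrying real data: ~0.0 is an idle link,
    # values near 1.0 mean the link is congested
    if total_packets == 0:
        return 0.0
    return 1.0 - null_packets / total_packets

print(link_utilization(null_packets=300, total_packets=1000))  # 0.7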
The Received Signal Strength Indicator (RSSI) measures the quality of signal reception. Unlike our current “link quality” metric, we hope that a poor RSSI will serve as an early warning signal for potential communication loss. While RSSI tracking has been possible before with the channel scan example, this update will monitor RSSI in the library by default, and expose it to the user. The nRF firmware will also be updated to report RSSI by default. Currently, we only receive uplink RSSI, that is, RSSI measured on the Crazyflie side.
Work-in-progress client representation of the new link performance metrics
We’ve already found these new metrics invaluable at Bitcraze. While we have, of course, measured various parameters throughout development, it was easy to lose track of the precise status of the communication stack. In the past, we relied more on general impressions of performance, but with these new metrics, we’ve gained a clearer picture. They’ve already shed light on areas like swarm latency, helping us fine-tune and understand performance far better than before.
You can follow progress on GitHub, and we invite you to try out these metrics for yourself. If there’s anything you feel is missing, or if you have feedback on what would make these tools even more helpful, we’d love to hear from you. Hit us up over on GitHub or join the developer meeting on Wednesday the 13th of November (see the join information on discussions).
We are happy to announce that release 2024.10 is now available! Special thanks to our community contributors for their valuable input and code contributions in this release!
We have some very busy weeks behind us and ahead! While working hard on releasing the new Crazyflie Brushless, we have been preparing for the upcoming ROSCon in Odense, Denmark next week (see this previous blog post), and we were also featured on the latest OpenCV Live! episode. More about both in this blog post.
OpenCV Live! Demo Driven Development
We were featured as guests on the latest OpenCV Live! episode, hosted by Phil Nelson and Satya Mallick, where we went through a bit of the history of the start of Bitcraze and all of the (crazy) demos done with the Crazyflie over the last decade. We covered a similar topic at our latest developer meeting, but for this episode we put the focus more on vision-based demos, since OpenCV has definitely been used at Bitcraze in the past for various reasons! Just type ‘OpenCV’ in the top-right search bar to check out any of the blog posts we have written.
During the OpenCV Live! episode of the 10th of October, Arnaud and Kimberly told the backstories of these demos, ranging from a manual flight fail where Arnaud flew the Crazyflie 1.0 into Marcus’ hair, to using OpenCV and ArUco markers for positioning, to flying a swarm in your kitchen. It was really fun to do, and one lucky listener managed to answer the two questions that host Phil asked at the end, namely “Where does the name Crazyflie come from?” and “Why is the last part (‘-flie’) spelled this way?”, and won a STEM ranging bundle. If you’d like to know the answers, go and watch the latest OpenCV Live! episode ;) Enjoy!
ROSCon – What to expect?
So next week we will be present as a Silver Sponsor at ROSCon Odense, namely on Monday the 21st and Wednesday the 23rd of October. The Bitcraze booth will be at number 21, which should be near the coffee-break area! We will bring our old trusty cage, with some upgrades, and a nice ROS demo similar to the one explained in this Crazyflie ROS tutorial we wrote a while ago, but the swarming variant of it. We also hope to show a Brushless Crazyflie prototype and a new camera deck prototype, along with anything else we can find lying around at our office :D.
Moreover, Arnaud will be giving a presentation on the lighthouse positioning system, on Wednesday the 23rd of October at 14:40 (2:40 pm), called ‘The Lighthouse project: from Virtual Reality to Onboard Positioning for Robotics’. The lighthouse positioning system is also what we will demo at our booth, so if you’d like to see it for yourself, or perhaps (during downtime) hack around together with us, you are more than welcome! Check out the Bitcraze ROSCon event page for more details about our demo and the hardware we will show.
It’s now become a tradition to create a video compilation showcasing the most visually stunning research projects that feature the Crazyflie. Since our last update, so many incredible things have happened that we felt it was high time to share a fresh collection.
As always, the toughest part of creating these videos is selecting which projects to highlight. There are so many fantastic Crazyflie videos out there that if we included them all, the final compilation would last for hours! If you’re interested, you can find a more extensive list of our products used in research here.
The video covers 2023 and 2024 so far. We were once again amazed by the incredible things the community has accomplished with the Crazyflie. The selection shows the broad range of research subjects the Crazyflie can be a part of. It has been used for mapping and in swarms – even in heterogeneous swarms! With its small size, it has also been picked for human-robot interaction projects (including our very own Joseph La Delfa showcasing his work). And it’s even been turned into a hopping quadcopter!
Here is a list of all the research that has been included in the video:
Energy efficient perching and takeoff of a miniature rotorcraft – Yi-Hsuan Hsiao, Songnan Bai, Yongsen Zhou, Huaiyuan Jia, Runze Ding, Yufeng Chen, Zuankai Wang, Pakpong Chirarattananon (City University of Hong Kong, Massachusetts Institute of Technology, The Hong Kong Polytechnic University)
But enough talking, the best way to show you everything is to actually watch the video:
A huge thank you to all the researchers we reached out to and who agreed to showcase their work! We’re especially grateful for the incredible footage you shared with us—some of it was new to us, and it truly adds to the richness of the compilation. Your contributions help highlight the fantastic innovations happening within the Crazyflie community. Let’s hope the next compilation also shows projects with the Brushless!
You might remember that at the beginning of this summer, we were invited to do a skill-learning session with the Crazyflie at the Robotics Developer Day 2024 (see this blog post) organized by The Construct. We showed the Crazyflie flying with the multi-ranger deck, capable of mapping the room in both simulation and the real world. Moreover, we demonstrated this with both manual control and autonomous wall-following. Since then, we wanted to make some improvements to the simulation. We now present an updated tutorial on how to do all of this yourself on your own machine.
This tutorial will focus on using the multi-ranger ROS 2 nodes for both mapping and wall-following in simulation first, before trying it out on the real thing. You will be able to tune settings to your specific environment in simulation first and then use exactly the same nodes in the real world. That is one of the main strengths of ROS, providing you with that flexibility.
We have made a video of what to expect from the tutorial; use this blog post for the more detailed instructions.
Watch this video first and then again with the instructions below
What do you need first?
You’ll need to set up some things on the PC first and acquire hardware to follow this tutorial in full:
Gazebo Harmonic – install via these instructions. This is not the recommended Gazebo version for ROS 2 Humble, but we will install the specific ROS bridge for it later. Just make sure that you don’t have Gazebo Classic installed on your machine.
Hardware
You’ll need at least the components of the STEM ranging bundle
If your computer or positioning-system setup is different, that is okay, as the demos should be simple enough to work; just be prepared for some warning/error handling that this tutorial might not have covered.
Time to complete:
This is an approximation of how much time you need to complete this tutorial, depending on your skill level; if you already have experience with both ROS 2/Gazebo and the Crazyflie, it should take about 1 hour.
If this is your first time with the Crazyflie, it would probably be a good idea to go through the getting started tutorial and connect to it with the CFclient, with the Flow deck and Multi-ranger deck attached, as a sanity check that everything is working before jumping into ROS 2 and Gazebo.
Go to the ros2_ws workspace and build the packages
cd ~/crazyflie_mapping_demo/ros2_ws/
source /opt/ros/humble/setup.bash
colcon build --cmake-args -DBUILD_TESTING=ON
Building will take a few minutes. Crazyswarm2 in particular will show a lot of warnings and stderr output, but unless a package build has ‘failed’, just ignore it for now, until we have proposed a fix to that repository.
If all the packages build and none failed, please continue to the next step!
2. Simple mapping simulation
This section will explain how to create a simple 2D map of your environment using the multi-ranger. The ROS 2 package designed for this is specifically made for the multi-ranger, but it should be compatible with NAV2 if you’d like. However, for now, we’ll focus on a simple version without any localization inferred from the map.
Open up a terminal, which needs to be sourced for both the Gazebo model and the newly built ROS 2 packages:
If you get a ‘No such file or directory’ error on the model, try entering the full path in the GZ_SIM_RESOURCE_PATH export.
Gazebo will start with the Crazyflie in the center. You can get a close-up of the Crazyflie by right-clicking it in the Entity tree and pressing ‘Move to’. You can also choose to follow it, but the camera tracking feature of Gazebo needs some tuning to track something as small as the Crazyflie. Additionally, you will see RVIZ starting with the map view and transforms preconfigured.
Open up another terminal, source the installed ROS 2 distro and open up the ROS 2 teleop keyboard node:
source /opt/ros/humble/setup.bash
ros2 run teleop_twist_keyboard teleop_twist_keyboard
Have the Crazyflie take off with ‘t’ on your keyboard, and rotate it around with the teleop instructions. In RVIZ you should see the map being created and the transform of the Crazyflie moving. You should see something like the picture below, as in this part of the video.
Screenshot of the Crazyflie in Gazebo generating a map with Teleop (video)
3. Simple mapping real world
Now that you got the gist of it, let’s move to the real Crazyflie!
First, if your Crazyflie has a different URI to connect to, change the config file ‘crazyflie_real_crazyswarm2.yaml’ in the crazyflie_ros2_multiranger repository. This is the file that Crazyswarm2 uses to know which Crazyflie to connect to.
Open up the config file in gedit or your favorite IDE, like Visual Studio Code:
and change the URI on this line to the URI of your Crazyflie, if necessary. Note that you need to rebuild ros2_ws again for this to take effect.
Now source the terminal with the installed ROS 2 packages and the Gazebo model, and launch the simple-mapper launch file for the real-world Crazyflie.
Now open up another terminal, source ROS 2 and open up teleop:
source /opt/ros/humble/setup.bash
ros2 run teleop_twist_keyboard teleop_twist_keyboard
Same thing, have the Crazyflie take off with ‘t’, and control it with the instructions.
You should be able to see this on your screen, which you can also check with this part of the video.
Screenshot of the real Crazyflie mapping while being controlled with ROS 2 teleop (video)
Make the Crazyflie land again with ‘b’, and now you can close the ROS 2 node in the launch terminal with ctrl + c.
4. Wall following simulation
Previously, you needed to control the Crazyflie yourself to create the map, but what if you could let the Crazyflie do it on its own? The `crazyflie_ros2_multiranger` package includes a `crazyflie_ros2_multiranger_wall_following` node that uses laser ranges from the multi-ranger to perform autonomous wall-following. Then, you can just sit back and relax while the map is created for you!
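The underlying idea is a simple reactive rule on the laser ranges. Here is a minimal, hypothetical sketch of such a rule; the actual node uses a more elaborate state machine and tuned thresholds:

def wall_follow_cmd(front_m, left_m, wall_dist=0.5, v=0.3, k=1.0):
    # Wall ahead: stop and yaw right (rad/s) until the path is clear
    if front_m < wall_dist:
        return 0.0, 0.0, -0.8
    # Otherwise cruise forward and drift sideways to keep the wall
    # on the left-hand side at roughly wall_dist meters
    vy = max(min(k * (left_m - wall_dist), 0.2), -0.2)
    return v, vy, 0.0  # (vx, vy, yaw_rate)

print(wall_follow_cmd(front_m=2.0, left_m=0.6))  # cruising along a left wall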
Let’s first try it in simulation, so open up a terminal and source it if you haven’t already (see the simple mapping simulation section). Then launch the wall-follower ROS 2 launch file:
Takeoff and wall following are fully automatic. The simulated Crazyflie in Gazebo will fly forward, stop when it sees a wall with its forward range sensor, and follow the wall on its left-hand side.
You’ll see in RVIZ when the full map is created, like below and in this part of the tutorial video.
Screenshot of the simulated Crazyflie in Gazebo mapping while autonomously wall following (video)
You can stop the simulated Crazyflie with the following service call in another terminal, sourced with ROS 2 Humble.
ros2 service call /crazyflie/stop_wall_following std_srvs/srv/Trigger
The simulated Crazyflie will stop wall following and land. You can also just close the simulation, since nothing bad can happen there.
5. Wall following real world
Now that we have demonstrated that the wall-following works in simulation, we feel confident enough to try it in the real world this time! Make sure you have a fully charged battery, place the Crazyflie on the floor facing the direction you’d like the positive x-axis to be (which is also where it will fly first), and turn it on.
Make sure that you are flying in a room with clearly defined walls and corners, or build something out of cardboard, such as a mini maze; the current algorithm is optimized to fly in a squarish room.
Source the ROS 2 workspace like previously and start up the wall-follower launch file for the real Crazyflie.
Like the simulated Crazyflie, the real Crazyflie will take off and do wall following automatically, so it is important that it starts out flying towards a wall. It should look like this screenshot, or you can check it with this part of the video.
The real Crazyflie wall following autonomously while mapping the room (video).
Be careful not to accidentally run this script with the Crazyflie sitting on your desk!
If you’d like the Crazyflie to stop, don’t stop the ROS 2 nodes with ctrl-c, since it will continue flying until it crashes. Unfortunately, it’s not like the simulation, where you can just close the environment and nothing will happen. Instead, use the ROS 2 service made for this in a different terminal:
ros2 service call /crazyflie_real/stop_wall_following std_srvs/srv/Trigger
Similarly, the real Crazyflie will stop wall following and land. Now you can close the ROS 2 terminals and turn off the Crazyflie.
Next steps?
We don’t have any more demos to show, but we can give you a list of suggestions for what you could try next! You could, for instance, have multiple Crazyflies mapping together, as in the video shown here:
This uses the mapMergeForMultiRobotMapping-ROS2 external project, which is combined with Crazyswarm2 with this launch file gist. Just keep in mind that, currently, it would be better to use a global positioning system here, such as the Lighthouse positioning system used in the video. Also, if you’d like to try this out in simulation, you’ll need to ensure different namespaces for the Crazyflies, which the current simulation setup may not fully support.
Another idea is to connect the NAV2 stack instead of the simple mapper. There are a couple of instructions in the Crazyswarm2 ROS 2 tutorials that you can use as a reference. Check out the video below.
Moreover, if you are having difficulties setting up your computer, remember that the skill-learning session we conducted for Robotics Developer Day was done entirely in a ROSject provided by The Construct, which also allows a direct connection with the Crazyflie. The only requirement is that you can run Crazyswarm2 on your local machine, but that should be feasible. See the video of the original Robotics Developer Day skill-learning session here:
The last thing to know is that the ROS 2 nodes in this tutorial run ‘offboard’, so not on the Crazyflies themselves. However, do check out the Micro-ROS examples for the Crazyflie by eProsima whenever you have the time and would like to challenge yourself with embedded development.
That’s it, folks! If you are running into any issues with this tutorial or want to bounce some cool ideas to try yourself, start a discussion thread on https://discussions.bitcraze.io/.