
It’s been a while since I last talked about hiring! We successfully onboarded our most recent recruit, and now it’s time to start planning for the future.

One of our challenges as a team is that we’re very heavy on engineers and developers. While that’s fantastic for building products, it means we lack expertise in other important areas. That’s why we’re now shifting our focus to bringing in talent to help fill those gaps. We’ve partnered with a recruitment agency once again to help us find the right people for the job.
We’re currently hiring for two distinct roles—here’s what we’re looking for!

Technical sales lead

You will be responsible for developing and implementing sales strategies while exploring both new and existing markets. You’ll take the lead in driving sales and acquiring new customers, becoming the company’s go-to expert on marketing and sales tactics. Your day-to-day tasks will include supporting business development, optimizing sales processes, and proposing effective marketing strategies. This role is perfect for someone with a background in technical sales with a strong strategic mindset and a sense of responsibility.

You can read more about it here.

Technical success engineer

We’re looking for a Technical Success Engineer to provide our customers with technical guidance and product expertise. This role involves offering first-line support, creating documentation and tutorials, and assisting with tech-focused sales efforts. The goal is to ensure a smooth and seamless customer experience while building strong client relationships. It’s an ideal position for a “social developer”—someone with a solid technical background who also excels in communication and enjoys engaging with others.

You can read more about it here.

Both positions are full-time and based at our office in Malmö, Sweden. If you’re curious about why you should join our team, I’ve already shared some of the many reasons why I love being part of Bitcraze.

If you’re interested or have any questions, please send an email to fredric.vernqvist@techtalents.se or contact us at contact@bitcraze.se.

Today, we’re excited to share research from Vrije Universiteit Amsterdam, ‘From Shadows to Light,’ which presents an innovative swarm robotics approach where nano-drones autonomously track dynamic sources indoors.

Motivation

In dynamic and unpredictable indoor environments, locating moving sources—such as heat, gas, or light—presents unique challenges. GPS-denied settings, in particular, demand innovative and efficient onboard solutions for both control and sensing. Our research demonstrates how small drones, like Crazyflies, can be organized into a coordinated swarm to autonomously locate and follow these sources indoors, relying solely on onboard sensing and communication capabilities. Without sharing individual measurements, each drone adapts its behavior in response to its own sensor readings, allowing the swarm to collectively converge on the center of a light source through modified interactions with nearby agents.

Tugay Alperen (right) and Victor Retamal (left) during ICRA 2024 poster session

Method

Our approach enables each Crazyflie to function autonomously, using onboard sensing combined with continuous inter-agent communication at a frequency of 20 Hz. This methodology is structured around three core components:

Proximal Control and Collective Motion

Each drone broadcasts its position to nearby agents, enabling the calculation of relative positions to maintain safe distances. This proximal control ensures cohesive group movement by computing virtual force vectors for velocity commands, which are sent to onboard controllers operating at 20 Hz.
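
As a rough illustration, here is a minimal sketch of such virtual-force proximal control (not the authors' exact controller; the gain, desired distance, and the cflib setpoint call at the end are our own assumptions):

import numpy as np

def proximal_velocity(own_pos, neighbor_positions, d_des=1.0, gain=0.5):
    """Sum spring-like virtual forces that pull toward the desired
    inter-agent distance d_des: repel when too close, attract when
    too far. The resulting vector is used as a velocity command."""
    force = np.zeros(2)
    for pos in neighbor_positions:
        diff = pos - own_pos
        dist = np.linalg.norm(diff)
        if dist > 1e-6:
            force += gain * (dist - d_des) * (diff / dist)
    return force

# Sent at 20 Hz as a velocity setpoint, e.g. with cflib:
# cf.commander.send_velocity_world_setpoint(vx, vy, 0.0, yawrate=0.0)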

Source Seeking Through Adaptive Social Proximity

Drones use custom light sensors to detect local light intensity. Instead of directly adjusting positions based on this measurement, each drone modifies its social proximity to neighbors according to the sensed intensity without broadcasting this information. This adaptation allows the swarm to collectively follow the light gradient toward the source in a decentralized manner.
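
The key point is that only the drone's own interaction distance changes; nothing is broadcast. A hedged sketch of what such an adaptation rule could look like (the linear mapping and constants below are our assumptions, not the paper's exact formulation):

def adapted_distance(light_intensity, i_max=1.0, d_min=0.5, d_max=1.5):
    """Map the locally sensed light intensity onto this drone's desired
    distance to its neighbors: the stronger the local reading, the
    closer the drone 'wants' to stay, biasing the flock up the gradient."""
    i = min(max(light_intensity / i_max, 0.0), 1.0)
    return d_max - (d_max - d_min) * i  # bright -> smaller distance

Feeding this distance into the proximal control sketch above is what couples individual sensing to the collective motion.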

Obstacle Avoidance

Equipped with time-of-flight sensors, each drone independently detects obstacles and adjusts its trajectory to maintain safety. This ensures the swarm remains intact while navigating toward the source.

By combining continuous relative positioning, virtual force-based control, individual sensing, and adaptive social behavior, our methodology provides a robust framework for efficient source seeking in GPS-denied indoor environments.

Experimental Setup

Crazyflie equipped with Flow Deck v2, UWB Deck, Multi-Ranger Deck, and a custom-made deck that produces an analog voltage reading from an LDR for light intensity measurements.

The system architecture allowing us to achieve autonomous flocking and source localization with a swarm of Crazyflies

Our experiments take place in a 7×4.75-meter indoor arena with remotely controlled overhead light bulbs. These bulbs, activated individually or in pairs, create a moving light gradient. We tested our flocking swarm by initially positioning it at the edge of an illuminated area. As the light source shifted, we assessed the swarm’s performance by comparing its trajectories with the known centers of the illuminated areas, without waiting for full convergence at each step. We also mapped our environment’s light intensity by moving a single Crazyflie randomly around the flight arena, recording the measurements, and later merging them into a single map to generate the light intensity heatmap below.

The brightness values around the test environment, measured for each light source when only it was active.
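
Merging the randomly collected samples into such a heatmap can be as simple as averaging per grid cell; a small sketch of this kind of post-processing (the grid resolution and sample format are assumptions):

import numpy as np

def intensity_heatmap(samples, size=(7.0, 4.75), res=0.25):
    """samples: iterable of (x, y, intensity) tuples recorded in flight.
    Returns the per-cell mean intensity over the flight arena."""
    nx, ny = int(size[0] / res), int(size[1] / res)
    total = np.zeros((nx, ny))
    count = np.zeros((nx, ny))
    for x, y, intensity in samples:
        i = min(int(x / res), nx - 1)
        j = min(int(y / res), ny - 1)
        total[i, j] += intensity
        count[i, j] += 1
    return np.divide(total, count, out=np.zeros_like(total), where=count > 0)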

Results

The flock flies as an ordered swarm, successfully localizing around the source with the swarm’s centroid positioned at the source center. (The centroid appears as a point without an arrow in the video.)

Even with an obstacle present within or between the illuminated regions, the flock successfully localizes around the center, avoiding the obstacle and maintaining order and cohesion within the swarm. The Multi-Ranger deck provides distance measurements for obstacle detection.

Future Directions

As the next step, we plan to apply our highly generalizable algorithm to various source types, including gas sources, radio signals, and similar sources that provide only scalar strength measurements rather than directional cues. Additionally, we have demonstrated that our flocking and source localization algorithms work effectively in 3D. We aim to showcase a fully functional application with a 3D-localized source and a flocking swarm operating in 3D space. Finally, we are working toward achieving fully onboard relative localization, which would eliminate the need for any indoor positioning system. This advancement would allow our swarm to operate autonomously in any environment, replicating the same behavior wherever it is deployed.

Links

The authors were with the Vrije Universiteit Amsterdam.

Please feel free to contact us with any questions or ideas: t.a.karaguzel@vu.nl

Please cite this as:

@ARTICLE{10314746,
  author={Karagüzel, Tugay Alperen and Retamal, Victor and Cambier, Nicolas and Ferrante, Eliseo},
  journal={IEEE Robotics and Automation Letters}, 
  title={From Shadows to Light: A Swarm Robotics Approach With Onboard Control for Seeking Dynamic Sources in Constrained Environments}, 
  year={2024},
  volume={9},
  number={1},
  pages={127-134},
  keywords={Robot sensing systems;Autonomous aerial vehicles;Position measurement;Vehicle dynamics;Sensors;Location awareness;Drones;Swarm robotics;aerial systems: perception and autonomy;multi-robot systems},
  doi={10.1109/LRA.2023.3331897}}

Two weeks ago we went to ROSCon ’24 in Odense, Denmark, as an exhibitor and silver sponsor! Since it was only a 2-hour train ride for us, it made a lot of sense to attend as a company, and we are very happy we did. In this blog post we share our experiences of the event.

The Booth Build-up and Demo

We made some changes to our well-known cage, which is a must at every conference we exhibit at. Usually it would take us a good few hours just to set up the cage alone, but we have redesigned the corners, which sped up the build considerably: we were done within an hour! Just in time to join the tours and birds-of-a-feather sessions with no stress!

All done before 11 am!

For ROSCon we prepared a more ROS-flavored demo that enabled full demo control from ROS, based on the swarm mapping demo shown in this tutorial and at the Robotics Developer Day (see this video). Here we hit a couple of issues that all had to do with the differences between exhibition demos and one-time talk demos (see the OpenCV Live! episode where we talked about demo-driven development). We switched back to our usual fully decentralized autonomous swarm demo (see this blog post). Luckily, the Crazyflies could still communicate the Multi-ranger values at the same time, so the computer could still generate the swarm-merged map while the Crazyflies were flying around avoiding each other.

Exhibition Booth

Tuesday and Wednesday were the actual exhibition days, so that is when we talked with most of the people. It was a bit slow in the beginning, as we were located at the end of the hall, but luckily the ROSCon passport game motivated people to visit each booth to get a stamp. We went a bit rogue and made our own much bigger stamp ;) but luckily it still fit, as long as we aligned it properly. We donated a STEM ranging bundle as one of the prizes; congratulations to whoever won it! Now they can try out this ROS tutorial ;)

Talking to people outside and inside the cage

We noticed that the Crazyflie Brushless got a lot of attention. The ability to carry more than a regular Crazyflie seemed of great interest to many of the ROSCon attendees. Moreover, the prototype of the forward-facing expansion connector (a.k.a. the camera deck) was also a much-requested feature, which solidified our belief that the community needs something like this as well. In general, the lighthouse positioning system and the stand-alone lighthouse node were also quite well received. Luckily, we were able to point people to our accepted talk about the lighthouse positioning system on Thursday.

Lighthouse Positioning Talk

One of the reasons we were present at ROSCon 2024 was to gauge the interest of the general robotics community in the lighthouse positioning system. We have been using it for years with the Crazyflies, but we’d also like to evangelize its submillimeter, cost-effective awesomeness for any other platform. And there seems to be quite some interest in it! We gave a short presentation on Thursday afternoon during the ‘ROS Tooling & Testing’ session (we will share the recording once it becomes available).

Talk about Lighthouse Positioning – Taken by Dharini Dutia from Women in Robotics

We also sent out some polls to see what kinds of positioning systems are used and for what purpose. It was evident that many outdoor roboticists also use onboard-sensing-based state estimation like SLAM, but a significant portion of people still use indoor positioning systems for the actual positioning and/or ground truth. We also got some valuable feedback, like whether it would still work with a lidar or Kinect, or whether it is suitable for a 12-meter robot (wow). We will take all of this into account for any new upgrades to the lighthouse deck and its stand-alone nodes. Thanks to you all for the feedback and the interest!

Side-events

We also attended a couple of events related to ROSCon 2024. Marcus and Kimberly both attended tours of the Odense Robotics, Universal Robots, and Teradyne facilities. The tour of the SDU Drone Center was particularly impressive. Moreover, we attended the Aerial Robotics Meetup, which attracted about 90-100 people at its peak, with drinks and snacks provided by the Dronecode Foundation. It was great to see such a big aerial presence at ROSCon. There was also the karaoke meetup, the ROSCon afterparty by Odense Robotics with a beer-serving robot arm, the Women in Robotics lunch… there was just too much to attend, but it was all a great success!

Check out the ROSCon 2024 event page on our website for what we showed at ROSCon 2024 and more information about the demos and products we had there.

We are excited to announce that we are working on several new link performance metrics for the Crazyflie that will simplify the troubleshooting of communication issues. Until now, users have had access to very limited information about communication links, relying primarily on a “link quality” statistic based on packet retries (when we have to re-send data) and an RSSI channel scan. Our nightly tests have been limited to basic bandwidth and latency testing. With this update, we aim to expose richer data that not only enables users to make more informed decisions regarding communication links but also enhances the effectiveness of our nightly testing process. In this blog post, we will explore the new metrics, the rationale behind their introduction, and how they will improve your interaction with the Crazyflie. Additionally, we will be holding a developer meeting on Wednesday November 13th to discuss these updates in more detail, and we encourage you to join us!

“Link Quality”—All or Nothing

Until now, users of the Crazyflie have had access to a single link quality metric. Implemented in the Python library, it is based on packet retries: instances when data packets need to be re-sent due to communication issues. For every retry the link quality drops by 10%, and at most 3 retries are allowed, so the score in principle ranges from 70% to 100%, falling to 0% when communication is completely lost. In practice this makes the metric nearly all-or-nothing: users commonly see 100% while packets are acknowledged, and a sudden drop to 0% when the link goes down.

Client representation of link quality; no link, yes link
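
For reference, this is how the existing metric is consumed from the Python library today; the `link_quality_updated` callback is existing cflib API, while the URI below is a placeholder:

import cflib.crtp
from cflib.crazyflie import Crazyflie

def on_link_quality(quality):
    # With the retry-based metric this prints 100 most of the
    # time and collapses toward 0 only when the link is failing.
    print(f'link quality: {quality:.0f}%')

cflib.crtp.init_drivers()
cf = Crazyflie()
cf.link_quality_updated.add_callback(on_link_quality)
cf.open_link('radio://0/80/2M/E7E7E7E7E7')  # placeholder URI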

The current link quality metric has served as a basic indicator but provides limited insight, often making it difficult to gauge communication reliability accurately. Recognizing these limitations, we’re introducing several new link performance metrics to the Crazyflie Python library, designed to provide a far more detailed and actionable view of communication performance.

What’s Coming in the Upcoming Update

The first metric we are adding is latency. We measure the full link latency, capturing the round-trip time through the library, to the Crazyflie, and back. This latency measurement is link-independent, meaning it applies to both radio and USB connections. The latency metric exposed to users will reflect the 95th percentile—a commonly used measure for capturing typical latency under normal conditions.
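
To make the statistic concrete, here is roughly how a 95th-percentile figure is computed from round-trip samples (a generic sketch, not the library's actual code):

import numpy as np

# Round-trip times in milliseconds, e.g. collected by timestamping
# echo packets through the link (the values here are made up):
rtt_ms = [4.1, 4.2, 4.3, 4.4, 9.8]

p95 = np.percentile(rtt_ms, 95)  # ~8.7 ms with linear interpolation
print(f'latency p95: {p95:.1f} ms')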

Next are several metrics that (currently) only support the radio link. For these, we distinguish between uplink (from the radio to the Crazyflie) and downlink (from the Crazyflie to the radio).

The first is packet rate, which simply measures the number of packets sent and received per second.

More interestingly, we are introducing a link congestion metric. Whenever there is no data to send, both the radio and the Crazyflie send “null” packets. By calculating the ratio of null packets to the total packets sent or received, we can estimate congestion. This is particularly useful for users who rely heavily on logging parameters or, for example, stream mocap positioning data to the Crazyflie.
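
In other words, the fewer null packets on the air, the busier the link. A sketch of the ratio (the exact definition used in the library may differ):

def congestion(null_packets, total_packets):
    """Fraction of the link's capacity used for real data.
    1.0 means no idle 'null' traffic at all: a congested link."""
    if total_packets == 0:
        return 0.0
    return 1.0 - null_packets / total_packets

print(congestion(null_packets=30, total_packets=100))  # 0.7 -> busy link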

The Received Signal Strength Indicator (RSSI) measures the quality of signal reception. Unlike our current “link quality” metric, we hope that a poor RSSI will serve as an early warning signal for potential communication loss. While RSSI tracking has been possible before with the channel scan example, this update will monitor RSSI in the library by default, and expose it to the user. The nRF firmware will also be updated to report RSSI by default. Currently, we only receive uplink RSSI, that is, RSSI measured on the Crazyflie side.

Work in progress client representation of new link performance metrics

We’ve already found these new metrics invaluable at Bitcraze. While we have, of course, measured various parameters throughout development, it was easy to lose track of the precise status of the communication stack. In the past, we relied more on general impressions of performance, but with these new metrics, we’ve gained a clearer picture. They’ve already shed light on areas like swarm latency, helping us fine-tune and understand performance far better than before.

You can follow progress on GitHub, and we invite you to try out these metrics for yourself. If there’s anything you feel is missing, or if you have feedback on what would make these tools even more helpful, we’d love to hear from you. Hit us up over on GitHub or join the developer meeting on Wednesday the 13th of November (see the join information on discussions).

We are happy to announce that release 2024.10 is now available! Special thanks to our community contributors for their valuable input and code contributions in this release!

Release overview

crazyflie-firmware release 2024.10 GitHub

crazyflie2-nrf-firmware release 2024.10 GitHub

crazyflie2-nrf-bootloader release 2024.10 GitHub

cfclient (crazyflie-clients-python) release 2024.10 GitHub, PyPI

cflib (crazyflie-lib-python) release 0.1.27 on GitHub, PyPI

User upgrade notice

While older versions may still function, users are encouraged to upgrade:

  • Minimum supported Python version changed to 3.10
  • Supported Ubuntu versions changed to 22.04 and 24.04

Major changes

  • Enhanced out-of-tree (OOT) kbuild configuration, allowing users to perform full Kconfig configuration for app layer applications.
  • Introduced recovery functionality, allowing users or scripts to safely re-enable the system after a crash without reboot.
  • Added a timeout for auto-disarming, allowing the system to remain armed during brief landings in manual arming mode (see the sketch after this list).
  • Introduced a workaround for PID derivative kick, improving the performance of the PID controller during large setpoint changes (#1337, #1403).
  • Added spiral and constant-velocity high-level commander segments (#1410).
  • Changed BLE name format to include part of the NRF MAC address, allowing users to easily differentiate between Crazyflies.
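
Related to the auto-disarm change above, here is a hedged sketch of manually arming from a script using the cflib arming request (the URI is a placeholder, and this assumes a platform and firmware with arming support):

import cflib.crtp
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

cflib.crtp.init_drivers()
with SyncCrazyflie('radio://0/80/2M/E7E7E7E7E7') as scf:
    scf.cf.platform.send_arming_request(True)   # arm before flight
    # ... take off, fly; with the new timeout, a brief landing
    # no longer disarms the system immediately ...
    scf.cf.platform.send_arming_request(False)  # disarm when done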

For detailed release notes, check out the individual releases on GitHub. Links can be found in the release overview above.

We have some very busy weeks behind us and ahead! While working hard on releasing the new CF Brushless, we have been preparing for the upcoming ROSCon in Odense, Denmark next week (see this previous blog post), and we were also featured on the latest OpenCV Live! episode. More about both in this blog post.

OpenCV Live! Demo Driven Development

We were featured as guests on the latest OpenCV Live! episode, hosted by Phil Nelson and Satya Mallick, where we went through a bit of the history of the start of Bitcraze and all of the (crazy) demos done with the Crazyflie in the last decade. We covered a similar topic at our latest developer meeting, but for this episode we put the focus more on vision-based demos, since OpenCV has definitely been used at Bitcraze in the past for various reasons! Just type OpenCV into the top-right search bar to check out any of the blog posts we have written.

During the OpenCV Live! episode of the 10th of October, Arnaud and Kimberly told the backstories of these demos, ranging from a manual flight fail where Arnaud flew the Crazyflie 1.0 into Marcus’ hair, to using OpenCV and ArUco markers for positioning, to flying a swarm in your kitchen. It was really fun to do, and one lucky listener managed to answer the two questions the host Phil asked at the end, namely “Where does the name Crazyflie come from?” and “Why is the last part (‘-flie’) spelled this way?”, and won a STEM ranging bundle. If you’d like to know the answers, go and watch the latest OpenCV Live! episode ;) Enjoy!

ROSCon – What to expect?

So next week we will be present as a silver sponsor at ROSCon Odense, from Monday the 21st to Wednesday the 23rd of October. The Bitcraze booth will be number 21, which should be near the coffee-break area! We will have our old trusty cage, with some upgrades, and a nice ROS demo similar to the one explained in this Crazyflie ROS tutorial we wrote a while ago, but the swarming variant of it. We also hope to show a Brushless Crazyflie prototype and a new camera deck prototype, along with anything else we can find lying around at our office :D.

Moreover, Arnaud will be giving a presentation on the lighthouse positioning system on Wednesday the 23rd of October at 14:40, called ‘The Lighthouse project: from Virtual Reality to Onboard Positioning for Robotics’. The lighthouse positioning system will also be the system we demo at our booth, so if you’d like to see it for yourself, or perhaps (during downtime) hack around together with us, you are more than welcome! Check out the Bitcraze ROSCon event page for more details about our demo and the hardware we will show.

It’s now become a tradition to create a video compilation showcasing the most visually stunning research projects that feature the Crazyflie. Since our last update, so many incredible things have happened that we felt it was high time to share a fresh collection.

As always, the toughest part of creating these videos is selecting which projects to highlight. There are so many fantastic Crazyflie videos out there that if we included them all, the final compilation would last for hours! If you’re interested, you can find a more extensive list of our products used in research here.

The video covers 2023 and 2024 so far. We were once again amazed by the incredible things the community has accomplished with the Crazyflie. The selection shows the broad range of research subjects the Crazyflie can be a part of: it has been used for mapping and in swarms, even heterogeneous swarms! With its small size, it has also been picked for human-robot interaction projects (including our very own Joseph La Delfa showcasing his work). And it’s even been turned into a hopping quadcopter!

Here is a list of all the research that has been included in the video:

But enough talking, the best way to show you everything is to actually watch the video:

A huge thank you to all the researchers we reached out to and who agreed to showcase their work! We’re especially grateful for the incredible footage you shared with us—some of it was new to us, and it truly adds to the richness of the compilation. Your contributions help highlight the fantastic innovations happening within the Crazyflie community. Let’s hope the next compilation also shows projects with the Brushless!

You might remember that at the beginning of this summer, we were invited to do a skill-learning session with the Crazyflie at the Robotics Developer Day 2024 (see this blog post) organized by The Construct. We showed the Crazyflie flying with the multi-ranger deck, capable of mapping the room in both simulation and the real world. Moreover, we demonstrated this with both manual control and autonomous wall-following. Since then, we wanted to make some improvements to the simulation. We now present an updated tutorial on how to do all of this yourself on your own machine.

This tutorial will focus on using the multi-ranger ROS 2 nodes for both mapping and wall-following in simulation first, before trying it out on the real thing. You will be able to tune settings to your specific environment in simulation first and then use exactly the same nodes in the real world. That is one of the main strengths of ROS, providing you with that flexibility.

We have made a video of what to expect from the tutorial; use this blog post for the more detailed instructions.

Watch this video first and then again with the instructions below

What do you need first?

You’ll need to set up some things on the PC first, and acquire hardware, to follow this tutorial in full:

PC preparation

You’ll need to install ROS 2 and the Gazebo simulator, both maintained by Open Robotics, on an Ubuntu machine.

Hardware

You’ll need at least the components of the STEM ranging bundle.

If your computer setup is different, that is okay, as the demos should be simple enough to work, but be prepared to handle some warnings or errors that this tutorial might not have covered.

Time to complete:

How much time you need to complete this tutorial depends on your skill level, but if you already have experience with both ROS 2/Gazebo and the Crazyflie, it should take about 1 hour.

If this is your first time with the Crazyflie, it is probably a good idea to go through the getting started tutorial and connect to it with the cfclient, with the Flow deck and Multi-ranger deck attached, as a sanity check that everything works before jumping into ROS 2 and Gazebo.

The same holds for ROS 2! It would be handy to go through the ROS 2 Humble beginner tutorials before starting.

1. Installation

This section will install 4 packages:

Make the workspaces for both the simulation and ROS 2. You can use a different directory for this:

mkdir ~/crazyflie_mapping_demo
cd ~/crazyflie_mapping_demo
mkdir simulation_ws
mkdir ros2_ws
cd ros2_ws
mkdir src

Let’s clone the repositories into their right locations, starting with the simulation:

cd ~/crazyflie_mapping_demo/simulation_ws
git clone https://github.com/bitcraze/crazyflie-simulation.git

Then navigate to the ROS 2 workspace source folder and clone 3 projects:

cd ~/crazyflie_mapping_demo/ros2_ws/src
git clone https://github.com/knmcguire/crazyflie_ros2_multiranger.git
git clone https://github.com/knmcguire/ros_gz_crazyflie
git clone https://github.com/IMRCLab/crazyswarm2 --recursive

First, install some requirements as apt-get packages and pip libraries (you might want to create a Python environment for the latter):

sudo apt-get install libboost-program-options-dev libusb-1.0-0-dev python3-colcon-common-extensions
sudo apt-get install ros-humble-motion-capture-tracking ros-humble-tf-transformations
sudo apt-get install ros-humble-ros-gzharmonic ros-humble-teleop-twist-keyboard
pip3 install cflib transforms3d

Also follow the instructions in this guide to give the proper rights to the Crazyradio 2.0; if this is your first time working with the Crazyradio 2.0, follow this tutorial first.

Go to the ros2_ws workspace and build the packages

cd  ~/crazyflie_mapping_demo/ros2_ws/
source /opt/ros/humble/setup.bash
colcon build --cmake-args -DBUILD_TESTING=ON

Building will take a few minutes. Crazyswarm2 in particular will show a lot of warnings and stderr output, but unless the package build has ‘failed’, just ignore it for now until we have proposed a fix to that repository.

If all of the packages have built successfully and none failed, please continue to the next step!

2. Simple mapping simulation

This section will explain how to create a simple 2D map of your environment using the multi-ranger. The ROS 2 package designed for this is specifically made for the multi-ranger, but it should be compatible with NAV2 if you’d like. However, for now, we’ll focus on a simple version without any localization inferred from the map.

Open up a terminal and source both the Gazebo model path and the newly built ROS 2 packages:

source ~/crazyflie_mapping_demo/ros2_ws/install/setup.bash
export GZ_SIM_RESOURCE_PATH="/home/$USER/crazyflie_mapping_demo/simulation_ws/crazyflie-simulation/simulator_files/gazebo/"

First, let’s be safe and start with the simulation. Start up the ROS 2 launch file with:

ros2 launch crazyflie_ros2_multiranger_bringup simple_mapper_simulation.launch.py

If you get a ‘No such file or directory’ error on the model, try entering the full path in GZ_SIM_RESOURCE_PATH export.

Gazebo will start with the Crazyflie in the center. You can get a close-up of the Crazyflie by right-clicking it in the Entity tree and pressing ‘Move to’. You can also choose to follow it, but the camera tracking feature of Gazebo needs some tuning to track something as small as the Crazyflie. Additionally, you will see RVIZ starting with the map view and transforms preconfigured.

Open up another terminal, source the installed ROS 2 distro and open up the ROS 2 teleop keyboard node:

source /opt/ros/humble/setup.bash
ros2 run teleop_twist_keyboard teleop_twist_keyboard

Have the Crazyflie take off with ‘t’ on your keyboard, and rotate it around with the teleop instructions. In RViz you should see the map being created and the transform of the Crazyflie moving. You should see something like the picture below, as in this part of the video.

Screenshot of the Crazyflie in Gazebo generating a map with Teleop (video)

3. Simple mapping real world

Now that you got the gist of it, let’s move to the real Crazyflie!

First, if your Crazyflie has a different URI to connect to, change the config file ‘crazyflie_real_crazyswarm2.yaml’ in the crazyflie_ros2_multiranger repository. This is the file Crazyswarm2 uses to know which Crazyflie to connect to.

Open up the config file in gedit or your favorite IDE, like Visual Studio Code:

gedit ~/crazyflie_mapping_demo/ros2_ws/src/crazyflie_ros2_multiranger/crazyflie_ros2_multiranger_bringup/config/crazyflie_real_crazyswarm2.yaml

and change the URI on this line to the URI of your Crazyflie if necessary. Mind that you need to rebuild ros2_ws again for this to take effect.

Now launch the simple mapper example for the real-world Crazyflie:

source ~/crazyflie_mapping_demo/ros2_ws/install/setup.bash
ros2 launch crazyflie_ros2_multiranger_bringup simple_mapper_real.launch.py

Now open up another terminal, source ROS 2 and open up teleop:

source /opt/ros/humble/setup.bash
ros2 run teleop_twist_keyboard teleop_twist_keyboard

Same thing, have the Crazyflie take off with ‘t’, and control it with the instructions.

You should be able to see this on your screen, which you can also check with this part of the video.

Screen shot of the real Crazyflie mapping while being controlled with ROS 2 teleop (video)

Make the Crazyflie land again with ‘b’, and now you can close the ROS 2 node in the launch terminal with ctrl + c.

4. Wall following simulation

Previously, you needed to control the Crazyflie yourself to create the map, but what if you could let the Crazyflie do it on its own? The `crazyflie_ros2_multiranger` package includes a `crazyflie_ros2_multiranger_wall_following` node that uses laser ranges from the multi-ranger to perform autonomous wall-following. Then, you can just sit back and relax while the map is created for you!
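
To give an idea of what such a rule can look like, here is a toy left-hand wall-following policy based on two ranger readings (purely illustrative; the actual node's logic and parameters differ):

def wall_follow_cmd(front_m, left_m, d_ref=0.5, v_fwd=0.3, k=1.0):
    """Return (vx, vy, yaw_rate): fly forward, turn right when a wall
    shows up ahead, and steer sideways to keep the left-hand ranger
    reading near d_ref so the wall stays on the left."""
    if front_m < d_ref:
        return 0.0, 0.0, -0.8          # wall ahead: rotate clockwise
    vy = max(min(k * (left_m - d_ref), 0.2), -0.2)
    return v_fwd, vy, 0.0              # drift to hold the wall distance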

Let’s first try it in simulation, so open up a terminal and source it if you haven’t already (see the Simple mapping simulation section). Then launch the wall-follower ROS 2 launch file:

ros2 launch crazyflie_ros2_multiranger_bringup wall_follower_mapper_simulation.launch.py

Take-off and wall following will happen fully automatically. The simulated Crazyflie in Gazebo will fly forward, stop when it sees a wall with its forward range sensor, and follow the wall on its left-hand side.

You’ll see in RViz when the full map is created, like here below and in this part of the tutorial video.

Screenshot of the simulated Crazyflie in Gazebo mapping while autonomously wall following (video)

You can stop the simulated Crazyflie with the following service call in another terminal sourced with ROS 2 Humble:

ros2 service call /crazyflie/stop_wall_following std_srvs/srv/Trigger

The simulated Crazyflie will stop wall following and land. You can also just close the simulation, since nothing can go wrong there.

5. Wall following real world

Now that we have demonstrated that the wall-following works in simulation, we feel confident enough to try it in the real world this time! Make sure you have a fully charged battery, place the Crazyflie on the floor facing the direction you’d like the positive x-axis to be (which is also where it will fly first), and turn it on.

Make sure you are flying in a room with clearly defined walls and corners, or build something out of cardboard, such as a mini maze; the current algorithm is optimized for flying in a squarish room.

Source the ROS 2 workspace like previously and start up the wall-follower launch file for the real Crazyflie:

ros2 launch crazyflie_ros2_multiranger_bringup wall_follower_mapper_real.launch.py

Like the simulated Crazyflie, the real Crazyflie will take off and start wall following automatically, so it is important that it starts out flying towards a wall. It should look like this screenshot, or you can check this part of the video.

The real Crazyflie wall following autonomously while mapping the room (video).

Be careful not to accidentally run this script with the Crazyflie sitting on your desk!

If you’d like the Crazyflie to stop, don’t stop the ROS 2 nodes with ctrl-c, since it will continue flying until it crashes. Unfortunately, it’s not like simulation, where you can simply close the environment and nothing happens. Instead, use the ROS 2 service made for this in a different terminal:

ros2 service call /crazyflie_real/stop_wall_following std_srvs/srv/Trigger

Similarly, the real Crazyflie will stop wall following and land. Now you can close the ROS 2 terminals and turn off the Crazyflie.

Next steps?

We don’t have any more demos to show but we can give you a list of suggestions of what you could try next! You could for instance have multiple Crazyflies mapping together like in the video shown here:

This uses the mapMergeForMultiRobotMapping-ROS2 external project, which is combined with Crazyswarm2 with this launch file gist. Just keep in mind that, currently, it would be better to use a global positioning system here, such as the Lighthouse positioning system used in the video. Also, if you’d like to try this out in simulation, you’ll need to ensure different namespaces for the Crazyflies, which the current simulation setup may not fully support.

Another idea is to connect the NAV2 stack instead of the simple mapper. There are a couple of instructions in the Crazyswarm2 ROS 2 tutorials, so you can use those as a reference. Check out the video below.

Moreover, if you are having difficulties setting up your computer, I’d like to remind you that the skill-learning session we conducted for Robotics Developer Day was entirely done using a ROSject provided by The Construct, which also allows direct connection with the Crazyflie. The only requirement is that you can run Crazyswarm2 on your local machine, but that should be feasible. See the video of the original Robotics Developer Day skill-learning session here:

The last thing to know is that the ROS 2 nodes in this tutorial are running ‘offboard,’ so not on the Crazyflies themselves. However, do check out the Micro-ROS examples for the Crazyflie by Eprosima whenever you have the time and would like to challenge yourself with embedded development.

That’s it, folks! If you are running into any issues with this tutorial or want to bounce some cool ideas to try yourself, start a discussion thread on https://discussions.bitcraze.io/.

Happy hacking!

We are excited to announce the release of our new PID Tuning Guide! This guide is designed to help users understand and apply the basics of PID tuning within our ecosystem, making it easier to achieve stable and responsive flight for your Crazyflie. This guide is particularly useful if you’ve modified your drone, such as adding expansion decks or changing its motor and/or propeller configuration. While our default tuning is designed to work in a wide range of situations and configurations, fine-tuning your PID settings can enhance performance for your specific setup and flight profile.

Interface with tuning toolbox and plotter displaying the roll angle setpoint and the roll angle state estimate.

What’s in the guide?

The guide covers essential topics, including:

  • Fundamental PID Concepts: Understand the role of the Proportional, Integral, and Derivative parameters in controlling your Crazyflie’s movements (see the sketch after this list).
  • Step-by-Step Instructions: Learn how to set up your software and use the cfclient for tuning.
  • Practical Tuning Tips: Get insights on adjusting PID gains, using the tuning toolbox, and conducting safe manual flight tests.
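
To make the role of each term concrete, here is a minimal textbook PID loop (an illustration only, not the firmware's implementation; the gains in the comment are arbitrary):

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error            # P: react to the present error
                + self.ki * self.integral  # I: remove steady-state offset
                + self.kd * derivative)    # D: damp rapid changes

# e.g. a rate loop at 500 Hz: pid = PID(kp=250.0, ki=500.0, kd=2.5, dt=0.002)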

Why this guide is useful

Even though this guide focuses on the basics, it provides a solid foundation for anyone new to PID tuning. Whether you’re using the Crazyflie 2.1, Crazyflie 2.0, or a custom-built quadcopter with the Crazyflie Bolt, this guide will help you:

  • Understand how PID controllers work and why they are important.
  • Use the cfclient for PID tuning within our ecosystem.

Safety first

We prioritize safety in our guide. Always secure your quadcopter in a safe environment, use protective gear, and configure an emergency stop on your controller to ensure a safe tuning process.

Get started with PID tuning today!

Ready to improve your quadcopter’s flight performance? Check out our PID Tuning Guide and start tuning.

For the upcoming Crazyflie 2.1 Brushless we developed, together with a leading motor manufacturer, a brushless 08028 motor targeting high quality and high efficiency. Motors with an 08 stator size are usually optimized for high power output to serve the FPV market, but we were aiming for high efficiency. This means fitting the maximum amount of copper around the stator, lowering the KV, using thin stator lamination sheets, and using high-quality dual ball bearings.

Specification

  • Stator size: 08028 (8.4mm x 2.8mm)
  • Stator lamination sheets: 0.2mm
  • Motor KV: 10000
  • Internal resistance: 0.52 Ohm
  • Weight: 2.4g
  • Dual ball-bearing design, using high quality NSK or NMB brands.
  • 1 mm shaft, 5 mm length
  • Matching propeller: Bitcraze 55-35mm
  • Peak current 1.8 A, peak power 7.2 W -> 30 g thrust @ 4 V (using the 55-35)
  • Rated voltage: 4.2V

Together with the Bitcraze 55-35 mm propeller we managed to achieve a system efficiency of over 5 g/W during hover (at peak load it is 30 g / 7.2 W ≈ 4.2 g/W, and the motors run more efficiently at hover throttle), not too shabby. As a reference, FPV setups normally achieve around 2 g/W. This brings the hover time for the Crazyflie 2.1 Brushless, in the barebone configuration, to a bit over 10 minutes.