
ROSCon is a developer’s conference that focuses entirely on the Robot Operating System (ROS), bringing together developers from around the globe to learn, discuss, and network. It serves as a space for ROS developers to explore the latest advancements, share their experiences, and collaborate on cutting-edge robotics projects. We attended ROSCon 2022 in Japan, and it was a fantastic experience. So when the opportunity came to participate again this year, we couldn’t pass it up! Not only is this a conference that’s close to our hearts, this year it’s also close to the office: it’s a mere three-hour train ride away.

The 2024 edition is already full of promise, and we’re excited to be a part of it in several ways. We already talked about how we helped out the diversity committee, contributing to efforts that promote a more inclusive and diverse community within the robotics field. Moreover, we will have a booth there: we’ll be located in Room 2, at booth 21. If you have trouble finding us, just listen closely to the sound of drones buzzing! We’ll be showcasing a live demo that’s still under construction. If you’re curious and want to know more about it, just keep an eye on our weekly blogposts to get an update once we finalize our plans.

In addition to being an exhibitor, we also have the honour of presenting a talk. Arnaud will be speaking on October 23 at 14:40 in Large Hall 4. His talk, titled “The Lighthouse Project: From Virtual Reality to Onboard Positioning for Robotics”, will dive into the Lighthouse system, as the title implies. He’ll explain how this technology, originally developed for virtual reality, is being adapted for onboard positioning in various types of robots.

We’re really looking forward to connecting with fellow developers, learning from the presentations, and sharing our own work with the community. If you’re attending ROSCon 2024, be sure to stop by Booth 21 and catch Arnaud’s talk—we can’t wait to see you there!

You might remember that at the beginning of this summer, we were invited to do a skill-learning session with the Crazyflie at the Robotics Developer Day 2024 (see this blog post) organized by The Construct. We showed the Crazyflie flying with the multi-ranger deck, capable of mapping the room in both simulation and the real world. Moreover, we demonstrated this with both manual control and autonomous wall-following. Since then, we wanted to make some improvements to the simulation. We now present an updated tutorial on how to do all of this yourself on your own machine.

This tutorial will focus on using the multi-ranger ROS 2 nodes for both mapping and wall-following in simulation first, before trying it out on the real thing. You will be able to tune settings to your specific environment in simulation first and then use exactly the same nodes in the real world. That is one of the main strengths of ROS, providing you with that flexibility.

We have made a video of what to expect from the tutorial; use this blogpost for the more detailed instructions.

Watch this video first, then watch it again while following the instructions below.

What do you need first?

You’ll need to setup some things first on the PC and acquire hardware to follow this tutorial in full:

PC preparation

You’ll need to install ROS 2 and Gazebo simulator maintained by the Open Robotics foundation on an Ubuntu machine.

Hardware

You’ll need to components at least of the STEM ranging bundle

If your computer setup is different, that is okay; the demos should be simple enough to work, but be prepared to handle some warnings or errors that this tutorial might not have covered.

Time to complete:

This is an approximation of how much time you need to complete this tutorial, depending on your skill level; if you already have experience with both ROS 2/Gazebo and the Crazyflie, it should take about 1 hour.

If this is your first time using the Crazyflie, it is probably a good idea to go through the getting started tutorial first and connect to it with the cfclient, with the Flow deck and Multi-ranger deck attached, to check that everything is working before jumping into ROS 2 and Gazebo.

The same holds for ROS 2! It would be handy to go through the ROS 2 Humble beginner tutorials before starting.

1. Installation

In this section you will install four packages:

Make the workspaces for both simulation and ROS 2. You can use a different directory for this if you prefer:

mkdir ~/crazyflie_mapping_demo
cd ~/crazyflie_mapping_demo
mkdir simulation_ws
mkdir ros2_ws
cd ros2_ws
mkdir src

Let’s clone the repositories in their right location, starting with simulation

cd ~/crazyflie_mapping_demo/simulation_ws
git clone https://github.com/bitcraze/crazyflie-simulation.git

Then navigate to the ROS 2 workspace source folder and clone three projects:

cd ~/crazyflie_mapping_demo/ros2_ws/src
git clone https://github.com/knmcguire/crazyflie_ros2_multiranger.git
git clone https://github.com/knmcguire/ros_gz_crazyflie
git clone https://github.com/IMRCLab/crazyswarm2 --recursive

First, install the required apt packages and pip libraries (you might want to create a Python virtual environment for the latter):

sudo apt-get install libboost-program-options-dev libusb-1.0-0-dev python3-colcon-common-extensions
sudo apt-get install ros-humble-motion-capture-tracking ros-humble-tf-transformations
sudo apt-get install ros-humble-ros-gzharmonic ros-humble-teleop-twist-keyboard
pip3 install cflib transforms3d

Also follow the instructions in this guide to give the proper rights to the Crazyradio 2.0, but if this is your first time working with the Crazyradio 2.0, follow this tutorial first.

Go to the ros2_ws workspace and build the packages:

cd ~/crazyflie_mapping_demo/ros2_ws/
source /opt/ros/humble/setup.bash
colcon build --cmake-args -DBUILD_TESTING=ON

Building will take a few minutes. Crazyswarm2 in particular will show a lot of warnings and stderr output, but unless the package build has actually failed, just ignore it for now until we have proposed a fix to that repository.

If all the packages built successfully and none failed, please continue to the next step!

2. Simple mapping simulation

This section will explain how to create a simple 2D map of your environment using the multi-ranger. The ROS 2 package designed for this is specifically made for the multi-ranger, but it should be compatible with Nav2 if you’d like. For now, however, we’ll focus on a simple version without any localization inferred from the map.

Open up a terminal, which needs to be sourced for both the Gazebo model and the newly built ROS 2 packages:

source ~/crazyflie_mapping_demo/ros2_ws/install/setup.bash
export GZ_SIM_RESOURCE_PATH="/home/$USER/crazyflie_mapping_demo/simulation_ws/crazyflie-simulation/simulator_files/gazebo/"

First, let’s be safe and start with simulation. Start up the ROS 2 launch file with:

ros2 launch crazyflie_ros2_multiranger_bringup simple_mapper_simulation.launch.py

If you get a ‘No such file or directory’ error on the model, try entering the full path in the GZ_SIM_RESOURCE_PATH export.

Gazebo will start with the Crazyflie in the center. You can get a close-up of the Crazyflie by right-clicking it in the Entity tree and pressing ‘Move to’. You can also choose to follow it, but the camera tracking feature of Gazebo needs some tuning to track something as small as the Crazyflie. Additionally, you will see RVIZ starting with the map view and transforms preconfigured.

Open up another terminal, source the installed ROS 2 distro and open up the ROS 2 teleop keyboard node:

source /opt/ros/humble/setup.bash
ros2 run teleop_twist_keyboard teleop_twist_keyboard

Have the Crazyflie take off with ‘t’ on your keyboard, and rotate it around with the teleop instructions. In RVIZ you should see the map being created and the transform of the Crazyflie moving. It should look like the picture below, matching this part of the video.

Screenshot of the Crazyflie in Gazebo generating a map with Teleop (video)
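If you’d like to consume the generated map in your own node rather than just viewing it in RVIZ, a minimal rclpy subscriber could look like the sketch below. The ‘/map’ topic name is an assumption, so check the actual name with ‘ros2 topic list’:

import rclpy
from rclpy.node import Node
from nav_msgs.msg import OccupancyGrid

class MapListener(Node):
    def __init__(self):
        super().__init__('map_listener')
        # '/map' is an assumed topic name; verify with 'ros2 topic list'.
        self.create_subscription(OccupancyGrid, '/map', self.map_cb, 10)

    def map_cb(self, msg):
        # Resolution is meters per cell; width and height are in cells.
        self.get_logger().info(
            f'Map: {msg.info.width}x{msg.info.height} cells '
            f'at {msg.info.resolution:.2f} m/cell')

def main():
    rclpy.init()
    rclpy.spin(MapListener())

if __name__ == '__main__':
    main()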

3. Simple mapping real world

Now that you got the gist of it, let’s move to the real Crazyflie!

First, if your Crazyflie has a different URI to connect to, change the config file ‘crazyflie_real_crazyswarm2.yaml’ in the crazyflie_ros2_multiranger repository. This is the file Crazyswarm2 uses to know which Crazyflie to connect to.

Open up the config file in gedit or your favorite IDE, like Visual Studio Code:

gedit ~/crazyflie_mapping_demo/ros2_ws/src/crazyflie_ros2_multiranger/crazyflie_ros2_multiranger_bringup/config/crazyflie_real_crazyswarm2.yaml

and change the URI on this line to the URI of your Crazyflie if necessary. Note that you need to rebuild ros2_ws for the change to take effect.
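If you’re not sure which URI your Crazyflie is using, a quick sanity check from Python with cflib (outside of ROS 2) is to scan for nearby Crazyflies:

import cflib.crtp

# Scan all available interfaces for Crazyflies and print their URIs.
cflib.crtp.init_drivers()
for uri, _ in cflib.crtp.scan_interfaces():
    print('Found Crazyflie on:', uri)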

Now run the ROS 2 launch file of the simple mapper example for the real-world Crazyflie:

source ~/crazyflie_mapping_demo/ros2_ws/install/setup.bash
ros2 launch crazyflie_ros2_multiranger_bringup simple_mapper_real.launch.py

Now open up another terminal, source ROS 2 and open up teleop:

source /opt/ros/humble/setup.bash
ros2 run teleop_twist_keyboard teleop_twist_keyboard

Same as before: have the Crazyflie take off with ‘t’ and control it with the teleop instructions.

You should see something like the screenshot below, which you can also check against this part of the video.

Screenshot of the real Crazyflie mapping while being controlled with ROS 2 teleop (video)

Make the Crazyflie land again with ‘b’; now you can close the ROS 2 node in the launch terminal with Ctrl+C.

4. Wall following simulation

Previously, you needed to control the Crazyflie yourself to create the map, but what if you could let the Crazyflie do it on its own? The `crazyflie_ros2_multiranger` package includes a `crazyflie_ros2_multiranger_wall_following` node that uses laser ranges from the multi-ranger to perform autonomous wall-following. Then, you can just sit back and relax while the map is created for you!
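To give an idea of what such a node does internally, here is a heavily simplified rclpy sketch. This is not the actual implementation of the wall-following node, and the topic names and scan indices are assumptions; it only shows the basic idea of steering from range readings:

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

class SimpleWallFollower(Node):
    def __init__(self):
        super().__init__('simple_wall_follower')
        # Topic names are assumptions; check yours with 'ros2 topic list'.
        self.create_subscription(LaserScan, '/crazyflie/scan', self.scan_cb, 10)
        self.cmd_pub = self.create_publisher(Twist, '/crazyflie/cmd_vel', 10)
        self.target = 0.5  # desired distance to the wall on the left [m]

    def scan_cb(self, msg):
        cmd = Twist()
        front = msg.ranges[len(msg.ranges) // 2]  # assumed forward-facing ray
        left = msg.ranges[-1]                     # assumed left-facing ray
        if front < self.target:
            cmd.angular.z = -0.5                  # wall ahead: turn right
        else:
            cmd.linear.x = 0.2                    # cruise forward
            cmd.angular.z = 0.5 * (left - self.target)  # P-control on distance
        self.cmd_pub.publish(cmd)

def main():
    rclpy.init()
    rclpy.spin(SimpleWallFollower())

if __name__ == '__main__':
    main()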

Let’s first try it in simulation, so open up a terminal and source it if you haven’t already (see section of the Simple mapper simulation). Then launch the wall follower ROS 2 launch file:

ros2 launch crazyflie_ros2_multiranger_bringup wall_follower_mapper_simulation.launch.py

Takeoff and wall following are fully automatic. The simulated Crazyflie in Gazebo will fly forward, stop when it sees a wall with its forward range sensor, and follow the wall on its left-hand side.

You’ll see on RVIZ2 when the full map is created like here below and this part of the tutorial video.

Screenshot of the simulated Crazyflie in Gazebo mapping while autonomously wall following (video)

You can stop the simulated Crazyflie with the following service call in another terminal sourced with ROS 2 Humble:

ros2 service call /crazyflie/stop_wall_following std_srvs/srv/Trigger

The simulated Crazyflie will stop wall following and land. You can also just close the simulation, since nothing can go wrong here.
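If you’d rather stop it from code than from the command line, the same Trigger service can be called from a small rclpy script:

import rclpy
from rclpy.node import Node
from std_srvs.srv import Trigger

def main():
    rclpy.init()
    node = Node('stop_caller')
    client = node.create_client(Trigger, '/crazyflie/stop_wall_following')
    client.wait_for_service()
    # Trigger takes no arguments; the response carries a success flag.
    future = client.call_async(Trigger.Request())
    rclpy.spin_until_future_complete(node, future)
    print('Stopped:', future.result().success)
    rclpy.shutdown()

if __name__ == '__main__':
    main()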

5. Wall following real world

Now that we have demonstrated that the wall-following works in simulation, we feel confident enough to try it in the real world this time! Make sure you have a fully charged battery, place the Crazyflie on the floor facing the direction you’d like the positive x-axis to be (which is also where it will fly first), and turn it on.

Make sure you are flying in a room with clearly defined walls and corners, or build something out of cardboard, such as a mini maze; note that the current algorithm is optimized to fly in a roughly square room.

Source the ROS 2 workspace like before and start up the wall follower launch file for the real Crazyflie:

ros2 launch crazyflie_ros2_multiranger_bringup wall_follower_mapper_real.launch.py

Like the simulated Crazyflie, the real Crazyflie will take off and start wall following automatically, so it is important that it is flying towards a wall. It should look like this screenshot, or you can check it against this part of the video.

The real Crazyflie wall following autonomously while mapping the room (video).

Be careful not to accidentally run this script with the Crazyflie sitting on your desk!

If you’d like the Crazyflie to stop, don’t stop the ROS2 nodes with ctrl-c, since it will continue flying until crash. It’s not like simulation unfortunately where you can close the environment and nothing will happen. Instead, use the ROS 2 service made for this in a different terminal:

ros2 service call /crazyflie_real/stop_wall_following std_srvs/srv/Trigger

Similarly, the real Crazyflie will stop wall following and land. Now you can close the ROS 2 terminals and turn off the Crazyflie.

Next steps?

We don’t have any more demos to show but we can give you a list of suggestions of what you could try next! You could for instance have multiple Crazyflies mapping together like in the video shown here:

This uses the mapMergeForMultiRobotMapping-ROS2 external project, which is combined with Crazyswarm2 with this launch file gist. Just keep in mind that, currently, it would be better to use a global positioning system here, such as the Lighthouse positioning system used in the video. Also, if you’d like to try this out in simulation, you’ll need to ensure different namespaces for the Crazyflies, which the current simulation setup may not fully support.

Another idea is to connect the Nav2 stack instead of the simple mapper. There are a couple of instructions in the Crazyswarm2 ROS 2 tutorials that you can use as a reference. Check out the video below.

Moreover, if you are having difficulties setting up your computer, I’d like to remind you that the skill-learning session we conducted for Robotics Developer Day was entirely done using a ROSject provided by The Construct, which also allows direct connection with the Crazyflie. The only requirement is that you can run Crazyswarm2 on your local machine, but that should be feasible. See the video of the original Robotics Developer Day skill-learning session here:

The last thing to know is that the ROS 2 nodes in this tutorial run ‘offboard’, so not on the Crazyflies themselves. However, do check out the micro-ROS examples for the Crazyflie by eProsima whenever you have the time and would like to challenge yourself with embedded development.

That’s it, folks! If you are running into any issues with this tutorial or want to bounce some cool ideas to try yourself, start a discussion thread on https://discussions.bitcraze.io/.

Happy hacking!

We are excited to announce the release of our new PID Tuning Guide! This guide is designed to help users understand and apply the basics of PID tuning within our ecosystem, making it easier to achieve stable and responsive flight for your Crazyflie. This guide is particularly useful if you’ve modified your drone, such as adding expansion decks or changing its motor and/or propeller configuration. While our default tuning is designed to work in a wide range of situations and configurations, fine-tuning your PID settings can enhance performance for your specific setup and flight profile.

Interface with tuning toolbox and plotter displaying the roll angle setpoint and the roll angle state estimate.

What’s in the guide?

The guide covers essential topics, including:

  • Fundamental PID Concepts: Understand the role of Proportional, Integral, and Derivative parameters in controlling your Crazyflie’s movements.
  • Step-by-Step Instructions: Learn how to set up your software, and use cfclient for tuning.
  • Practical Tuning Tips: Get insights on adjusting PID gains, using the tuning toolbox, and conducting safe manual flight tests (a small Python sketch for setting gains from a script follows below).
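If you prefer scripting your experiments over using the GUI, gains can also be set from Python with cflib. This is just a convenience sketch: the URI and gain value are examples, and the exact parameter names should be verified in the cfclient parameter tab:

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M/E7E7E7E7E7'  # example URI, adjust to your Crazyflie

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    # 'pid_rate.roll_kp' is the roll rate proportional gain; check the
    # parameter tab in cfclient for the full list of tunable gains.
    scf.cf.param.set_value('pid_rate.roll_kp', 240.0)  # example value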

Why this guide is useful

Even though this guide focuses on the basics, it provides a solid foundation for anyone new to PID tuning. Whether you’re using the Crazyflie 2.1, Crazyflie 2.0, or a custom-built quadcopter with the Crazyflie Bolt, this guide will help you:

  • Understand how PID controllers work and why they are important.
  • Use the cfclient for PID tuning within our ecosystem.

Safety first

We prioritize safety in our guide. Always secure your quadcopter in a safe environment, use protective gear, and configure an emergency stop on your controller to ensure a safe tuning process.

Get started with PID tuning today!

Ready to improve your quadcopter’s flight performance? Check out our PID Tuning Guide and start tuning.

Ever since we developed the new 47-17 propellers, it’s been on our list to update the Crazyflie 2.1 kit, and finally it is here! While we were at it, we also updated the battery, which is now 1 gram lighter with the same performance and capacity. These changes improve the flight duration and thrust by up to 15%.

An assembled Crazyflie 2.1+

At the same time, we will discontinue the Crazyflie 2.1, as the Crazyflie 2.1+ replaces it. If you still need the old propellers, don’t worry, we will continue selling them.

Increasing prices

As with the rest of the world, we’re feeling the impact of inflation, and like many others, we’re having to make some adjustments to keep up with the rising costs. We’ve done our best to keep things steady, but in order for us to keep developing our products we’ve realized that a small price adjustment across our product line is necessary. So starting today, August 19th, you’ll notice a slight increase of up to 10% on our products.

For the upcoming Crazyflie 2.1 brushless, we developed, together with a leading motor manufacturer, a brushless 08028 motor targeting high quality and high efficiency. Motors with 08-size stators are usually optimized for high power output to serve the FPV market, but we were aiming for high efficiency. This means fitting the maximum amount of copper around the stator, lowering the KV, using thin stator lamination sheets, and using high-quality dual ball bearings.

Specification

  • Stator size: 08028 (8.4mm x 2.8mm)
  • Stator lamination sheets: 0.2mm
  • Motor KV: 10000
  • Internal resistance: 0.52 Ohm
  • Weight: 2.4g
  • Dual ball-bearing design, using high quality NSK or NMB brands.
  • 1 mm shaft, 5 mm length
  • Matching propeller: Bitcraze 55-35mm
  • Peak current 1.8A, peak power 7.2W -> 30g thrust @ 4V (using 55-35)
  • Rated voltage: 4.2V

Together with the Bitcraze 55-35 mm propeller, we managed to achieve a system efficiency of over 5 g/W during hover, not too shabby. As a reference, FPV setups normally achieve around 2 g/W. This brings the hover time for the Crazyflie 2.1 brushless, in the barebone configuration, to a bit over 10 minutes.
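To put those numbers in perspective, here is a quick back-of-the-envelope calculation; the 32 g takeoff weight is an assumed example value, not an official spec:

# Relating propulsion efficiency (grams of thrust per watt) to hover power.
efficiency_g_per_w = 5.0   # Crazyflie 2.1 brushless system efficiency at hover
fpv_reference = 2.0        # typical FPV setup, for comparison
weight_g = 32.0            # assumed takeoff weight, example value only

print(weight_g / efficiency_g_per_w)  # ~6.4 W needed to hover
print(weight_g / fpv_reference)       # ~16 W for a same-weight FPV-style setup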

A few weeks ago, the prestigious Robotics: Science and Systems (RSS) conference was held at Delft University of Technology. We helped with the co-organization of a half-day tutorial and workshop called “Aerial Swarm Tools and Applications”, so Kimberly (I) was there on behalf of both Bitcraze and Crazyswarm2. In this blog post, we will tell you a bit about the conference itself and the workshop (and perhaps also a tiny bit about RoboCup).

The Robotics: Science and Systems conference

The Robotics: Science and Systems conference, also known as RSS, is considered one of the most important robotics conferences to attend, alongside ICRA and IROS. It distinguishes itself by having only a single track of presented papers, which makes it possible for all attendees to listen to and learn about all the cool robotics work done in a wide range of fields. It also makes it more difficult to get a paper accepted due to the fixed number of papers they can accept, so you know that whatever gets presented is of high quality.

This year the topic was very much on large language models (LLMs) and their application in robotics, most commonly manipulators. Many researchers are exploring the ways that LLMs could be used for robotics, which meant that small and embedded systems were not well represented in these papers. We did find one paper where Crazyflies were presented, namely the awesome work by Darrick et al. (2024) called ‘Stein Variational Ergodic Search’, which used optimal control for path planning to achieve the best coverage.

It gave us the chance to experience many of the other works that could be found at RSS. One in particular was about the robotic design of the cute little biped from Disney Imagineering named “Design and Control of a Bipedal Robotic Character” by Grandia et al. (2024). Also very impressive was the Agile flight demo by the group of Davide Scaramuzza, and we enjoyed listening to the keynote by Dieter Fox, senior director at Nvidia, talking about ‘Where is RobotGPT?’. The banquet location was also very special, as it was located right in the old church of Delft.

You can find all the talks, demos, and papers on the RSS 2024 website.

Photos of day 3 of RSS

Aerial Swarm Workshop

The main reason we joined RSS was that we were co-organizing the workshop ‘Aerial Swarm Tools and Applications’. This was done in collaboration with Wolfgang Hönig from Crazyswarm2/TU Berlin, Miguel Fernandez Cortizas and Rafel Perez Segui from Aerostack2/Polytechnic University of Madrid (UPM), and Andrea Testa, Lorenzo Pichierri, and Giuseppe Notarstefano from CrazyChoir/University of Bologna. The workshop was a bit of a hybrid as it contained both talks on various aerial swarm applications and tutorials on the different aerial swarm tools that the committee members were representatives of.

Photos of the Aerial Swarm Tools and Applications workshop

Sabine Hauert from the University of Bristol started off the workshop by talking about “Trustworthy swarms for large-scale environmental monitoring.” Gábor Vásárhelyi from Collmot Robotics and Eötvös University gave a talk/tutorial about Skybrush, showing its suitability not only for drone shows but also for research (Skybrush was used for the Big Loco Test show demo we did 1.5 years ago). The third speaker was SiQi Zhou, speaking on behalf of Angela Schöllig from TU Munich, discussing “Safe Decision-Making for Aerial Swarms – From Reliable Localization to Efficient Coordination.” Martin Saska concluded the workshop with his talk “Onboard relative localization for agile aerial swarming in the wild” about their work at the Czech TU in Prague. They also organize the Multi-robot systems summer school every year, so if you missed it this year, make sure to mark it in your calendar for next summer!

We had four tutorials in the middle of the workshop as well. Gábor also showed Skybrush in simulation after his talk for participants to try out. Additionally, we had tutorials that included real, flying Crazyflies live inside the workshop room! It was a bit of a challenge to set up due to the size of the room we were given, but with the lighthouse system it all worked out! Miguel and Rafael from Aerostack2 were first up, showing a leader-follower demo. Next up were Wolfgang and Kimberly (Crazyswarm2) who showed three Crazyflies collaboratively mapping the room, and finally, Andrea and Lorenzo from CrazyChoir demoed formation control in flight.

You can see the Crazyflies demos flying during the tutorials in the video below. The recording of each of the talks can be found on the workshops website: https://imrclab.github.io/workshop-aerial-swarms-rss2024/

RoboCup 2024 Eindhoven

Luckily, there was also a bit of time to visit Eindhoven for a field trip to the 2024 edition of the world championship competitions of RoboCup! This is a very large robotics competition held in several different divisions, namely Soccer (with many subdivisions), Industrial, Rescue, @Home, and Junior. Each country usually has its own national championships, and those that win there can compete in the big leagues at events like these. RoboCup was extremely fun to attend, so if any robotics enthusiasts happen to live close to one of these, go! It’s awesome.

Photos of the field trip to RoboCup

Maybe drone competitions will become one of RoboCup’s divisions in the future :)

This week, we have a guest blog post from Scott at Droneblocks.

DroneBlocks is a cutting-edge platform that has transformed how educators worldwide enrich STEM programming in their classrooms. As pioneers in the EdTech space, DroneBlocks wrote the playbook on integrating drone technology into STEM curriculum for elementary, middle, and high schools, offering unparalleled resources for teaching everything from computer science to creative arts. What started as free block coding software and video tutorials has become a comprehensive suite of drone and robotics educational solutions. The Block-Coding software still remains free to all, as the DroneBlocks mission has always been to empower educators and students, allowing them to explore and lead the way. This open-source attitude set DroneBlocks on a mission to find the world’s best and most accessible micro-drone for education, and they found it in Sweden!

Previously, DroneBlocks had worked alongside drone juggernaut DJI and their Tello Drone. The Tello was a great tool for its time, but when DJI decided to discontinue it with little input from its partners and users, it made the break much easier. The hunt began for a DJI Tello replacement and an upgrade!

Bitcraze’s choice to build Crazyflie as an open platform had their drone buzzing wherever there was curiosity. The Crazyflie was developed to fly indoors, swarm, and be mechanically simplistic. DroneBlocks established that the ideal classroom micro-drone required similar characteristics. This micro-drone needed to be small for safety but sturdy for durability. It also needed to be easy to assemble and simple in structure for students new to drones. Most importantly, the ideal drone needed to have an open line of software communication to be fully programmable. Finally, there had to be an opportunity for a long-lasting partnership with the drone manufacturer, including government compliance.

After extensive searching and testing by DroneBlocks, the Crazyflie was a diamond in the rough – bite-sized and lightweight, supremely agile and accurate, reliable and robust, and most importantly, it was an open-source development platform. The DroneBlocks development team took the Crazyflie for a spin (or several) and excitedly shared it with the larger curriculum team to be mined for learning potential. It was promising to see Crazyflie’s involvement in university-level research studies, which proved it meant business. DroneBlocks knew the Crazyflie had a lot going for it – on its own. The team imagined how, when paired with DroneBlocks’ Block Coding software, Flight Simulator, and Curriculum Specialists, the Crazyflie could soar to atmospheric heights!

Hardware? Check. Software? Check. But what about compatibility? DroneBlocks was immediately drawn to the open communication and ease of conversation with the Bitcraze team. It was obvious that both Bitcraze and DroneBlocks were born from a common thread and shared a mutual goal: to empower people to explore, investigate, innovate, research, and educate. 

DroneBlocks has since built a new Block Coding interface around the Crazyflie, allowing students to pilot their new drone autonomously and learn the basics of piloting and coding concepts. This interface is offered with a brand new drone coding simulator environment so students can test their code and fly the Crazyflie in a virtual classroom environment.

The Crazyflie curriculum currently consists of courses covering building, configuring, and finally, programming your drone with block coding (DroneBlocks) and Python. DroneBlocks’ expert curriculum team designed these courses to enable learners of all ages and levels to learn step by step through video series and exercises. New courses around block coding and Python are in constant development and will be continuously added to the DroneBlocks curriculum platform.

Crazyflie Drones now headline DroneBlocks’ premiere classroom launch kit. The DroneBlocks Autonomous Drones Level II kit encompasses everything a middle or high school would need to launch a STEM drone program, including the hardware, necessary accessories, and safety wear paired with the DroneBlocks software and curriculum. As a result, thousands of new students have entered the world of Drones and programming thanks to the Bitcraze + DroneBlocks partnership.

DroneBlocks has become an all-inclusive drone education partner for engaging and innovative learning experiences—and the Crazyflie delivers this by being a cutting-edge piece of hardware in a clever package.

Welcome to the “The Beginner’s Guide to Drones” for programmers curious about exploring the world of unmanned aerial vehicles (UAVs). If you’re a coder from another field, this guide will walk you through the basics of drones, their components, and how to start programming them. Let’s dive in and see how your coding skills can take flight!

If you come from an engineering field, you might already know the basics of some of these topics; still, you might find the overview useful and can use the resources to get more specific knowledge.

The Robotics part

First and foremost, you’ll need some basic robotics skills.
We start off with the most basic question: “What is a Robot?”. A robot uses sensors to create an internal model of its environment, and actuators to act on/in its environment. The specifics of the internal model depend on the robot’s purpose, but a crucial component is understanding its location and orientation within that environment.

Linear algebra basics

To understand how a quadcopter perceives its environment and its own position, you’ll need some basic skills in linear algebra, particularly in matrices, vectors, and frame rotation. These skills are essential for comprehending the mathematical principles behind quadcopter navigation.
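As a tiny example of what a frame rotation looks like in practice, here is a yaw rotation applied to a body-frame vector with numpy:

import numpy as np

yaw = np.radians(90)
# Rotation matrix about the z-axis (yaw).
R = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
              [np.sin(yaw),  np.cos(yaw), 0.0],
              [0.0,          0.0,         1.0]])

v_body = np.array([1.0, 0.0, 0.0])  # 1 m/s forward, expressed in the body frame
v_world = R @ v_body                # after a 90 degree yaw: ~[0, 1, 0]
print(v_world)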

To build an internal model of its own movement and orientation, an Inertial Measurement Unit (IMU) is used. An IMU consists of a gyroscope, an accelerometer, and sometimes a magnetometer. These sensors, when combined using sensor fusion techniques (see “Control Theory” below), help determine the quadcopter’s angular velocity and linear acceleration. This data allows the drone to calculate its orientation and movement.

Picture of a Crazyflie in relation to a reference frame from Bitcraze.io

For a detailed understanding of these processes, please refer to the following resources:
Visual description of frame transformations:
https://www.youtube.com/watch?v=kYB8IZa5AuE
Basic coordinate transformations:
https://motion.cs.illinois.edu/RoboticSystems/CoordinateTransformations.html

Positioning techniques

The quadcopter can now determine its relationship to its starting position and the gravitational field. However, relying solely on an IMU tends to cause drift over time. Imagine trying to stand on one leg with your eyes closed—eventually, you’ll lose balance.

For improved stability, a drone often needs additional sensors, such as a camera, to help stabilize its position. Other sensor systems can also be used to determine relative or absolute position. While an IMU can sense changes in position relative to a starting point, an external positioning system is necessary for stability and obtaining absolute positions. This system acts as a reference frame for the drone.

Drones flying outdoors typically use GPS combined with RTCM correction data, since it is available almost anywhere, easy to use, and has centimeter-level accuracy.

For indoor use, as with Crazyflies, the default positioning system is a motion-capture system, but there are others as well. This area is at the cutting edge of science, with new technologies emerging constantly. However, many effective systems are available, though they may have constraints regarding power efficiency, flight-area size, update speed, or precision.

This is an image of some common techniques for positioning: a) AOA, b) TOA/TWR, c) TDOA, taken from ResearchGate.

Control theory

Now that the drone can understand its position and orientation in space, the next step is figuring out how to move within this space. Moving from point A to point B involves setting a “setpoint” and then determining how to use the drone’s actuators to reach this setpoint most efficiently. This is where control theory comes into play.

Drones generally use some sort of feedback control system, which in its most basic form looks something like this:

A very basic overview of a system with feedback control

In this system, the error (the difference between the current position and the setpoint) is fed back into the system to ensure the drone moves in a way that minimizes the error over time.

Various algorithms can calculate the best actuator output based on the error and current state. One of the most fundamental algorithms is the PID controller, which works well for linear systems. Understanding PID controllers requires some basic calculus, but the concept is straightforward. Here are some resources for simple explanations:
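To make the structure concrete, here is a minimal PID controller sketch in Python (an illustration of the concept, not the Crazyflie firmware’s implementation):

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement, dt):
        # The error drives all three terms: present (P), past (I), future (D).
        error = setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative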

For IMUs, there is a particularly useful filter to know about, given its widespread use. The accelerometer and gyroscope each have their own profiles of noise and drift. The accelerometer is sensitive to short-term noise, while the gyroscope drifts slowly over time. To mitigate these issues, a combination of both measurements is often used. The complementary filter is ideal for this situation, leveraging the strengths of both sensors to correct the measurements effectively.

More information on how to use the complementary filter for IMUs can be found here:
https://www.hackster.io/hibit/complementary-filter-and-relative-orientation-with-mpu9250-d4f79d
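As a small illustration of the concept, a complementary filter for the roll angle can be as short as this sketch:

import math

def complementary_filter(angle, gyro_rate, acc_y, acc_z, dt, alpha=0.98):
    # Gyro: accurate short-term, drifts long-term. Accelerometer: noisy
    # short-term, but gravity gives a drift-free long-term reference.
    acc_angle = math.atan2(acc_y, acc_z)   # roll estimated from gravity
    gyro_angle = angle + gyro_rate * dt    # integrate the angular velocity
    return alpha * gyro_angle + (1 - alpha) * acc_angle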

For more complex scenarios, advanced controllers like Kalman filters and others can be used. It’s also possible to combine multiple controllers to achieve better performance.

Basic overview of feedback control systems:
https://control.com/textbook/closed-loop-control/basic-feedback-control-principles/

The flying part

Now let’s get into the exciting part. FLYING!!!

Actuators

Drone actuators, primarily the motors and propellers, are critical for controlling a drone’s movement and stability. Together they are typically called the “drive train” or “power train”.
The motors used on drones are usually brushed or brushless DC motors. The propellers are attached to the motors and generate lift by pushing air downwards. The size, shape, and pitch of the propellers affect the drone’s performance, including speed, lift, and maneuverability. Together, the precise control of motors and propellers enables a drone to perform complex maneuvers, maintain stability, and achieve efficient flight.

Pictures of forces acting on a drone in flight


This guide gives an almost complete overview of drone flight dynamics:
https://dronstechnology.com/the-physics-of-drone-flight-lift-thrust-drag-and-weight/

Stock information

As of now, the Crazyflie 2.1 is unfortunately out of stock. It’s expected back around August 20th, four weeks from now. You can sign up in our shop to be notified as soon as it arrives!
https://store.bitcraze.io/collections/kits/products/crazyflie-2-1?variant=19575412719703

Today we welcome Sam Schoedel and Khai Nguyen from Carnegie Mellon University. Enjoy!

We’re excited to share the research we’ve been doing on model-predictive control (MPC) for tiny robots! Our goal was to find a way to compress an MPC solver to a size that would fit on common microcontrollers like the Crazyflie’s STM32F405 while being fast enough to control the higher frequency dynamics of smaller robots. We came up with a few tricks to make that happen and dubbed the resulting solver TinyMPC. When it came time for hardware experiments, using the Crazyflie just made sense. A tiny solver deserves a tiny robot.

Motivation

Model predictive control is a powerful tool for controlling complex systems, but it is computationally expensive and thus often limited to use cases where the robot can either carry enough computational power or when offboard computing is available. The problem becomes challenging to solve for small robots, especially when we want to perform all of the computation onboard. Smaller robots have inherently faster dynamics which require higher frequency controllers to stabilize, and because of their size they don’t have the capacity to haul around as much computational power as their larger robot counterparts. The computers they can carry are often highly memory-constrained as well. Our question was “how can we shrink the computational complexity and memory costs of MPC down to the scale of tiny robots?”

What We Did

We settled on developing a convex model predictive control solver based on the alternating direction method of multipliers. Convex MPC solvers are limited to reasoning about linear dynamics (on top of any other convex constraints), but have structure that TinyMPC exploits to solve problems efficiently. The tricks we used to achieve this efficiency are described in the paper, but it boils down to rewriting the problem as a constrained linear-quadratic regulator to reduce the memory footprint and then precomputing as many matrices as possible offline so that online calculations are less expensive. These tricks allowed us to fit long-time horizon MPC problems on the Crazyflie and solve them fast enough for real-time use.
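To give a flavor of the ‘precompute offline’ idea, here is an illustrative numpy sketch (not TinyMPC’s actual code) that iterates the discrete Riccati recursion to a fixed point; the resulting gain can be cached so the online loop only does cheap matrix-vector work:

import numpy as np

def lqr_gain(A, B, Q, R, iters=500):
    # Iterate the discrete-time Riccati recursion until it settles.
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K, P

# Toy double-integrator model (position, velocity) with a 50 Hz time step.
dt = 0.02
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt**2], [dt]])
K, P = lqr_gain(A, B, Q=np.diag([10.0, 1.0]), R=np.array([[1.0]]))
print(K)  # computed offline; online control is just u = -K @ x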

What TinyMPC Can Do

We decided to demonstrate the constraint-handling capabilities of TinyMPC by having the Crazyflie avoid a dynamic obstacle. We achieved this by re-computing hyperplane constraints (green planes in the first video) about a spherical obstacle (transparent white ball) for each knot point in the trajectory at every time step, and then by solving the problem with the new constraints assuming they stayed fixed for the duration of the solve.
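The per-knot-point hyperplane itself is simple geometry. This sketch shows the idea (not the exact code used in the demo): the constraint plane is tangent to the inflated obstacle sphere, on the side facing the knot point:

import numpy as np

def hyperplane(knot_pos, center, radius):
    # Outward unit normal from the obstacle center toward the knot point.
    a = knot_pos - center
    a = a / np.linalg.norm(a)
    # Offset of the plane tangent to the (inflated) sphere: require a @ x >= b.
    b = a @ center + radius
    return a, b

a, b = hyperplane(np.array([0.5, 0.0, 1.0]),   # knot point on the trajectory
                  np.array([0.0, 0.0, 1.0]),   # obstacle center
                  radius=0.2)                  # inflated keep-out radius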

In the two videos below, the reference trajectory used by the solver is just a hover position at the origin for every time step. Also, the path the robot takes in the real world will never be exactly the same as the trajectory computed by the solver, which can easily result in collisions. To avoid this, we inflated the end of the stick (and the simulated obstacle) to act as a keep-out region.

TinyMPC is restricted to reasoning about linear dynamics because of its convex formulation. However, a simple linearization can be taken pretty far. We experimented with recovering from different starting conditions to push the limits of our linear Crazyflie model and were able to successfully recover from a 90 degree angle while obeying the thrust commands for each motor.

We recently added support for second-order cone constraints as well. These types of constraints allow TinyMPC to reason about friction and thrust cones, for example, which means it can now intelligently control quadrupeds on slippery surfaces and land rockets. To clearly demonstrate the cone constraint, we took long exposure photos of the Crazyflie tracking a cylindrical landing trajectory without any cone constraints (red) and then with a spatial cone constraint that restricts the landing maneuver to a glide slope (blue).

How To Use TinyMPC

All of the information regarding the solver can be found on our website and GitHub org (which is where you can also find the main GitHub repository). TinyMPC currently has a Python wrapper that allows for validating the solver and generating C++ code to run on a robot, and we have a few examples in C++ if you don’t want to use Python. Our website also explains how to linearize your robot and has some examples for setting up the problem with a linear model, solving it in an MPC loop, and then generating and running C++ code.

Most importantly to the Crazyflie community, our TinyMPC-integrated firmware is available and should work out of the box. Let us know if you use it and run into issues!

Our accompanying research papers:

Khai Nguyen, Sam Schoedel, Anoushka Alavilli, Brian Plancher, and Zachary Manchester. “TinyMPC: Model-Predictive Control on Resource-Constrained Microcontrollers.” arXiv preprint arXiv:2310.16985 (2023). https://arxiv.org/pdf/2310.16985

Sam Schoedel, Khai Nguyen, Elakhya Nedumaran, Brian Plancher, and Zachary Manchester. “Code Generation for Conic Model-Predictive Control on Microcontrollers with TinyMPC.” arXiv preprint arXiv:2403.18149 (2024). https://arxiv.org/pdf/2403.18149

We would love your feedback and suggestions, and let us know if you use TinyMPC for your tiny platforms!

We are happy to announce the latest updates to the Crazyflie client and Python library. Major changes include improved persistent parameter management, enhanced plotting with new x-axis manipulation features, and new default logging configurations (for PID tuning). Minor updates include bug fixes and documentation improvements.

Updated plotter tab. Besides the existing option for a number of samples, users can now set x-axis limits to a number of seconds or a time range.
Updated parameters tab. Users can now mass dump/load persistent parameters to/from a file, or clear all stored persistent parameters.
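The same persistent-parameter functionality is also reachable from the Python library. Here is a minimal sketch, where the URI and parameter name are just examples:

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M/E7E7E7E7E7'  # example URI

def stored(name, success):
    print(f'{name} stored: {success}')

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    scf.cf.param.set_value('ring.effect', 7)              # example parameter
    scf.cf.param.persistent_store('ring.effect', stored)  # keep it after reboot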

For detailed release notes, check out crazyflie-lib-python release 0.1.26 and crazyflie-clients-python release 2024.7.