
It’s been a while since I last talked about hiring! We successfully onboarded our most recent recruit, and now it’s time to start planning for the future.

One of our challenges as a team is that we’re very heavy on engineers and developers. While that’s fantastic for building products, it means we lack expertise in other important areas. That’s why we’re now shifting our focus to bringing in talent to help fill those gaps. We’ve partnered with a recruitment agency once again to help us find the right people for the job.
We’re currently hiring for two distinct roles—here’s what we’re looking for!

Technical sales lead

You will be responsible for developing and implementing sales strategies while exploring both new and existing markets. You’ll take the lead in driving sales and acquiring new customers, becoming the company’s go-to expert on marketing and sales tactics. Your day-to-day tasks will include supporting business development, optimizing sales processes, and proposing effective marketing strategies. This role is perfect for someone with a background in technical sales, a strong strategic mindset, and a sense of responsibility.

You can read more about it here.

Technical success engineer

We’re looking for a Technical Success Engineer to provide our customers with technical guidance and product expertise. This role involves offering first-line support, creating documentation and tutorials, and assisting with tech-focused sales efforts. The goal is to ensure a smooth and seamless customer experience while building strong client relationships. It’s an ideal position for a “social developer”—someone with a solid technical background who also excels in communication and enjoys engaging with others.

You can read more about it here.

Both positions are full-time and based at our office in Malmö, Sweden. If you’re curious about why you should join our team, I’ve already shared some of the many reasons why I love being part of Bitcraze.

If you’re interested or have any questions, please send an email to fredric.vernqvist@techtalents.se or contact us at contact@bitcraze.se.

It’s been two weeks since we went to ROSCon ’24 in Odense, Denmark, as an exhibitor and silver sponsor! Since it was only a 2-hour train ride for us, it made a lot of sense to attend as a company, and we are very happy we did. In this blog post we share our experiences of the event.

The Booth Build-up and Demo

We made some changes to our well-known cage, which is a must at every conference we exhibit at. Usually it would take us a good few hours just to set up the cage alone, but we have improved the corners, which sped up the build-up quite a lot, and we were done within an hour! Just in time for us to join the tours and birds-of-a-feather sessions with no stress!

All done before 11 am!

For ROSCon we prepared a more ROS-flavored demo that enabled full demo control from ROS, based on the swarm mapping demo shown in this tutorial and at the Robotics Developer Day (see this video). Here we hit a couple of issues that all had to do with the differences between exhibition demos and one-time talk demos (see the OpenCV Live! episode where we talked about demo-driven development), so we switched back to our usual fully decentralized autonomous swarm demo (see this blog post). Luckily, the Crazyflies could still pass through their multi-ranger values at the same time, so the computer could still generate the swarm merging map while the Crazyflies were flying around avoiding each other.

Exhibition Booth

Tuesday and Wednesday were the actual exhibition days, so that is when we talked with most of the people. It was a bit slow in the beginning, as we were located at the end of the hall, but luckily the ROSCon passport game motivated people to go by each of the booths to get a stamp. We went a bit rogue and made our own, much bigger stamp ;) but luckily it still fit as long as we aligned it properly. We donated a STEM ranging bundle as one of the prizes; congratulations to whoever won it! Now they can try out this ROS tutorial ;)

Talking to people outside and inside the cage

We noticed that the Crazyflie Brushless got a lot of attention. Its ability to carry more than a regular Crazyflie seemed of great interest to many of the ROSCon attendees. Moreover, the prototype of the forward-facing expansion connector (a.k.a. the camera deck) was also a much-requested feature, and has solidified our belief that the community needs something like this as well. In general, the lighthouse positioning system and the stand-alone lighthouse node were also quite well received. Luckily, we were able to point people to our accepted talk about the lighthouse positioning system on Thursday.

Lighthouse Positioning Talk

One of the reasons we were present at ROSCon 2024 was to gauge the interest of the general robotics community in the lighthouse positioning system. We have been using it for years with the Crazyflies, but we’d like to also evangelize its submillimeter, cost-effective awesomeness for any other platform. And there seems to be quite some interest in it! We gave a short presentation on Thursday afternoon during the ‘ROS Tooling & Testing’ session (we will share the recording once it becomes available).

Talk about Lighthouse Positioning – Taken by Dharini Dutia from Women in Robotics

We also sent out some polls to see what kinds of positioning systems are used and for what purpose. It was evident that many outdoor roboticists also use onboard-sensing-based state estimation like SLAM, but a significant portion of people still use indoor positioning systems for the actual position estimate and/or as ground truth. We also got some valuable feedback, like whether it would still work with a lidar or Kinect, or whether it is suitable for a 12-meter robot (wow). We will take all of this in for improvements to any new upgrades of the lighthouse deck and its stand-alone nodes. Thanks to you all for providing the feedback and the interest!

Side-events

We also attended a couple of events related to ROSCon 2024. Marcus and Kimberly both attended tours of the Odense Robotics, Universal Robots, and Teradyne facilities. The tour of the SDU Drone Center was particularly impressive. Moreover, we attended the Aerial Robotics Meetup, which attracted about 90–100 people at its peak, with drinks and snacks provided by the Dronecode Foundation. It was great to see such a big aerial presence at ROSCon. There was also the karaoke meetup, the ROSCon afterparty by Odense Robotics with a beer-serving robot arm, the Women in Robotics lunch… there was just too much to attend, but it all was a great success!

Check out the ROSCon 2024 event page on our website for more information about the demos and products we showed there.

We have some very busy weeks behind us and ahead! While working hard on releasing the new Crazyflie Brushless, we have been preparing for the upcoming ROSCon in Odense, Denmark next week (see this previous blog post), and we were also featured on the latest OpenCV Live! episode. More about both in this blog post.

OpenCV Live! Demo Driven Development

We were featured as guests on the latest OpenCV Live! episode, hosted by Phil Nelson and Satya Mallick, where we went through a bit of the history of the start of Bitcraze and all of the (crazy) demos done with the Crazyflie over the last decade. We covered a similar topic at our latest developer meeting, but for this episode we put the focus more on vision-based demos, since OpenCV has definitely been used at Bitcraze in the past for various reasons! Just type OpenCV into the top-right search bar to check out any of the blog posts we have written.

During the OpenCV Live! episode of the 10th of October, Arnaud and Kimberly told the backstories of these demos, which ranged from a manual flight fail where Arnaud flew the Crazyflie 1.0 into Marcus’ hair, to using OpenCV and ArUco markers for positioning, to flying a swarm in your kitchen. It was really fun to do, and one lucky listener managed to answer the two questions the host Phil asked at the end, namely “Where does the name Crazyflie come from?” and “Why is the last part (‘-flie’) spelled this way?”, and won a STEM ranging bundle. If you’d like to know the answers, go and watch the latest OpenCV Live! episode ;) Enjoy!

ROSCon – What to expect?

So next week we will be present as a Silver Sponsor at ROSCon Odense, namely from Monday the 21st to Wednesday the 23rd of October. The Bitcraze booth will be located at number 21, so that should be near the coffee break area! We will have our trusty old cage with some upgrades, running a nice ROS demo similar to the one explained in the Crazyflie ROS tutorial we wrote a while ago, but the swarming variant of it. We also hope to show a Brushless Crazyflie prototype and a new camera deck prototype, along with anything else we can find lying around at our office :D.

Moreover, Arnaud will be giving a presentation on the lighthouse positioning system on Wednesday the 23rd of October at 14:40 (2:40 pm), called ‘The Lighthouse project: from Virtual Reality to Onboard Positioning for Robotics’. The lighthouse positioning system is also what we will be demoing at our booth, so if you’d like to see it for yourself, or perhaps (during downtime) hack around together with us, you are more than welcome to do so! Check out the Bitcraze ROSCon event page for more details about our demo and the hardware we will show.

There is one thing that has driven both the hardware/software and our enthusiasm forward in the last 13 years, and that is making demos! Whether it’s a new piece of hardware/deck for the Crazyflie or the integration with an existing software framework, it doesn’t matter, but we have got to show it and, by all means… it needs to fly!

We have used fairs, conferences, and online meetings as perfect opportunities to push the capabilities of the little Crazyflie to the fullest. Of all the development goals we set, those self-made deadlines and over-ambitiousness have pushed both the hardware and software to the limit. In this blog post, we will take a look back at all of those demos we’ve done in the past and what we have learned from them.

2013 – 2017: Hacker and Developer Fairs

One of the very first conferences we were invited to was Devoxx in the UK. This was back in 2013, and we flew the Crazyflie (1) with an FPV camera over the actual crowd (blogpost, video), which was something we had already been working on for about half a year before showing it at the conference (blogpost, video). A year later, at Devoxx France (2014), they let us fly at the actual exhibition and over the booths, which showed much better quality (blogpost, video)! Not sure if they would still let us do this at fairs, but back then it was a bit of a wild west :D.

By the time the Crazyflie 2.0 was released, we started going to Makerfaires and even visited 3 of them, all in 2015! At the Makerfaire in the Bay Area (blogpost), New York, and Berlin (blogpost 1, blogpost 2), we prepared an external positioning system with the Kinect 2 and augmented reality markers (ArUco) (blogpost). That was one hectic year, and not without issues with the demo itself along the way (blogpost), but it showcased the Crazyflie and pushed the Crazyflie Python library and client to a more mature state.

Once 2016 came, the ultra-wideband positioning hacks reached a point where we could start demoing them as well. At first, the positioning was still calculated offboard with a ROS(1) node and transmitted to the Crazyflie, which was first showcased at Makerfaire Berlin 2016 (blogpost, video) at the booth itself. Eventually, a live demo was given at FOSDEM 2017 in the actual devroom for Embedded, Mobile, and Automotive (talk page). The Flowdeck was also in development at that time, and we had a small tabletop demo at Makerfaire Shenzhen 2017, where people could press a button, and the Crazyflie would take off, fly a circle, and land again (blogpost, video).

2017 – 2019: Academic Robotics Conferences

From 2017, we made it a habit to also meet with our research users, so we started going to academic robotics conferences as well, starting with ICRA 2017 in Singapore. Here, we showcased the Loco Positioning System, where the positioning was estimated onboard, so no external computer was required to perform the calculations (blogpost, video).

At IROS 2018, we took it up a notch by joining our collaborator Qualisys, showcasing the Loco Positioning System for a swarm, Motion Capture-based localization, and the brand new Lighthouse positioning prototype (blogpost 1, blogpost 2). We added autonomous charging to it as well, so it was a great deal of work! Maybe we took on a bit too much, but one thing is for sure: we learned a lot by doing it (blogpost 1, blogpost 2, video)! With ICRA and IROS 2019, we perfected the circling swarm demo so that it was fully autonomous. However, this time we only used the Lighthouse positioning system since it was a bit easier to set up (blogpost 1, blogpost 2, video). The computer still had to command which Crazyflie to start flying, but other than that, we didn’t have to mind it much and had plenty of time to talk with the users.

2020 – 2022: Covid and the Home Lab

As everyone knows—and probably tries to forget—2020 was the year that Covid hit us hard, and we couldn’t travel anywhere anymore. For us, it was quite an adjustment period, as we had to find another type of motivation to keep moving forward and continue development. We introduced the concept of the home lab and gave online talks and tutorials to still show cool stuff with the Crazyflie to the world (blogpost, video).

In 2020, we all joined together to work on the Hyper demo, which was a showcase that demonstrated the Crazyflie could fly with three positioning systems at the same time, enabling it to fly all the way from the meeting room to the flight arena (blogpost, video). We also celebrated Bitcraze’s 10-year anniversary with the BAM Days, a full 3-day online seminar about all things Crazyflie, for which we and our collaborators prepared a whole range of different demos, including a Rust-based app layer example and a peer-to-peer onboard swarming example (blogpost).

2022 – now: Back to conferences

At the end of 2022, we managed to go to fairs again, namely IMAV and IROS 2022, where we showcased the fully autonomous swarm demo as before Covid hit. However, due to the demos we conducted during Covid, we also added full onboard peer-to-peer communication. This enabled the Crazyflies to negotiate which Crazyflie could take off, which pretty much completely eliminated the need for an external computer. Moreover, the Crazyflies communicated their positions to each other, which made it possible for them to avoid collisions on the fly (blogpost, video).

We have shown this demo at ICRA 2023 in London (blogpost) and ICRA 2024 in Yokohama (blogpost) as well, with different variations and the upcoming brushless version (blogpost). The demo is quite robust, and it has been great for learning about the quality of the new motors and props, the guard prototypes of the Crazyflie Brushless, and the flight stability. But as you know us by now, it is time for something different!

Soon – ROSCon 2024

We have been to ROSCon before, back in 2022 (blogpost), but now we will be going to ROSCon 2024 for the first time as exhibitors (blogpost). ROS is a framework that is used by many researchers, including our users through Crazyswarm2, but ROSCon is more developer-oriented, and there will be more companies present that focus on industry rather than academia. This time we won’t show our swarm demo as we usually do; instead, we will show demos more in line with what was presented in the ROS skill-learning session of the Robotics Developer Day (blogpost, video), and we will be hacking around on the spot! So this will be something new for us to try out, and we are very much looking forward to it!

Developer meeting, 9th of October 2024

This blog post only represents a subset of demos that we have done, but we will go into further detail at the next developer meeting on Wednesday, the 9th of October, at 3 PM CEST! Please join us to learn about all the great demos we have done in the past, get a glimpse of the history of Bitcraze, and discuss why demo-driven development is so important in moving your development forward.

Check for information on how to join the meeting here on discussions: https://github.com/orgs/bitcraze/discussions/1565

See you there!

As you might expect, we use the Crazyflie Python client a lot at Bitcraze. The client has a lot of features, ranging from setting up LPS/Lighthouse systems to turning on/off the headlight LEDs on the LED-ring deck. But some of the features we use the most are probably the console view and the logging/parameter subsystems. A lot of the time we modify firmware, flash it, and want to tweak it (via parameters) or check that the changes are working as expected (via the console or logging). Switching from the terminal, where we build and flash, to the Qt UI in the Crazyflie Python client can be a hassle if you just want to do something quick. It would be great to be able to log/set variables directly from the terminal. Well, now you can!

Meet the Crazyflie command-line client, a fun Friday project I worked on a while back. The CLI is written in Rust and was made possible thanks to a previous fun Friday project by Arnaud on the Rust Crazyflie link/lib, which has now moved to the official Bitcraze repositories. The CLI project is still very limited, but has some basic functionality:

  • Scan for Crazyflies and pre-select one to interact with
  • List loggable variables, create log configurations and print their value
  • List parameters and get/set them
  • Show the Crazyflie console

Last week the first version, v0.1.0, was released on crates.io. So if you have Rust set up on your computer and want to test it out, all you need to do is type “cargo install cfcli”. The CLI still only has some basic functionality, but hopefully it can be expanded in the future with more useful things! Feel free to leave any issues or comments you might have on the GitHub page.
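
If you’d rather script this kind of quick check in Python instead, here is a minimal sketch using cflib (the URI, log variable, and parameter below are examples; adjust them to your setup):

# Minimal sketch: print a log variable and set a parameter with cflib.
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.log import LogConfig
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.crazyflie.syncLogger import SyncLogger

cflib.crtp.init_drivers()

with SyncCrazyflie('radio://0/80/2M/E7E7E7E7E7', cf=Crazyflie()) as scf:
    # Print ten battery voltage samples
    log_conf = LogConfig(name='battery', period_in_ms=100)
    log_conf.add_variable('pm.vbat', 'float')
    with SyncLogger(scf, log_conf) as logger:
        for i, (timestamp, data, _) in enumerate(logger):
            print(timestamp, data['pm.vbat'])
            if i >= 9:
                break

    # Set a parameter (example: select the PID controller)
    scf.cf.param.set_value('stabilizer.controller', '1')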

You might remember that at the beginning of this summer, we were invited to do a skill-learning session with the Crazyflie at the Robotics Developer Day 2024 (see this blog post) organized by The Construct. We showed the Crazyflie flying with the multi-ranger deck, capable of mapping the room in both simulation and the real world. Moreover, we demonstrated this with both manual control and autonomous wall-following. Since then, we wanted to make some improvements to the simulation. We now present an updated tutorial on how to do all of this yourself on your own machine.

This tutorial will focus on using the multi-ranger ROS 2 nodes for both mapping and wall-following in simulation first, before trying it out on the real thing. You will be able to tune settings to your specific environment in simulation first and then use exactly the same nodes in the real world. That is one of the main strengths of ROS, providing you with that flexibility.

We have made a video of what to expect from the tutorial; use this blog post for the more detailed instructions.

Watch this video first and then again with the instructions below

What do you need first?

You’ll need to set up some things on the PC first and acquire hardware to follow this tutorial in full:

PC preparation

You’ll need to install ROS 2 and the Gazebo simulator, both maintained by the Open Robotics foundation, on an Ubuntu machine.

Hardware

You’ll need at least the components of the STEM ranging bundle.

If your computer setup is different, that is okay, as the demos should be simple enough to work; just be prepared to handle some warnings or errors that this tutorial has not covered.

Time to complete:

This is an approximation of how much time you need to complete this tutorial, depending on your skill level, but if you already have experience with both ROS 2/Gazebo and the Crazyflie, it should take about 1 hour.

If this is your first time using the Crazyflie, it would probably be a good idea to go through the getting started tutorial and connect to it with the cfclient with the Flowdeck and Multi-ranger deck attached, as a sanity check that everything is working before jumping into ROS 2 and Gazebo.

The same holds for ROS 2! It would be handy to go through the ROS 2 Humble beginner tutorials before starting.

1. Installation

This section will install 4 packages: crazyflie-simulation, crazyflie_ros2_multiranger, ros_gz_crazyflie, and Crazyswarm2.

Make the workspaces for both the simulation and ROS 2. You can use a different directory for this if you prefer:

mkdir ~/crazyflie_mapping_demo
cd crazyflie_mapping_demo
mkdir simulation_ws
mkdir ros2_ws
cd ros2_ws
mkdir src

Let’s clone the repositories into their right locations, starting with the simulation:

cd ~/crazyflie_mapping_demo/simulation_ws
git clone https://github.com/bitcraze/crazyflie-simulation.git

Then navigate to the ROS2 workspace source folder and clone 3 projects:

cd ~/crazyflie_mapping_demo/ros2_ws/src
git clone https://github.com/knmcguire/crazyflie_ros2_multiranger.git
git clone https://github.com/knmcguire/ros_gz_crazyflie
git clone https://github.com/IMRCLab/crazyswarm2 --recursive

First, install the required apt packages and pip libraries (you might want to create a Python virtual environment for the latter):

sudo apt-get install libboost-program-options-dev libusb-1.0-0-dev python3-colcon-common-extensions
sudo apt-get install ros-humble-motion-capture-tracking ros-humble-tf-transformations
sudo apt-get install ros-humble-ros-gzharmonic ros-humble-teleop-twist-keyboard
pip3 install cflib transforms3d

Also follow the instructions in this guide to give the proper rights to the Crazyradio 2.0, but if this is your first time working with the Crazyradio 2.0, follow this tutorial first.

Go to the ros2_ws workspace and build the packages:

cd  ~/crazyflie_mapping_demo/ros2_ws/
source /opt/ros/humble/setup.bash
colcon build --cmake-args -DBUILD_TESTING=ON

Building will take a few minutes. Crazyswarm2 in particular will show a lot of warnings and stderr output, but unless a package build has ‘failed’, just ignore it for now until we have proposed a fix to that repository.

If all the packages build without failures, please continue to the next step!

2. Simple mapping simulation

This section will explain how to create a simple 2D map of your environment using the multi-ranger. The ROS 2 package designed for this is specifically made for the multi-ranger, but it should be compatible with NAV2 if you’d like. However, for now, we’ll focus on a simple version without any localization inferred from the map.
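
To give an idea of what such a simple mapper does conceptually, each multi-ranger ray marks the grid cells it passes through as free and the cell at the measured distance as occupied. Here is an illustrative sketch (not the actual node’s code; the grid size and resolution are arbitrary example values):

import numpy as np

GRID = 100   # 100 x 100 cells
RES = 0.1    # 10 cm per cell
grid = np.full((GRID, GRID), -1, dtype=np.int8)  # -1 unknown, 0 free, 1 occupied

def update_map(x, y, yaw, ranges):
    # x, y, yaw: drone pose; ranges: {angle offset: measured distance}
    for angle, dist in ranges.items():
        a = yaw + angle
        # Walk along the ray in small steps, marking free space
        for d in np.arange(0.0, dist, RES):
            cx = int((x + d * np.cos(a)) / RES) + GRID // 2
            cy = int((y + d * np.sin(a)) / RES) + GRID // 2
            if 0 <= cx < GRID and 0 <= cy < GRID and grid[cy, cx] != 1:
                grid[cy, cx] = 0
        # Mark the cell at the measured distance as occupied
        ox = int((x + dist * np.cos(a)) / RES) + GRID // 2
        oy = int((y + dist * np.sin(a)) / RES) + GRID // 2
        if 0 <= ox < GRID and 0 <= oy < GRID:
            grid[oy, ox] = 1

# Example: drone at the origin facing +x, front sensor sees a wall at 1.5 m
update_map(0.0, 0.0, 0.0, {0.0: 1.5, np.pi / 2: 2.0})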

Open up a terminal, which needs to be sourced for both the Gazebo model and the newly built ROS 2 packages:

source ~/crazyflie_mapping_demo/ros2_ws/install/setup.bash
export GZ_SIM_RESOURCE_PATH="/home/$USER/crazyflie_mapping_demo/simulation_ws/crazyflie-simulation/simulator_files/gazebo/"

First, let’s be safe and start with the simulation. Start up the ROS 2 launch file with:

ros2 launch crazyflie_ros2_multiranger_bringup simple_mapper_simulation.launch.py

If you get a ‘No such file or directory’ error on the model, try entering the full path in the GZ_SIM_RESOURCE_PATH export.

Gazebo will start with the Crazyflie in the center. You can get a close-up of the Crazyflie by right-clicking it in the Entity tree and pressing ‘Move to’. You can also choose to follow it, but the camera tracking feature of Gazebo needs some tuning to track something as small as the Crazyflie. Additionally, you will see RVIZ starting with the map view and transforms preconfigured.

Open up another terminal, source the installed ROS 2 distro and open up the ROS 2 teleop keyboard node:

source /opt/ros/humble/setup.bash
ros2 run teleop_twist_keyboard teleop_twist_keyboard

Have the Crazyflie take off with ‘t’ on your keyboard, and rotate it around with the teleop instructions. In RVIZ you should see the map being created and the transform of the Crazyflie moving. You should see something like the picture below, and this part of the video.

Screenshot of the Crazyflie in Gazebo generating a map with Teleop (video)

3. Simple mapping real world

Now that you got the gist of it, let’s move to the real Crazyflie!

First, if you have a different URI for the Crazyflie to connect to, change the config file ‘crazyflie_real_crazyswarm2.yaml’ in the crazyflie_ros2_multiranger repository. This is the file that Crazyswarm2 uses to know which Crazyflie to connect to.

Open up the config file in gedit or your favorite IDE, like Visual Studio Code:

gedit ~/crazyflie_mapping_demo/ros2_ws/src/crazyflie_ros2_multiranger/crazyflie_ros2_multiranger_bringup/config/crazyflie_real_crazyswarm2.yaml

and change the URI on this line specifically to the URI of your Crazyflie if necessary. Note that you need to rebuild ros2_ws (re-run the colcon build from step 1) for the change to take effect.

Now run the ROS 2 launch file of the simple mapper example for the real-world Crazyflie:

source ~/crazyflie_mapping_demo/ros2_ws/install/setup.bash
ros2 launch crazyflie_ros2_multiranger_bringup simple_mapper_real.launch.py

Now open up another terminal, source ROS 2 and open up teleop:

source /opt/ros/humble/setup.bash
ros2 run teleop_twist_keyboard teleop_twist_keyboard

Same thing, have the Crazyflie take off with ‘t’, and control it with the instructions.

You should be able to see this on your screen, which you can also check with this part of the video.

Screenshot of the real Crazyflie mapping while being controlled with ROS 2 teleop (video)

Make the Crazyflie land again with ‘b’, and now you can close the ROS 2 node in the launch terminal with ctrl + c.

4. Wall following simulation

Previously, you needed to control the Crazyflie yourself to create the map, but what if you could let the Crazyflie do it on its own? The `crazyflie_ros2_multiranger` package includes a `crazyflie_ros2_multiranger_wall_following` node that uses laser ranges from the multi-ranger to perform autonomous wall-following. Then, you can just sit back and relax while the map is created for you!

Let’s first try it in simulation, so open up a terminal and source it if you haven’t already (see the Simple mapping simulation section). Then launch the wall follower ROS 2 launch file:

ros2 launch crazyflie_ros2_multiranger_bringup wall_follower_mapper_simulation.launch.py

Take off and wall following will go fully automatically. The simulated Crazyflie in Gazebo will fly forward, stop when it sees a wall with its forward range sensor, and follow the wall on its left-hand side.

You’ll see in RViz when the full map is created, like in the screenshot below and this part of the tutorial video.

Screenshot of the simulated Crazyflie in Gazebo mapping while autonomously wall following (video)

You can stop the simulated Crazyflie with the following service call in another terminal sourced with ROS 2 Humble:

ros2 service call /crazyflie/stop_wall_following std_srvs/srv/Trigger

The simulated Crazyflie will stop wall following and land. You can also just close the simulation, since nothing can go wrong here.

5. Wall following real world

Now that we have demonstrated that the wall-following works in simulation, we feel confident enough to try it in the real world this time! Make sure you have a fully charged battery, place the Crazyflie on the floor facing the direction you’d like the positive x-axis to be (which is also where it will fly first), and turn it on.

Make sure that you are flying in a room with clearly defined walls and corners, or make something out of cardboard, such as a mini maze; the current algorithm is optimized to fly in a squarish room.

Source the ROS 2 workspace like before and start up the wall follower launch file for the real Crazyflie:

ros2 launch crazyflie_ros2_multiranger_bringup wall_follower_mapper_real.launch.py

Like the simulated Crazyflie, the real Crazyflie will take off and start wall following automatically, so it is important that it starts out flying towards a wall. It should look like this screenshot, or you can check it with this part of the video.

The real Crazyflie wall following autonomously while mapping the room (video).

Be careful here not to accidentally run this script with the Crazyflie sitting on your desk!

If you’d like the Crazyflie to stop, don’t stop the ROS 2 nodes with ctrl-c, since it will continue flying until it crashes. Unfortunately, it’s not like the simulation, where you can just close the environment and nothing will happen. Instead, use the ROS 2 service made for this in a different terminal:

ros2 service call /crazyflie_real/stop_wall_following std_srvs/srv/Trigger

Similarly, the real Crazyflie will stop wall following and land. Now you can close the ROS 2 terminals and turn off the Crazyflie.

Next steps?

We don’t have any more demos to show but we can give you a list of suggestions of what you could try next! You could for instance have multiple Crazyflies mapping together like in the video shown here:

This uses the mapMergeForMultiRobotMapping-ROS2 external project, which is combined with Crazyswarm2 with this launch file gist. Just keep in mind that, currently, it would be better to use a global positioning system here, such as the Lighthouse positioning system used in the video. Also, if you’d like to try this out in simulation, you’ll need to ensure different namespaces for the Crazyflies, which the current simulation setup may not fully support.

Another idea is to connect the Nav2 stack instead of the simple mapper. There are a couple of instructions in the Crazyswarm2 ROS 2 tutorials, so you can use those as a reference. Check out the video below.

Moreover, if you are having difficulties setting up your computer, I’d like to remind you that the skill-learning session we conducted for Robotics Developer Day was entirely done using a ROSject provided by The Construct, which also allows direct connection with the Crazyflie. The only requirement is that you can run Crazyswarm2 on your local machine, but that should be feasible. See the video of the original Robotics Developer Day skill-learning session here:

The last thing to know is that the ROS 2 nodes in this tutorial run ‘offboard,’ so not on the Crazyflie itself. However, do check out the Micro-ROS examples for the Crazyflie by eProsima whenever you have the time and would like to challenge yourself with embedded development.

That’s it, folks! If you are running into any issues with this tutorial or want to bounce some cool ideas to try yourself, start a discussion thread on https://discussions.bitcraze.io/.

Happy hacking!

We are excited to announce the release of our new PID Tuning Guide! This guide is designed to help users understand and apply the basics of PID tuning within our ecosystem, making it easier to achieve stable and responsive flight for your Crazyflie. This guide is particularly useful if you’ve modified your drone, such as adding expansion decks or changing its motor and/or propeller configuration. While our default tuning is designed to work in a wide range of situations and configurations, fine-tuning your PID settings can enhance performance for your specific setup and flight profile.

Interface with tuning toolbox and plotter displaying the roll angle setpoint and the roll angle state estimate.

What’s in the guide?

The guide covers essential topics, including:

  • Fundamental PID Concepts: Understand the role of Proportional, Integral, and Derivative parameters in controlling your Crazyflie’s movements.
  • Step-by-Step Instructions: Learn how to set up your software and use cfclient for tuning.
  • Practical Tuning Tips: Get insights on adjusting PID gains, using the tuning toolbox, and conducting safe manual flight tests.

Why this guide is useful

Even though this guide focuses on the basics, it provides a solid foundation for anyone new to PID tuning. Whether you’re using the Crazyflie 2.1, Crazyflie 2.0, or a custom-built quadcopter with the Crazyflie Bolt, this guide will help you:

  • Understand how PID controllers work and why they are important.
  • Use the cfclient for PID tuning within our ecosystem.

Safety first

We prioritize safety in our guide. Always secure your quadcopter in a safe environment, use protective gear, and configure an emergency stop on your controller to ensure a safe tuning process.

Get started with PID tuning today!

Ready to improve your quadcopter’s flight performance? Check out our PID Tuning Guide and start tuning.

This week, we have a guest blog post from Scott at Droneblocks.

DroneBlocks is a cutting-edge platform that has transformed how educators worldwide enrich STEM programming in their classrooms. As pioneers in the EdTech space, DroneBlocks wrote the playbook on integrating drone technology into STEM curriculum for elementary, middle, and high schools, offering unparalleled resources for teaching everything from computer science to creative arts. What started as free block coding software and video tutorials has become a comprehensive suite of drone and robotics educational solutions. The Block-Coding software still remains free to all, as the DroneBlocks mission has always been to empower educators and students, allowing them to explore and lead the way. This open-source attitude set DroneBlocks on a mission to find the world’s best and most accessible micro-drone for education, and they found it in Sweden!

Previously, DroneBlocks had worked alongside drone juggernaut DJI and their Tello Drone. The Tello was a great tool for its time, but when DJI decided to discontinue it with little input from its partners and users, it made the break much easier. The hunt began for a DJI Tello replacement and an upgrade!

Bitcraze’s choice to build the Crazyflie as an open platform had their drone buzzing wherever there was curiosity. The Crazyflie was developed to fly indoors, swarm, and be mechanically simple. DroneBlocks established that the ideal classroom micro-drone required similar characteristics. This micro-drone needed to be small for safety but sturdy for durability. It also needed to be easy to assemble and simple in structure for students new to drones. Most importantly, the ideal drone needed to have an open line of software communication to be fully programmable. Finally, there had to be an opportunity for a long-lasting partnership with the drone manufacturer, including government compliance.

After extensive searching and testing by DroneBlocks, the Crazyflie was a diamond in the rough – bite-sized and lightweight, supremely agile and accurate, reliable and robust, and most importantly, an open-source development platform. The DroneBlocks development team took the Crazyflie for a spin (or several) and, with excitement, shared it with the larger curriculum team to be mined for learning potential. It was promising to see the Crazyflie’s involvement in university-level research studies, which proved it meant business. DroneBlocks knew the Crazyflie had a lot going for it – on its own. The team imagined how, when paired with DroneBlocks’ Block Coding software, Flight Simulator, and Curriculum Specialists, the Crazyflie could soar to atmospheric heights!

Hardware? Check. Software? Check. But what about compatibility? DroneBlocks was immediately drawn to the open communication and ease of conversation with the Bitcraze team. It was obvious that both Bitcraze and DroneBlocks were born from a common thread and shared a mutual goal: to empower people to explore, investigate, innovate, research, and educate. 

DroneBlocks has since built a new Block Coding interface around the Crazyflie, allowing students to pilot their new drone autonomously and learn the basics of piloting and coding concepts. This interface is offered with a brand new drone coding simulator environment so students can test their code and fly the Crazyflie in a virtual classroom environment.

The Crazyflie curriculum currently consists of courses covering building, configuring, and finally, programming your drone with block coding (DroneBlocks) and Python. DroneBlocks’ expert curriculum team designed these courses to enable learners of all ages and levels to learn step by step through video series and exercises. New courses around block coding and Python are in constant development and will be continuously added to the DroneBlocks curriculum platform.

Crazyflie Drones now headline DroneBlocks’ premiere classroom launch kit. The DroneBlocks Autonomous Drones Level II kit encompasses everything a middle or high school would need to launch a STEM drone program, including the hardware, necessary accessories, and safety wear paired with the DroneBlocks software and curriculum. As a result, thousands of new students have entered the world of Drones and programming thanks to the Bitcraze + DroneBlocks partnership.

DroneBlocks has become an all-inclusive drone education partner for engaging and innovative learning experiences—and the Crazyflie delivers this by being a cutting-edge piece of hardware in a clever package.

Welcome to “The Beginner’s Guide to Drones”, for programmers curious about exploring the world of unmanned aerial vehicles (UAVs). If you’re a coder from another field, this guide will walk you through the basics of drones, their components, and how to start programming them. Let’s dive in and see how your coding skills can take flight!

If you come from an engineering field, you might already know the basics of some of these topics; however, you might still find the overview useful and can use the resources to gain more specific knowledge.

The Robotics part

First and foremost, you’ll need some basic robotics skills.
We start off with the most basic question: “What is a robot?” A robot uses sensors to create an internal model of its environment, and actuators to act on/in its environment. The specifics of the internal model depend on the robot’s purpose, but a crucial component is understanding its location and orientation within that environment.

Linear algebra basics

To understand how a quadcopter perceives its environment and its own position, you’ll need some basic skills in linear algebra, particularly in matrices, vectors, and frame rotation. These skills are essential for comprehending the mathematical principles behind quadcopter navigation.
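
As a tiny example of the kind of math involved, here is how a velocity measured in the drone’s body frame can be expressed in the world frame with a rotation matrix (a numpy sketch using yaw only; a real drone rotates about all three axes):

import numpy as np

yaw = np.radians(30.0)  # drone rotated 30 degrees about the vertical axis

# 2D rotation matrix from the body frame to the world frame (yaw only)
R = np.array([[np.cos(yaw), -np.sin(yaw)],
              [np.sin(yaw),  np.cos(yaw)]])

v_body = np.array([1.0, 0.0])  # 1 m/s straight ahead, in the body frame
v_world = R @ v_body           # the same velocity seen from the world frame
print(v_world)                 # approximately [0.866, 0.5]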

To build an internal model of its own movement and orientation, an Inertial Measurement Unit (IMU) is used. An IMU consists of a gyroscope, an accelerometer, and sometimes a magnetometer. These sensors, when combined using sensor fusion techniques (see “Control theory” below), help determine the quadcopter’s angular velocity and linear acceleration. This data allows the drone to calculate its orientation and movement.

Picture of a Crazyflie in relation to a reference frame from Bitcraze.io

For a detailed understanding of these processes, please refer to the following resources:
Visual description of frame transformations;
https://www.youtube.com/watch?v=kYB8IZa5AuE
Basic coordinate transformations;
https://motion.cs.illinois.edu/RoboticSystems/CoordinateTransformations.html

Positioning techniques

The quadcopter can now determine its relationship to its starting position and the gravitational field. However, relying solely on an IMU tends to cause drift over time. Imagine trying to stand on one leg with your eyes closed—eventually, you’ll lose balance.

For improved stability, a drone often needs additional sensors, such as a camera, to help stabilize its position. Other sensor systems can also be used to determine relative or absolute position. While an IMU can sense changes in position relative to a starting point, an external positioning system is necessary for stability and obtaining absolute positions. This system acts as a reference frame for the drone.

Drones flying outdoors typically use GPS combined with RTCM corrections, since it is available almost anywhere, easy to use, and has centimeter-level accuracy.

For indoor use, as with Crazyflies, the default positioning system is a motion-capture system, but there are others as well. This area is at the cutting edge of science, with new technologies emerging constantly. Many effective systems are available, though they may have constraints regarding power efficiency, flight-area size, update rate, or precision.

This is an image of some common techniques for positioning: a) AoA, b) ToA/TWR, c) TDoA. Taken from ResearchGate.

Control theory

Now that the drone can understand its position and orientation in space, the next step is figuring out how to move within this space. Moving from point A to point B involves setting a “setpoint” and then determining how to use the drone’s actuators to reach this setpoint most efficiently. This is where control theory comes into play.

Drones generally use some sort of feedback control system, which in its most basic form looks something like this:

A very basic overview of a system with feedback control

In this system, the error (the difference between the current position and the setpoint) is fed back into the system to ensure the drone moves in a way that minimizes the error over time.

Various algorithms can calculate the best actuator output based on the error and the current state. One of the most fundamental is the PID controller, which works well for linear systems. Understanding PID controllers requires some basic calculus, but the concept is straightforward (see the feedback-control resource at the end of this section for a simple explanation).
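
As a minimal illustration, a discrete PID controller fits in a few lines of Python (a generic sketch with made-up example gains, not the Crazyflie firmware’s implementation):

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement, dt):
        # The error drives all three terms: present (P), past (I), future (D)
        error = setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: correcting the roll angle towards a 10-degree setpoint at 500 Hz
pid = PID(kp=6.0, ki=3.0, kd=0.35)
output = pid.update(setpoint=10.0, measurement=7.5, dt=0.002)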

For IMUs, there is a particularly useful filter to know about, given its widespread use. The accelerometer and gyroscope each have their own profiles of noise and drift. The accelerometer is sensitive to short-term noise, while the gyroscope drifts slowly over time. To mitigate these issues, a combination of both measurements is often used. The complementary filter is ideal for this situation, leveraging the strengths of both sensors to correct the measurements effectively.

More information on how to use the complementary filter for IMUs can be found here:
https://www.hackster.io/hibit/complementary-filter-and-relative-orientation-with-mpu9250-d4f79d
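
In code, the complementary filter boils down to a single line per update (a sketch; alpha is typically chosen close to 1):

ALPHA = 0.98  # trust the gyro on short timescales, the accelerometer long-term

def complementary_filter(angle, gyro_rate, accel_angle, dt):
    # Blend the gyro-integrated angle with the accelerometer's angle estimate
    return ALPHA * (angle + gyro_rate * dt) + (1.0 - ALPHA) * accel_angle

# Example update at 500 Hz: gyro reads 2 deg/s, accelerometer estimates 1.0 deg
angle = complementary_filter(angle=0.9, gyro_rate=2.0, accel_angle=1.0, dt=0.002)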

For more complex scenarios, advanced controllers like Kalman filters and others can be used. It’s also possible to combine multiple controllers to achieve better performance.

Basic overview of feedback control systems;
https://control.com/textbook/closed-loop-control/basic-feedback-control-principles/

The flying part

Now let’s get into the exciting part: FLYING!!!

Actuators

Drone actuators, primarily consisting of motors and propellers, are critical for controlling a drone’s movement and stability. The motors and propellers are typically called the “drive train” or “power train”.
The motors used on drones are usually brushed or brushless DC motors. Propellers are attached to the motors and generate lift by pushing air downwards. The size, shape, and pitch of the propellers affect the drone’s performance, including speed, lift, and maneuverability. Together, the precise control of motors and propellers enables a drone to perform complex maneuvers, maintain stability, and achieve efficient flight.

Pictures of forces acting on a drone in flight


This is an almost-all-you-need guide to get an overview of drone flight dynamics:
https://dronstechnology.com/the-physics-of-drone-flight-lift-thrust-drag-and-weight/
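
To make the motor/propeller interplay concrete, here is how the four motor commands of a quadcopter in an X configuration are often mixed from the desired thrust and the roll/pitch/yaw corrections computed by the controllers above (a generic sketch; sign conventions differ between firmwares):

def mix_x_configuration(thrust, roll, pitch, yaw):
    # Each motor gets the common thrust plus or minus each axis correction,
    # depending on which corner of the frame it sits on and its spin direction.
    m1 = thrust - roll + pitch + yaw  # front right
    m2 = thrust - roll - pitch - yaw  # rear right
    m3 = thrust + roll - pitch + yaw  # rear left
    m4 = thrust + roll + pitch - yaw  # front left
    return m1, m2, m3, m4

Because diagonal pairs of propellers spin in opposite directions, yaw torque can be commanded without changing the total thrust.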

Stock information

As of now the Crazyflie 2.1 is out of stock, unfortunately. They’re expected back in stock around August 20th – 4 weeks from now. You can sign up in our shop to be notified as soon as they arrive!
https://store.bitcraze.io/collections/kits/products/crazyflie-2-1?variant=19575412719703

Today we welcome Sam Schoedel and Khai Nguyen from Carnegie Mellon University. Enjoy!

We’re excited to share the research we’ve been doing on model-predictive control (MPC) for tiny robots! Our goal was to find a way to compress an MPC solver to a size that would fit on common microcontrollers like the Crazyflie’s STM32F405 while being fast enough to control the higher frequency dynamics of smaller robots. We came up with a few tricks to make that happen and dubbed the resulting solver TinyMPC. When it came time for hardware experiments, using the Crazyflie just made sense. A tiny solver deserves a tiny robot.

Motivation

Model predictive control is a powerful tool for controlling complex systems, but it is computationally expensive and thus often limited to use cases where the robot can either carry enough computational power or when offboard computing is available. The problem becomes challenging to solve for small robots, especially when we want to perform all of the computation onboard. Smaller robots have inherently faster dynamics which require higher frequency controllers to stabilize, and because of their size they don’t have the capacity to haul around as much computational power as their larger robot counterparts. The computers they can carry are often highly memory-constrained as well. Our question was “how can we shrink the computational complexity and memory costs of MPC down to the scale of tiny robots?”

What We Did

We settled on developing a convex model predictive control solver based on the alternating direction method of multipliers. Convex MPC solvers are limited to reasoning about linear dynamics (on top of any other convex constraints), but have structure that TinyMPC exploits to solve problems efficiently. The tricks we used to achieve this efficiency are described in the paper, but it boils down to rewriting the problem as a constrained linear-quadratic regulator to reduce the memory footprint and then precomputing as many matrices as possible offline so that online calculations are less expensive. These tricks allowed us to fit long-time horizon MPC problems on the Crazyflie and solve them fast enough for real-time use.
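
In its standard form, the kind of problem solved at every control step looks like this (a generic convex MPC formulation written here for illustration; see the paper for the exact problem TinyMPC solves):

\min_{x_{1:N},\,u_{1:N-1}} \;\; \tfrac{1}{2}\tilde{x}_N^\top Q_f \tilde{x}_N + \sum_{k=1}^{N-1} \left( \tfrac{1}{2}\tilde{x}_k^\top Q \tilde{x}_k + \tfrac{1}{2}u_k^\top R u_k \right) \quad \text{s.t.} \;\; x_{k+1} = A x_k + B u_k, \;\; x_k \in \mathcal{X}, \;\; u_k \in \mathcal{U},

where \tilde{x}_k = x_k - x_k^{\mathrm{ref}} is the deviation from the reference trajectory, A and B are the linearized dynamics, and \mathcal{X}, \mathcal{U} are convex sets encoding state and input constraints (such as the obstacle hyperplanes and thrust limits below).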

What TinyMPC Can Do

We decided to demonstrate the constraint-handling capabilities of TinyMPC by having the Crazyflie avoid a dynamic obstacle. We achieved this by re-computing hyperplane constraints (green planes in the first video) about a spherical obstacle (transparent white ball) for each knot point in the trajectory at every time step, and then by solving the problem with the new constraints assuming they stayed fixed for the duration of the solve.
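
The hyperplane for each knot point can be computed from the predicted position and the obstacle center; here is a sketch of the general idea (a hypothetical helper, not the authors’ code):

import numpy as np

def obstacle_hyperplane(x_pred, center, radius):
    # Outward normal from the obstacle center towards the predicted position
    a = (x_pred - center) / np.linalg.norm(x_pred - center)
    # Offset such that the plane is tangent to the sphere's surface:
    # the constraint a @ x >= b keeps the knot point outside the sphere
    b = a @ (center + radius * a)
    return a, b

# Example: a knot point near an obstacle at the origin with a 0.3 m radius
a, b = obstacle_hyperplane(np.array([0.5, 0.2, 0.4]), np.zeros(3), 0.3)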

In the two videos below, the reference trajectory used by the solver is just a hover position at the origin for every time step. Also, the path the robot takes in the real world will never be exactly the same as the trajectory computed by the solver, which can easily result in collisions. To avoid this, we inflated the end of the stick (and the simulated obstacle) to act as a keep-out region.

TinyMPC is restricted to reasoning about linear dynamics because of its convex formulation. However, a simple linearization can be taken pretty far. We experimented with recovering from different starting conditions to push the limits of our linear Crazyflie model and were able to successfully recover from a 90 degree angle while obeying the thrust commands for each motor.

We recently added support for second-order cone constraints as well. These types of constraints allow TinyMPC to reason about friction and thrust cones, for example, which means it can now intelligently control quadrupeds on slippery surfaces and land rockets. To clearly demonstrate the cone constraint, we took long exposure photos of the Crazyflie tracking a cylindrical landing trajectory without any cone constraints (red) and then with a spatial cone constraint that restricts the landing maneuver to a glide slope (blue).

How To Use TinyMPC

All of the information regarding the solver can be found on our website and GitHub org (which is where you can also find the main GitHub repository). TinyMPC currently has a Python wrapper that allows for validating the solver and generating C++ code to run on a robot, and we have a few examples in C++ if you don’t want to use Python. Our website also explains how to linearize your robot and has some examples for setting up the problem with a linear model, solving it in an MPC loop, and then generating and running C++ code.

Most importantly to the Crazyflie community, our TinyMPC-integrated firmware is available and should work out of the box. Let us know if you use it and run into issues!

Our accompanying research papers:

Khai Nguyen, Sam Schoedel, Anoushka Alavilli, Brian Plancher, and Zachary Manchester. “TinyMPC: Model-Predictive Control on Resource-Constrained Microcontrollers.” arXiv preprint arXiv:2310.16985 (2023). https://arxiv.org/pdf/2310.16985

Sam Schoedel, Khai Nguyen, Elakhya Nedumaran, Brian Plancher, and Zachary Manchester. “Code Generation for Conic Model-Predictive Control on Microcontrollers with TinyMPC.” arXiv preprint arXiv:2403.18149 (2024). https://arxiv.org/pdf/2403.18149

We would love your feedback and suggestions, and let us know if you use TinyMPC for your tiny platforms!