
Today we welcome Sam Schoedel and Khai Nguyen from Carnegie Mellon University. Enjoy!

We’re excited to share the research we’ve been doing on model-predictive control (MPC) for tiny robots! Our goal was to find a way to compress an MPC solver to a size that fits on common microcontrollers, like the Crazyflie’s STM32F405, while keeping it fast enough to control the higher-frequency dynamics of smaller robots. We came up with a few tricks to make that happen and dubbed the resulting solver TinyMPC. When it came time for hardware experiments, using the Crazyflie just made sense. A tiny solver deserves a tiny robot.

Motivation

Model-predictive control is a powerful tool for controlling complex systems, but it is computationally expensive and thus often limited to platforms that can either carry enough computational power onboard or rely on offboard computing. The problem becomes challenging for small robots, especially when we want to perform all of the computation onboard. Smaller robots have inherently faster dynamics, which require higher-frequency controllers to stabilize, and because of their size they can’t haul around as much computational power as their larger counterparts. The computers they can carry are often highly memory-constrained as well. Our question was: “how can we shrink the computational complexity and memory costs of MPC down to the scale of tiny robots?”

What We Did

We settled on developing a convex model-predictive control solver based on the alternating direction method of multipliers (ADMM). Convex MPC solvers are limited to reasoning about linear dynamics (on top of any other convex constraints), but this problem class has structure that TinyMPC exploits to solve it efficiently. The tricks we used to achieve this efficiency are described in the paper, but it boils down to rewriting the problem as a constrained linear-quadratic regulator to reduce the memory footprint, and then precomputing as many matrices as possible offline so that online calculations are cheaper. These tricks allowed us to fit long-horizon MPC problems on the Crazyflie and solve them fast enough for real-time use.
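Concretely, at every control step TinyMPC solves a problem of the following standard form (a schematic statement; see the paper for the exact formulation):

\[
\begin{aligned}
\min_{x_{1:N},\, u_{1:N-1}} \quad & \tfrac{1}{2}(x_N - \bar{x}_N)^\top Q_N (x_N - \bar{x}_N) + \sum_{k=1}^{N-1} \left[ \tfrac{1}{2}(x_k - \bar{x}_k)^\top Q (x_k - \bar{x}_k) + \tfrac{1}{2}(u_k - \bar{u}_k)^\top R (u_k - \bar{u}_k) \right] \\
\text{subject to} \quad & x_{k+1} = A x_k + B u_k, \qquad x_k \in \mathcal{X}, \qquad u_k \in \mathcal{U},
\end{aligned}
\]

where \(\bar{x}_k, \bar{u}_k\) form the reference trajectory, \(A\) and \(B\) are the linearized dynamics, and \(\mathcal{X}\) and \(\mathcal{U}\) are convex sets. ADMM alternates between an equality-constrained LQR solve, whose Riccati gain matrices depend only on \(A\), \(B\), \(Q\), and \(R\) and can therefore be cached offline, and an inexpensive projection onto \(\mathcal{X}\) and \(\mathcal{U}\).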

What TinyMPC Can Do

We decided to demonstrate the constraint-handling capabilities of TinyMPC by having the Crazyflie avoid a dynamic obstacle. We achieved this by re-computing hyperplane constraints (green planes in the first video) about a spherical obstacle (transparent white ball) for each knot point in the trajectory at every time step, then solving the problem with the new constraints, assuming they stay fixed for the duration of the solve.
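The sketch below illustrates that recomputation step in numpy (illustrative only, with assumed variable names; the onboard implementation operates on the solver’s internal data structures):

```python
import numpy as np

def hyperplane_constraints(knot_positions, obstacle_center, radius):
    """Build one separating hyperplane a^T p >= b per knot point so the
    plan keeps each predicted position outside the spherical obstacle."""
    A, b = [], []
    for p in knot_positions:
        # Normal pointing from the obstacle center toward the knot point.
        a = (p - obstacle_center) / np.linalg.norm(p - obstacle_center)
        # Plane tangent to the sphere of the given radius along that normal.
        A.append(a)
        b.append(a @ (obstacle_center + radius * a))
    return np.array(A), np.array(b)
```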

In the two videos below, the reference trajectory given to the solver is just a hover position at the origin for every time step. The path the robot takes in the real world will never exactly match the trajectory computed by the solver, which can easily result in collisions. To avoid this, we inflated the end of the stick (and the simulated obstacle) to act as a keep-out region.

TinyMPC is restricted to reasoning about linear dynamics because of its convex formulation. However, a simple linearization can be taken pretty far. We experimented with recovering from different starting conditions to push the limits of our linear Crazyflie model, and were able to successfully recover from a 90-degree angle while obeying the thrust limits for each motor.
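For reference, a generic way to obtain such a linear model is to finite-difference your discrete-time dynamics about hover. This is a minimal sketch, assuming you already have a simulator function f(x, u) that returns the next state:

```python
import numpy as np

def linearize(f, x_eq, u_eq, eps=1e-4):
    """Central-difference Jacobians A = df/dx, B = df/du about an
    equilibrium, giving the model x_{k+1} ~ A x_k + B u_k (in delta
    coordinates around the hover state and hover input)."""
    nx, nu = len(x_eq), len(u_eq)
    A = np.zeros((nx, nx))
    B = np.zeros((nx, nu))
    for i in range(nx):
        dx = np.zeros(nx)
        dx[i] = eps
        A[:, i] = (f(x_eq + dx, u_eq) - f(x_eq - dx, u_eq)) / (2 * eps)
    for j in range(nu):
        du = np.zeros(nu)
        du[j] = eps
        B[:, j] = (f(x_eq, u_eq + du) - f(x_eq, u_eq - du)) / (2 * eps)
    return A, B
```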

We recently added support for second-order cone constraints as well. These constraints allow TinyMPC to reason about friction and thrust cones, for example, which means it can now intelligently control quadrupeds on slippery surfaces and land rockets. To clearly demonstrate the cone constraint, we took long-exposure photos of the Crazyflie tracking a cylindrical landing trajectory without any cone constraints (red) and then with a spatial cone constraint that restricts the landing maneuver to a glide slope (blue).
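For example, a glide-slope constraint like the blue one can be expressed as a second-order cone on the position relative to the landing point (a standard formulation; the exact parameters in our experiments may differ):

\[
\left\lVert \begin{bmatrix} p_x \\ p_y \end{bmatrix} \right\rVert_2 \le \tan(\gamma)\, p_z,
\]

where \(\gamma\) is the maximum allowed glide-slope angle measured from vertical, so the drone is confined to a cone that narrows as it descends.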

How To Use TinyMPC

All of the information regarding the solver can be found on our website and GitHub organization (which is where you can also find the main repository). TinyMPC currently has a Python wrapper that allows for validating the solver and generating C++ code to run on a robot, and we have a few examples in C++ if you don’t want to use Python. Our website also explains how to linearize your robot and has examples for setting up the problem with a linear model, solving it in an MPC loop, and then generating and running C++ code.
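To give a feel for the workflow, here is a schematic of the Python side. The package, class, and method names below are assumptions for illustration, not the exact API; please check the website and repository for the real interface:

```python
import numpy as np
import tinympc  # assumed package name

# Linearized dynamics, quadratic costs, and horizon for your robot.
A = np.eye(12)                      # placeholder state matrix
B = np.zeros((12, 4))               # placeholder input matrix
Q = np.eye(12)
R = 0.1 * np.eye(4)

solver = tinympc.TinyMPC()          # assumed class name
solver.setup(A, B, Q, R, N=20)      # assumed setup signature

x = np.zeros(12)                    # current state estimate
for _ in range(1000):
    solver.set_x0(x)                # assumed initial-state setter
    solution = solver.solve()
    u = solution["controls"][:, 0]  # assumed return format: apply first input
    x = A @ x + B @ u               # stand-in for the real robot/simulator
```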

Most importantly to the Crazyflie community, our TinyMPC-integrated firmware is available and should work out of the box. Let us know if you use it and run into issues!

Our accompanying research papers:

Khai Nguyen, Sam Schoedel, Anoushka Alavilli, Brian Plancher, and Zachary Manchester. “TinyMPC: Model-Predictive Control on Resource-Constrained Microcontrollers.” arXiv preprint arXiv:2310.16985 (2023). https://arxiv.org/pdf/2310.16985

Sam Schoedel, Khai Nguyen, Elakhya Nedumaran, Brian Plancher, and Zachary Manchester. “Code Generation for Conic Model-Predictive Control on Microcontrollers with TinyMPC.” arXiv preprint arXiv:2403.18149 (2024). https://arxiv.org/pdf/2403.18149

We would love your feedback and suggestions, and let us know if you use TinyMPC for your tiny platforms!

This week’s guest blogpost is from Frederike Dümbgen, presenting her latest work from her PhD project at the Laboratory of Audiovisual Communications (LCAV), EPFL. She is currently a Postdoc at the University of Toronto. Enjoy!

Bats navigate using sound. In fact, a bat’s ears are so much better developed than its eyes that bats cope better with being blindfolded than with having their ears covered. It was precisely this experiment that led to the discovery of echolocation, the principle bats use to navigate [1]. Broadly speaking, in echolocation, bats emit ultrasonic chirps and listen for their echoes to perceive their surroundings. Since its discovery in the 18th century, astonishing facts about this navigation system have been revealed — for instance, bats vary their chirps depending on the task at hand: a chirp that’s good for locating prey might not be good for detecting obstacles, and vice versa [2]. Depending on the characteristics of the reflected echoes, bats can even classify certain objects — this ability helps them find, for instance, water sources [3]. Wouldn’t it be amazing to harness these findings to build novel navigation systems for autonomous agents such as drones or cars?

Figure 1: Meet “Crazybat”: the Crazyflie equipped with our custom audio deck including 4 microphones, a buzzer, and a microcontroller. Together, they can be used for bat-like echolocation. The design files and firmware of the audio extension deck are openly available, as is a ROS2-based software stack for audio-based navigation. We hope that fellow researchers can use this as a starting point for further pushing the limits of audio-based navigation in robotics. More details can be found in [4].

The quest for the answer to this question led us — a group of researchers from the École Polytechnique Fédérale de Lausanne (EPFL) — to design the first audio extension deck for the Crazyflie drone, effectively turning it into a “Crazybat” (Figure 1). The Crazybat has four microphones, a simple piezo buzzer, and an additional microprocessor used to extract relevant information from audio data, to be sent to the main processor. All of these additional capabilities are provided by the audio extension deck, for which both the firmware and hardware design files are openly available.1

Video 1: Proof of concept of distance/angle estimation in a semi-static setup. The drone is moved using a stepper motor. More details can be found in [4].

In our paper on the system [4], we show how to use chirps to detect nearby obstacles such as glass walls. Difficult to detect using lasers or cameras, glass walls are excellent sound reflectors and thus a good candidate for audio-based navigation. In a first semi-static feasibility study, we show that we can locate a glass wall with centimeter accuracy, even in the presence of loud propeller noise (Video 1). When moving to a flying drone and different kinds of reflectors, the problem becomes significantly more challenging: motion jitter, varying propeller noise, and tight real-time constraints make it much harder to solve. Nevertheless, first experiments suggest that sound-based wall detection and avoidance is possible (Figure 2 and Video 2).

Video 2: The “Crazybat” drone actively avoiding obstacles based on sound.
Figure 2: Qualitative results of sound-based wall localization on the flying “Crazybat” drone. More details can be found in [4].

The principle we use to make this work is acoustic interference. The sound “bounces off” the wall, and the reflected and direct sound interfere either constructively or destructively, depending on the frequency and the distance to the wall. Applying this same principle to the four microphones, both the angle and the distance of the closest wall can be estimated. This is, however, not the only way to navigate using sound; in fact, our software stack, available as an open-source package for ROS2, also allows the Crazybat to extract the phase differences of incoming sound at the four microphones, which can be used to determine the location of an external sound source. We believe that a truly intelligent Crazybat would be able to switch between different operating modes depending on the conditions, just like bats change their chirps depending on the task at hand.
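In slightly more detail, the wall-distance estimation can be captured by a simplified two-path model (see [4] for the full treatment): if the reflected path is \(\Delta\) meters longer than the direct path, the magnitude received at a microphone at frequency \(f\) is approximately

\[
|H(f)| = \left| 1 + \alpha\, e^{-j 2\pi f \Delta / c} \right|,
\]

where \(\alpha\) is the attenuation of the reflection and \(c\) is the speed of sound. The magnitude oscillates as a function of frequency, with peaks spaced \(c/\Delta\) apart, so sweeping the buzzer across frequencies and fitting the observed pattern reveals \(\Delta\) and, with it, the distance to the wall.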

Note that the ROS2 software stack is not limited to the Crazybat — we have isolated the hardware-dependent components so that the audio-based navigation algorithms can be ported to any platform. As an example, we include results on the small wheeled e-puck2 robot in [4], which shows better performance than the Crazybat thanks to the absence of propeller noise and motion jitter.

This research project has taught us many things, above all an even greater admiration for the abilities of bats! Dealing with sound is hard and very different from other prevalent sensing modalities such as cameras or lasers. Nevertheless, we believe it is an interesting alternative for scenarios with poor visibility, limited computing power, or limited memory. We hope that other researchers will join us in the quest to exploit audio for navigation, and that the tools we make publicly available — both the hardware and the software stack — lower the entry barrier for new researchers.

1 The audio extension deck works in a “plug-and-play” fashion like all other Crazyflie extension decks. It has been tested in combination with the flow deck for stable flight in the absence of a more advanced localization system. The deck performs frequency analysis on the incoming raw audio data from the four microphones and sends the relevant information to the Crazyflie, where a custom driver packages it into CRTP packets and sends it to the base station for further processing in the ROS2 stack.

References

[1] Galambos, Robert. “The Avoidance of Obstacles by Flying Bats: Spallanzani’s Ideas (1794) and Later Theories.” Isis 34, no. 2 (1942): 132–40. https://doi.org/10.1086/347764.

[2] Fenton, M. Brock, Alan D. Grinnell, Arthur N. Popper, and Richard R. Fay, eds. Bat Bioacoustics. Springer Handbook of Auditory Research. New York: Springer, 2016. https://doi.org/10.1007/978-1-4939-3527-7.

[3] Greif, Stefan, and Björn M Siemers. “Innate Recognition of Water Bodies in Echolocating Bats.” Nature Communications 1, no. 106 (2010): 1–6. https://doi.org/10.1038/ncomms1110.

[4] F. Dümbgen, A. Hoffet, M. Kolundžija, A. Scholefield and M. Vetterli, “Blind as a Bat: Audible Echolocation on Small Robots,” in IEEE Robotics and Automation Letters (Early Access), 2022. https://doi.org/10.1109/LRA.2022.3194669.

This week we have a guest blog post from Enrica Soria from the Laboratory of Intelligent Systems (LIS) at the École Polytechnique Fédérale de Lausanne (EPFL). Enjoy!

From Star Wars to Black Mirror, sci-fi movies predict a future where thousands of drones will fill our sky. Curving sharply around trees or soaring over buildings, they fly just like a flock of starlings. To turn this vision into a reality, real drone swarms need to increase their autonomy and operate in a decentralized fashion. In a decentralized swarm, each robot makes its own decisions based only on local information. Decentralization not only makes the swarm more robust to the failure of single individuals, but also removes the dependency on a single computing unit, making the swarm more scalable in terms of size.

We at LIS (EPFL) have shown that predictive controllers can improve the safety of aerial swarms by predicting and optimizing the agents’ future behavior in an iterative process. However, the centralized nature of this method allowed us to control only five drones and prevented us from scaling up to larger numbers. For this reason, we have worked on a novel decentralized and scalable swarm controller that allows the safe and cohesive flight of aerial swarms in cluttered environments. In our latest article, published in IEEE Robotics and Automation Letters (RA-L), we describe how we designed the controller, show its scalability in size, and demonstrate its robustness to noise. We studied the swarm’s performance and compared how it changes in two different environments: a forest-like and a funnel-like environment.
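Schematically (this is a simplified sketch, not the exact cost from the article), each agent \(i\) solves, over a short horizon, a problem of the form

\[
\min_{u_i}\; \sum_{k=0}^{N-1} \left( w_1 \lVert v_{i,k} - v^{\text{mig}} \rVert^2 + w_2 \sum_{j \in \mathcal{N}_i} \left( \lVert p_{i,k} - p_{j,k} \rVert - d^{\text{ref}} \right)^2 + w_3 \lVert u_{i,k} \rVert^2 \right),
\]

trading off progress along the migration velocity, cohesion with neighbors at a reference inter-agent distance, and control effort, using only the predicted states of neighbors \(j \in \mathcal{N}_i\) in its local neighborhood. Because each agent needs only this local information, the controller scales with swarm size.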

The Crazyflie 2.1 was the perfect platform for our experiments: it is lightweight, modular, and tough. This quadcopter can survive big hits when things don’t go as planned… and, if you work on swarms, things can go wrong!

The fleet of Crazyflies equipped with a single marker.

With our algorithm, sixteen robots were able to fly through an artificial forest that we set up in our indoor motion-capture arena. In our previous work, we installed four markers on each quadcopter and used the rigid-body tracking from Motive (the OptiTrack software). The large volume of our experimental room required the use of large markers for long-distance detection, which added considerable weight to the drone. Hence, in our new work, we use a single marker per drone. Tracking is supported by the ‘crazyswarm’ package, and communication with the entire swarm requires only two radio links. However, although our model is decentralized, in our implementation the robots relay their information to an external computer, which does the computations for them. In the future, all the necessary code will be embedded onboard, removing the dependency on external infrastructure.

Our predictive swarm of Crazyflies flying among obstacles in our indoor experimental room.
Video about the article

This work is a step towards the fully autonomous deployment of drone swarms in our cities. By enabling safe navigation in cluttered environments, drone fleets will be able to integrate with conventional air traffic, search for missing people, inspect dangerous areas, transport injured people to hospitals more quickly, and deliver important packages right to our doors.

For further details, check out our article here!