
Early this summer, my research group (the Center for Project-Based Learning at ETH Zürich) was in charge of a special week: high school students from all over Switzerland (actually, from all over the world – they just had to speak German) could apply for a study week at one of several departments of our university. The departments that joined this initiative were mathematics, physics, biology, environmental sciences, materials science, and our department, electrical engineering and information technologies (ITET). But how do you show teenagers between 15 and 19 as much of electrical engineering as possible in one week while also having fun? And, ideally, inspire them to study at ITET? Our solution was: drones. More specifically, Crazyflies. With those we had many possibilities to learn about electrical engineering – from sensors, microcontrollers, timers, and motors to LEDs, batteries, embedded systems, FreeRTOS tasks, state estimation, and controllers – and all this with a high fun potential and a low risk of accidents, as with a weight of only 30 g they hardly ever do any damage. In this blog post, I will guide you through our week, in the hope of helping others who also want to use the Crazyflie to teach students about electrical engineering in a fun way.

Monday

We started in the afternoon (in the morning they had a welcoming tour) with a short introduction and split the 20 students into groups of two (everyone got a paper slip and had to find its match: accelerometer/gyroscope, pitch/roll, UART/SPI, and so on – this gives the lecturer a great opportunity to interact with the students later on, whenever their word becomes relevant during the week). After a short introduction to programming and microcontrollers we moved on to the most classic beginner task: blink an LED! We chose the front left one, as it is only used when communicating – so as long as we don’t connect to the drone, we can observe exactly what we programmed. Most students got the LED turned on rather quickly; however, pulsing the LED to change its intensity took them some more time and forced them to learn how to write loops. Without knowing it, they also learned how PWM works – setting the intensity of an LED and the strength of a motor are essentially the same thing in the end, which gave us a great start for Tuesday.
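(As a side note for anyone reading along: the students did this in C directly in the Crazyflie firmware, but the duty-cycle idea behind their pulsing LED can be sketched in a few lines of Python. The led_on/led_off callbacks below are placeholders, not real firmware calls.)

```python
# Concept sketch only: software PWM. Within each short period the LED is "on"
# for duty*period and "off" for the rest; the eye averages this into a
# brightness level. led_on/led_off are placeholders for whatever drives the LED.
import time

def soft_pwm(duty, period_s=0.005, cycles=40, led_on=lambda: None, led_off=lambda: None):
    for _ in range(cycles):
        led_on()
        time.sleep(duty * period_s)
        led_off()
        time.sleep((1.0 - duty) * period_s)

# Ramp the duty cycle up and down to make the LED "breathe"
for duty in [i / 20 for i in range(21)] + [i / 20 for i in range(20, -1, -1)]:
    soft_pwm(duty)
```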

Tuesday

On Tuesday we looked at hardware from different perspectives. As you might have guessed, we looked at motors and how to control them with PWM and timers. The students were a bit disappointed that we still didn’t fly, but as soon as they realized that they could play their favorite song on the motors the motivation was high again! We didn’t even have any stray drones, even though we let them mount the propellers (the songs sound much better with propellers).
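For the curious: playing a melody on the motors essentially means driving the motor PWM at the frequency of each note. The note frequencies follow the usual equal-temperament formula, which is easy to check in Python (this is only the math, not the actual firmware timer code):

```python
# Equal-temperament note frequencies: the motor PWM frequency is set to the
# frequency of the note being played. MIDI note 69 is A4 = 440 Hz.
def note_to_hz(midi_note):
    return 440.0 * 2 ** ((midi_note - 69) / 12)

# First notes of "Ode to Joy" (E4 E4 F4 G4) as (MIDI note, duration in beats)
melody = [(64, 1), (64, 1), (65, 1), (67, 1)]
for note, beats in melody:
    print(f"MIDI {note}: {note_to_hz(note):.1f} Hz for {beats} beat(s)")
```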
We also looked at another aspect of electrical engineering: PCB design (this part was already done for them, though) and soldering. For this, we prepared a custom deck with four colored LEDs, which could be populated as in industry: with solder paste, then soldered on a hot plate. To make it even more fun (and partly to show off our laser cutter), they also designed a small plastic diffuser that could be mounted on top. So in the end our setup resembles the LED ring deck – however, it can be mounted on top, which is essential if you don’t want to fly with a positioning system (and therefore need to mount a Flow deck underneath).

Wednesday

Trying out the state estimation

Now that we knew how to blink LEDs and, even more importantly, how to control motors, we had to learn a bit about how the drone actually figures out which motor should be turned on and by how much. For this, we first looked at sensors and wired communication protocols, such as I2C (for the IMU and the time-of-flight sensor), SPI (for the optical flow sensor), and UART (to the nRF radio MCU) – due to limited time, we didn’t go into details here though. We also briefly touched on wireless communication, to explain how the commands they would later send to the drone (and the firmware) actually reach the right drone.

Tuning PIDs

We moved on to what state estimation is in general – again skipping over the details of the extended Kalman filter – but took a closer look at the logging and parameter system. We then spent a bit more time on the PID controller, which was also a bit hard to explain, as half the teenagers hadn’t learned how to integrate and differentiate yet. However, they learned fast and we could move on to the part they had been waiting for all week: flying (and tuning the PID).
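To give an idea of what the students interacted with, here is a minimal sketch of reading a log variable and changing a PID gain from Python with cflib (the URI is an assumption, and pid_rate.roll_kp is just one of the many gains exposed as parameters):

```python
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.log import LogConfig
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.crazyflie.syncLogger import SyncLogger

URI = 'radio://0/80/2M/E7E7E7E7E7'  # adapt to your drone

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    # Change a rate-PID gain through the parameter system
    scf.cf.param.set_value('pid_rate.roll_kp', 250.0)

    # Log the estimated attitude at 10 Hz through the logging system
    lg = LogConfig(name='attitude', period_in_ms=100)
    lg.add_variable('stabilizer.roll', 'float')
    lg.add_variable('stabilizer.pitch', 'float')
    with SyncLogger(scf, lg) as logger:
        for timestamp, data, _ in logger:
            print(timestamp, data)
            break  # just show one sample
```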

Thursday

Transport drone

This was the day meant for creativity – the students could choose for themselves which project they wanted to tackle. We proposed some ideas, such as blinking the LEDs depending on the height, flying through a gate (the challenge here is to filter the height measurements when the gate border passes below the drone), steering the drone with the keyboard, soldering their own sensor onto a breakout board, …

In the end, we saw many cool projects: a song played as a canon on multiple motors, a transporter drone, a successful flight through the gate, a successful loop (unfortunately without a successful landing yet…), races against each other (complete with disco lights on the deck), and attempts to reach maximum speed in the hallway.

The hallway was a very popular place to fly, as there was more distance to accelerate…
Practice on the race course

The most popular base project was steering the drone with the keyboard – unfortunately (or fortunately? they sure had fun with it once it was running, and they might have learned enough during the rest of the week…), this was very easy after we showed them where Marcus’ script lives (here) and which 8 lines (170-177) they had to remove for it to work without an AI-deck (and don’t forget to adapt the URI)…
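For readers without access to that script, a minimal, independent sketch of keyboard steering with cflib’s MotionCommander and pynput could look like the following (the URI, key bindings, and speed are my own choices, not the ones from the week):

```python
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.positioning.motion_commander import MotionCommander
from pynput import keyboard

URI = 'radio://0/80/2M/E7E7E7E7E7'  # adapt to your drone
SPEED = 0.3  # m/s

def fly(mc):
    def on_press(key):
        try:
            c = key.char
        except AttributeError:
            return False  # any special key (e.g. Esc) stops the listener -> landing
        if c == 'w':
            mc.start_forward(SPEED)
        elif c == 's':
            mc.start_back(SPEED)
        elif c == 'a':
            mc.start_left(SPEED)
        elif c == 'd':
            mc.start_right(SPEED)
        else:
            mc.stop()  # any other letter hovers in place

    with keyboard.Listener(on_press=on_press) as listener:
        listener.join()

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    # Takes off on enter and lands on exit of the context manager
    with MotionCommander(scf, default_height=0.5) as mc:
        fly(mc)
```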

Friday

Quite an audience!
Flying was forbidden – but playing music with motors was not!

Friday was presentation day – in the morning they could still work on their projects, but in the afternoon all 120 students (and most of their parents) came together in a huge lecture hall to present what they had done during the week. And, as at a real conference, they had posters and their drones (which we, unfortunately, were not allowed to fly without a fireproof net… we will organize this next time) to show their projects to family, friends, and even random tourists (the entrance hall of the ETH main building is on many sightseeing tours).

At the end of the week I doubted the robustness of Crazyflies for a moment – however, on Monday morning, once I had peace and quiet again, it took me less than an hour to figure out what was wrong with all the hardware that had ended up on the “not working for unknown reasons” pile (and to fix almost all of it). Notes to all others, and to my future self, for the next time I give 10 drones to 20 teenagers:

  • If you show them how to tune a PID, also explain that “persistent” means exactly what it says – if you mess up the PID values and persist them, they will stay that way until you reset them, no matter how often you reflash the drone.
  • Explain how fragile connectors are and that you should NEVER pull on cables. Also, remind them ten times more often to be careful when plugging in decks. And radios.
  • Keep one “private” drone no one is allowed to mess with – it helps greatly in figuring out whether they only broke the flow deck connectors or something more serious (which actually never happened).
  • Doing only warm boots and setting individual addresses with CLOAD_CMDS while flashing saved us a lot of trouble; randomly connecting to drones only happened once they discovered the phone app…

The coding tasks (and at least some minimal solutions) can be found on my fork: Tasks and solutions. They are kept short on purpose – at the Center for Project-Based Learning we believe in our name: the most learning (and fun) happens when you freely explore what you can do with the basic tools you just learned.

P.S. For completeness – I cut out all the parts which really had nothing to do with Crazyflies, we also did lab tours in the high-voltage laboratory and the laboratory for optical communication – and of course had some social events with actual university students. As much as we like the Crazyflie, even we have to admit that the field of electrical engineering is even bigger than what we can show with those tiny drones ;)

Today, Lennart Bult from Emergent Swarms presents their project of a 24/7 swarming demo. Enjoy!

Over the last few months our team has been working on creating a 24/7 swarming demo. Initially tasked by Guido de Croon and Chris Verhoeven from TU Delft MAVLab and the TU Delft Robotics Institute, we set out to find our way within the Crazyflie ecosystem to gradually increase the size and capabilities of the swarm. In this article we will first talk about some of the work and methods that we used. After that, we will introduce the TU Delft Science Centre Swarming Lab and talk about some applications of swarming drones.

Developing the 24/7 swarm

The project started in February with the goal of creating a physical swarm capable of real-time collision avoidance with drones and static obstacles. We started out with three drones equipped with the Flow Deck, and by setting them up in a clever way we could perform the first collision avoidance and landing tests. We were impressed with the performance we got out of the Flow Deck; however, it eventually becomes mostly a battle against drift in the position estimate – that is, we could only widen the collision avoidance margins so far before we would either fly out of the test zone or collide with another drone. Luckily, with short test flights we were able to spot some of the flaws in our algorithms and correct them before testing with the new setup.

Setup after the first expansion to eight drones.

After a few weeks of testing we got approval for the first swarm expansion: five more drones and a Lighthouse positioning setup. This is when we could do our first real tests with the collision avoidance algorithm, which, much to our own surprise, worked on the first try. This is also when we first posted a project update on LinkedIn. There were, however, still a lot of bugs to be worked out, and a lot of system experience to be gained. After flying for a bit longer we noticed that some of the drones would flip quite often, which is when we discovered that we needed the thrust upgrade to handle the additional weight of the larger battery and charging deck.

For the charging setup we took inspiration from the Bitcraze IROS 2022 demo: we 3D printed sloped landing pads that we tape onto a wireless charger. After a few iterations we landed on a design that uses minimal printer resources and allows the Crazyflie to land a bit off-center. This last feature turned out to be quite useful considering the large amount of destabilizing airflow generated by 40 drones. After receiving the last order of drones we also expanded the charging setup, which at this point takes up quite a bit of floor space. There are some ideas to create a vertical landing pad stack, which would bring the additional challenge that missing the landing pad is not an option.

All 40 drones recharging before their next flight.

After prototyping the charging setup and building confidence with the initial setup, we were confident enough in our system’s capabilities to expand it to the point where a continuous demo of 5-8 drones is possible. Although the system integration of the previous expansion went without much trouble, we did encounter a few issues when expanding to 40 drones. The first was radio communication: we noticed that a delay in the radio communication would appear if we increased the update rate above a certain level for a given number of drones per radio. The second issue we encountered was performance drops related to the violation of certain bounds in the collision avoidance algorithm. These two issues were very difficult to debug, since it was not immediately obvious where the source of the problem was.

The third and last major issue was the increase in destabilizing airflow of 40 drones compared to 8. With 40 drones there is a noticeable breeze when you stand next to the drone cage, which is nice in summertime, but not so nice when drones need to land in a tightly packed configuration. To combat this issue there is a limit on the number of drones that can land at the same time. There is also a minimum separation distance between two active landing pads, which reduces the severity of the induced turbulence. There are still ongoing efforts to increase the landing success rate, which is currently affected by drones running out of power during the landing procedure.

To control and monitor the swarm we designed a custom GUI, an impression of which you can see below. Although some of the buttons are still a work in progress, there are a lot of features that have already proven very useful, especially when testing a new feature.

V1 of the graphical user interface developed for the 24/7 swarm.

The code base that we created for the swarm will be largely open-sourced (only the collision avoidance will not be open source) to provide researchers all around the world with the possibility to set up their own Crazyflie swarm for research. You can find the repository through this link. Note that the documentation and code base are still under development and might contain bugs/errors.

Human interaction

After creating all the functionality needed for a continuously operating swarm demo, it was time to work on some of our stretch goals: 1. walking through the swarm whilst it is operating and 2. controlling the swarm using our arms. In the image below you can see an impression of precisely this functionality. The drones are following the operator’s gesture commands whilst performing live collision avoidance with the operator.

Team member Seppe directing the 40 drone swarm, see the full video here.

This demo requires multiple techniques and hardware elements working together to create a relatively low-latency, human-controlled swarm. We used a Kinect-like 3D sensor to perform human pose estimation, and we subsequently used this data to create a dynamic obstacle in our collision avoidance software. An important element to consider here is the synchronization of the Lighthouse and 3D-sensor coordinate frames: without proper calibration, the human will not be correctly positioned with respect to the drones, and the drones will crash into the human. The interaction between the swarm control software and the human gesture commands also requires careful consideration; proper tuning is required to ensure a responsive system that is reliable but not too aggressive.
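To make the calibration point concrete: once a rigid transform between the 3D-sensor frame and the Lighthouse frame has been estimated, every detected keypoint of the human must be mapped through it before it can be used as an obstacle. A bare-bones sketch of that mapping, with a made-up transform (the real one comes from a calibration routine), might look like this:

```python
import numpy as np

# Rigid transform from the 3D-sensor frame to the Lighthouse frame:
# p_lighthouse = R @ p_sensor + t. R and t below are placeholders;
# in practice they are the output of the frame calibration.
R = np.eye(3)                   # rotation (sensor -> Lighthouse)
t = np.array([1.0, 0.0, 0.5])   # translation [m]

def sensor_to_lighthouse(p_sensor):
    return R @ np.asarray(p_sensor, dtype=float) + t

# A detected keypoint of the operator, expressed in the sensor frame,
# becomes a dynamic obstacle position in the Lighthouse (drone) frame.
obstacle = sensor_to_lighthouse([0.2, -0.1, 1.6])
print(obstacle)
```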

TUD Science Centre Swarming Lab

The next step in this project will be to set up the swarm at its new location, the TU Delft Science Centre. Here, the swarm will first and foremost be visible as a public demo, showcasing the capabilities of TU Delft’s state-of-the-art swarming research. There will also be a focus on developing the swarm as a research platform. This will allow TU Delft students and researchers to extend the swarm’s functionality and test their theories on a physical swarming system. Besides demos and academic research, work will also be done on developing educational applications across the full educational spectrum (primary school, high school and applied education). If you are interested in working on, or collaborating with, the swarming lab on any of the above-mentioned tasks, feel free to email the lab management at operations.swarminglab@tudelft.nl.

The TU Delft Swarming Lab setup with 40 drones and charging pads for continuous operations and research.

Applications of Swarming

There are a lot of potential use-cases for fully autonomous drone swarms, ranging from indoor applications such as warehouse monitoring and factory inspection to outdoor applications such as search and rescue and surveillance. In our opinion, the true potential of drone swarms lies in applications where there is a significant need for a scalable system with a lot of built-in redundancy. A lot of additional use cases open up when we consider fully onboard autonomous systems, where the full benefits of decentralized swarming can be utilized. Currently, the size of drones needed to achieve such feats is quite large, though maybe in a few years, we could see more and more being done on drone platforms such as the Crazyflie.

A swarm inspection of an F-16 Fighting Falcon at Deltion College in Zwolle, the Netherlands.

An interesting area of application for drone swarms could be in the inspection of aircraft. Drone swarms provide a scalable and flexible means to perform a fast inspection of aircraft across an entire airfield or military base. To showcase that this can be done with any size of drone, we went to Deltion College in Zwolle to perform a mock inspection of an F-16 fighter jet. Above you can see an impression of the inspection. Another area of application is search and rescue, where there is a need for systems that can find people or objects of interest in unknown and cluttered environments. Furthermore, the area that needs to be searched is usually very large and sometimes difficult to travel on foot. A drone swarm could provide fast and reliable coverage of the area of interest, whilst providing full data traceability. Seppe and Lennart will work on creating drone swarms for these use cases with the start-up Emergent Swarms.

Today, our guest Airi Lampinen from Stockholm University is presenting the second Drone Arena Challenge. Enjoy!

Welcome to the second Drone Arena Challenge, a one-of-a-kind interactive experience with Bitcraze’s Crazyflie! This year, the challenge is focused on moving together with drones in beautiful, curious, and provocative ways – without needing to write a single line of code!

Moving with drones. Image credit: Rachael Garrett.

What, when, where? The event takes place May 16-17, 2023 at KTH’s Reactor Hall in Stockholm – a dismantled nuclear reactor hall – which provides a unique setting for creative human–drone encounters. You don’t need your own drone, or to be able to program a drone, to participate! We will provide the drone equipment (a Crazyflie 2.1 equipped with the AI-deck) and take care of everything necessary to make it fly. What you need to do is be creative and move together with the drones to set up the best show you can deliver! There’ll be a jury judging the final performances, and we have exciting prizes for the most successful teams!

Drone Arena in the Reactor Hall. Picture from the first challenge, held in June 2022. Picture credit: Fatemeh Bakhshoudeh

Who can join? Anyone irrespective of training, profession, and past experience with drones or performing arts is welcome to participate. Participants need to be at least 18 years old. If you are curious about how technology and humans may play together, enthusiastic about the Crazyflie, or eager to learn how to move with the Crazyflie, this event is for you. We welcome up to 10 pairs (teams of 2 people) to participate in the challenge.

Registration is already open, with only a few spots remaining. We encourage those interested to sign up as soon as possible to secure their spot!

Program & prizes? On the first day of the hackathon, we will host a keynote speaker and a short information session to explain what participants are expected to do and what support is available for them. The teams will then have access to the Reactor Hall to work on the challenge and explore moving with their drone – we offer long hours but each team is free to choose how much they want to work. (The goal here is to have a good time!) The competition itself takes place on the second day. We’ve got exciting prizes for the most successful teams!

Read more about the challenge, the prizes, and how to sign up on our website: http://www.dronearena.info/

The event is organized as a part of the Digital Futures demonstrator project Digital Futures Drone Arena led by Luca Mottola from RISE and Airi Lampinen from Stockholm University.

This week’s guest blogpost is from Matěj Karásek from Flapper Drones, about flying the Nimble + with a positioning system. Enjoy!

Flapper Drones are bioinspired robots flying by flapping their wings, similar to insects and hummingbirds. If you haven’t heard of Flappers yet, you can read more about their origins at TU Delft and about how they function in an earlier post and on our company website.

In this blogpost, I will write about how to fly the Flappers (namely the Flapper Nimble+) autonomously within a positioning system such as the Lighthouse, and will of course include some nice videos as well.

The Flapper Nimble+ is the first hover-capable flapping-wing drone on the market. It is a development platform powered by the Crazyflie Bolt and so it can enjoy most of the perks of the Crazyflie ecosystem, including the positioning systems as well as other sensors (check this overview). If you would like to get a Flapper yourself, just head to the Bitcraze webstore, where there are some units ready to be shipped! (At the time of writing at least…)

Minimal setup

The minimal setup for flying in a positioning system is nearly identical to that of a standard Crazyflie. Next to a Flapper with recent firmware, a Crazyradio dongle, a positioning system (in this post we will use the Lighthouse), and a compatible positioning deck (the Lighthouse deck), you will also need: 1) a mount, such that the deck can be attached on top of the Flapper, and 2) a set of extension cables. You can 3D print the mounts yourself (models here); the extension cable prototypes can either be requested from Flapper Drones or soldered by yourself (in that case, the battery holder deck, standard Crazyflie pin headers, and some wires come in handy). Just pay attention to connecting the cables the correct way, as if the deck were mounted right on top of the Bolt. The complete setup with the Lighthouse deck will look like this:

Lighthouse deck installation on a Flapper Nimble+. Make sure the extension cables are well secured (e.g. by using the additional cable mount) so that they don’t get caught by the gears.

For the Lighthouse, as with regular Crazyflies, the minimum number of base stations (with some redundancy) is 2, but you will get a larger tracking volume with more base stations. 4 base stations mounted at 3 m height will give you about 5 meters by 5 meters of coverage, which is recommended especially if you want to fly more than one Flapper at a time (they are a bit larger than the Crazyflies, after all…).
From now on, it is exactly the same as with standard Crazyflies. After you calibrate the Lighthouse system using the standard wizard procedure via the Cfclient, you can just go to the Flight Control Tab and use the “Command Based Flight Control” buttons to take-off, command steps in xyz directions and land. It is this easy!
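For those who prefer a script over the cfclient buttons, the same take-off / step / land sequence can be reproduced with a few lines of cflib, for example with the MotionCommander helper (the URI below is an assumption and should be adapted to your Flapper):

```python
import time

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.positioning.motion_commander import MotionCommander

URI = 'radio://0/80/2M/E7E7E7E7E7'  # adapt to your Flapper

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    # Take off to 0.5 m on enter, step around in x/y, land on exit
    with MotionCommander(scf, default_height=0.5) as mc:
        mc.forward(0.5)
        mc.left(0.5)
        mc.back(0.5)
        mc.right(0.5)
        time.sleep(1.0)
```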

Flapper Nimble+ in Lighthouse flown via Command Based Flight Control of cfclient

Assisted flight demo

We used this setup in February for the demos we were giving at the Highlight Delft festival in the Netherlands. It allowed people with no drone piloting skills (from 3-year-olds to grandmas – true story) to fly and control the Flapper in a safe way (safe for the Flapper, that is, as the Flapper itself is a very safe platform thanks to its soft wings and low weight). To make it more fun, and even safer for the Flapper, we used a gamepad instead of on-screen buttons, and we modified the cfclient slightly such that the flight space can be geofenced to stay within the tracking volume.

Flight demo at Highlight Delft festival, using the Lighthouse and position hold assistance

If you would like to try it yourself (it also works with standard Crazyflies), the source code is here (just keep in mind it is experimental and has some known bugs…). To fly in the position-assisted mode, you need to press (and keep pressing) the Alt 1 button and use the joysticks to move around (velocity commands, headless mode). Releasing the Alt 1 button will make the Flapper auto-land. Auto-landing is also triggered when the battery is low. You can still fly the Flapper in a direct way by pressing Alt 2 instead.

Flying more Flappers at a time

Again, this is something that works pretty much out of the box. As with a regular Crazyflie, you just need to assign a unique address to each of the Flappers and then use e.g. this example Python script to run a preprogrammed sequence.
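As a starting point, a preprogrammed sequence for two Flappers with unique addresses could be expressed with cflib’s Swarm class roughly as follows (the URIs and the sequence itself are placeholders, not the ones from the linked example):

```python
import time

import cflib.crtp
from cflib.crazyflie.swarm import CachedCfFactory, Swarm

# One unique address per Flapper (placeholders)
URIS = {
    'radio://0/80/2M/E7E7E7E701',
    'radio://0/80/2M/E7E7E7E702',
}

def sequence(scf):
    cf = scf.cf
    cf.param.set_value('commander.enHighLevel', '1')
    hlc = cf.high_level_commander
    hlc.takeoff(1.0, 2.0)                               # to 1 m in 2 s
    time.sleep(3.0)
    hlc.go_to(0.5, 0.0, 1.0, 0.0, 3.0, relative=True)   # step 0.5 m forward
    time.sleep(3.0)
    hlc.land(0.0, 2.0)
    time.sleep(2.5)

cflib.crtp.init_drivers()
factory = CachedCfFactory(rw_cache='./cache')
with Swarm(URIS, factory=factory) as swarm:
    swarm.reset_estimators()          # wait for the position estimates to settle
    swarm.parallel_safe(sequence)     # run the sequence on all Flappers in parallel
```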

With a few extra lines of code, we pulled off this quick demo at the end of the Highlight Delft festival, when we had 30 minutes left before packing everything up (one of the Flappers decided to drop its landing gear, probably too tired after 3 evenings of almost continuous flying…):

Sequence with 3 Flappers within Lighthouse positioning system

Other positioning systems

Using other positioning systems is equally easy. In fact, for the Loco Positioning system, the deck can even be installed directly on the Flapper’s Bolt board (no extension cables or mounts are needed). As for optical motion tracking, we do not have experience with Qualisys and the active marker deck, but flying with retro-reflective markers within an OptiTrack system can be set up easily with just a few hacks.

When choosing and setting up the positioning system, just keep in mind that due to its wings, the Flapper needs to tilt much more than a quadcopter to fly forward or sideways. This is not an issue with the Loco Positioning system (though there can be challenges with position estimation, as described further below), but it can be a limitation for systems requiring a direct line of sight, such as the Lighthouse or optical motion tracking.

Ongoing work

In terms of control and flight dynamics, the Flapper is very different from the Crazyflie. Thus, for autonomous flight, there remains room for improvement on the firmware side. We managed to include the “flapper” platform into the standard Crazyflie firmware (in master branch since November 2022, and in all releases since then), such that RC flying and other basic functionality works out of the box. However, as many things in the firmware were originally written only for a (specific) quadcopter platform, the Crazyflie 2.x, further contributions are needed to unlock the full potential of the Flapper.

With the introduction of “platforms” last year, many things can be defined per platform (e.g. the PID controller gains, sensor alignment, filter settings, etc.), but the extended Kalman filter, and specifically the motion model inside, has been derived and tuned for the Crazyflie 2.x and is thus not representative of the Flapper, whose flight dynamics are very different. This is what directly affects (and currently limits) autonomous flight within positioning systems – it works well enough at hover and in slow flight, but the agility and speed achievable in RC flight cannot be reached yet. We are planning to improve this in the future (hopefully with the help of the community). The recently introduced out-of-tree controllers and estimators might be the way to go… To be continued :)

Thanks Matěj! And for those of you at home, don’t forget that we have our dev meeting next Wednesday (the 5th), where we’ll discuss the Loco Positioning system but also take some time for general discussion. We hope to see you there!

This week’s guest blogpost is from Florian Goralsky from Bok o Bok about their dance piece with multiple Crazyflies. Enjoy!

Flying bodies across the fields is a contemporary dance piece for four performers and a swarm of drones, exploring the phenomenon of the disappearance of bees and the use of pollinating drones to compensate for this loss. The piece attempts to answer this crucial question in a poetical way: can the machine create life and save us from ecological disaster?

Novembre Numérique à l’IFCI © M studio

We’re super excited to talk about a performance that we’ve been working on for the past two years in collaboration with Bitcraze. It premiered at the Environmental Forum, Centre Pompidou Paris, in 2021, and we’ve had the opportunity to showcase it at different venues since then. We are happy to share our thoughts about it!

Choreographic research

Beyond symbolizing current attempts to use drones to pollinate fields, the presence of the Crazyflie drones supports the back-and-forth between nature and technology. We integrate a swarm performing complex choreographies that refer to the functioning of a beehive, including the famous “bee dance” discovered by Karl von Frisch, which bees use to transmit information about food sources. Far from having a spectacular performance as its only goal, the synchronization of autonomous drones highlights bio-inspired computing techniques focused on collective intelligence.

© bok o bok

Challenges within a dance performance

Making a dance performance with drones requires high accuracy and adaptability, both before and during the show. Usually, we only have a few hours, sometimes even a few minutes, to set up the system in a given space. We quickly realized we needed pre-recorded choreographies, as well as hybrid choreographies where the pilot has a few degrees of freedom on top of pre-defined behaviors.

GUI Editor + Python Server

Taking this into account, we developed a web GUI editor that can send choreographies created on any device to a WebSocket Python server. The system supports any absolute positioning system (we use the Lighthouse) and converts all the setpoints and actions into calls to the Crazyflie API’s HighLevelCommander class. This allows us to create, update, and test complex choreographies in a few minutes on various devices.
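To illustrate the idea (this is not bok o bok’s actual server, and the message format is invented here), a stripped-down WebSocket server that forwards incoming JSON setpoints to the HighLevelCommander could look roughly like this:

```python
import asyncio
import json

import websockets
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M/E7E7E7E7E7'  # placeholder

async def serve(scf):
    hlc = scf.cf.high_level_commander

    async def handler(websocket):
        # Each message is a JSON object describing one action, e.g.
        # {"cmd": "goto", "x": 1.0, "y": 0.0, "z": 1.2, "yaw": 0.0, "duration": 3.0}
        async for message in websocket:
            msg = json.loads(message)
            if msg['cmd'] == 'takeoff':
                hlc.takeoff(msg['z'], msg['duration'])
            elif msg['cmd'] == 'goto':
                hlc.go_to(msg['x'], msg['y'], msg['z'], msg['yaw'], msg['duration'])
            elif msg['cmd'] == 'land':
                hlc.land(0.0, msg['duration'])

    async with websockets.serve(handler, 'localhost', 8765):
        await asyncio.Future()  # run forever

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    scf.cf.param.set_value('commander.enHighLevel', '1')
    asyncio.run(serve(scf))
```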

Preview position of six drones at a certain time.
Early support of the CompressedTrajectories format, with cubic Bézier curves.

What is next?

We are looking forward to developing more dancer–drone interactions in the future. In addition to the Lighthouse system, this will involve other sensors, in order to open up new possibilities: real-time path-finding, obstacle avoidance even during a recorded choreography (to allow improvisation), etc.

Novembre Numérique à l’IFCI © M studio

This week’s guest blogpost is from Frederike Dümbgen, presenting her latest work from her PhD project at the Laboratory of Audiovisual Communications (LCAV), EPFL. She is currently a Postdoc at the University of Toronto. Enjoy!

Bats navigate using sound. As a matter of fact, the ears of a bat are so much better developed than their eyes that bats cope better with being blindfolded than with having their ears covered. It was precisely this experiment that led to the discovery of echolocation, the principle bats use to navigate [1]. Broadly speaking, in echolocation, bats emit ultrasonic chirps and listen for their echoes to perceive their surroundings. Since its discovery in the 18th century, astonishing facts about this navigation system have been revealed — for instance, bats vary their chirps depending on the task at hand: a chirp that’s good for locating prey might not be good for detecting obstacles, and vice versa [2]. Depending on the characteristics of the reflected echoes, bats can even classify certain objects — this ability helps them find, for instance, water sources [3]. Wouldn’t it be amazing to harvest these findings to build novel navigation systems for autonomous agents such as drones or cars?

Figure 1: Meet “Crazybat”: the Crazyflie equipped with our custom audio deck including 4 microphones, a buzzer, and a microcontroller. Together, they can be used for bat-like echolocation. The design files and firmware of the audio extension deck are openly available, as is a ROS2-based software stack for audio-based navigation. We hope that fellow researchers can use this as a starting point for further pushing the limits of audio-based navigation in robotics. More details can be found in [4].

The quest for the answer to this question led us — a group of researchers from the École Polytechnique Fédérale de Lausanne (EPFL) — to design the first audio extension deck for the Crazyflie drone, effectively turning it into a “Crazybat” (Figure 1). The Crazybat has four microphones, a simple piezo buzzer, and an additional microprocessor used to extract relevant information from audio data, to be sent to the main processor. All of these additional capabilities are provided by the audio extension deck, for which both the firmware and hardware design files are openly available.1

Video 1: Proof of concept of distance/angle estimation in a semi-static setup. The drone is moved using a stepper motor. More details can be found in [4].

In our paper on the system [4], we show how to use chirps to detect nearby obstacles such as glass walls. Difficult to detect using lasers or cameras, glass walls are excellent sound reflectors and thus a good candidate for audio-based navigation. We show in a first semi-static feasibility study that we can locate a glass wall with centimeter accuracy, even in the presence of loud propeller noise (Video 1). When moving to a flying drone and different kinds of reflectors, the problem becomes significantly more challenging: motion jitter, varying propeller noise and tight real-time constraints make it much harder to solve. Nevertheless, first experiments suggest that sound-based wall detection and avoidance is possible (Figure 2 and Video 2).

Video 2: The “Crazybat” drone actively avoiding obstacles based on sound.
Figure 2: Qualitative results of sound-based wall localization on the flying “Crazybat” drone. More details can be found in [4].

The principle we use to make this work is sound-based interference. The sound “bounces off” the wall, and the reflected and direct sound interfere either constructively or destructively, depending on the frequency and the distance to the wall. Using this same principle across the four microphones, both the angle and the distance of the closest wall can be estimated. This is, however, not the only way to navigate using sound; in fact, our software stack, available as an open-source package for ROS2, also allows the Crazybat to extract the phase differences of incoming sound at the four microphones, which can be used to determine the location of an external sound source. We believe that a truly intelligent Crazybat would be able to switch between different operating modes depending on the conditions, just like bats that change their chirps depending on the task at hand.
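As a toy numerical illustration of this principle (not the algorithm from the paper): for a wall at distance d, the reflected path is roughly 2d longer than the direct one, so destructive-interference nulls appear at frequencies spaced by about c/(2d), and inverting that spacing gives the distance:

```python
import numpy as np

C = 343.0  # speed of sound in air [m/s]

def wall_distance_from_nulls(null_freqs_hz):
    """Estimate wall distance from the average spacing between interference nulls."""
    spacing = np.mean(np.diff(np.sort(null_freqs_hz)))
    return C / (2.0 * spacing)

# Nulls spaced by roughly 571 Hz correspond to a wall about 0.3 m away
print(wall_distance_from_nulls([2857, 3429, 4000, 4571]))  # ~0.30 m
```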

Note that the ROS2 software stack is not limited to the Crazybat only — we have isolated the hardware-dependent components so that the audio-based navigation algorithms can be ported to any platform. As an example, we include results on the small wheeled e-puck2 robot in [4], which shows better performance than the Crazybat thanks to the absence of propeller noise and motion jitter.

This research project has taught us many things, above all an even greater admiration for the abilities of bats! Dealing with sound is pretty hard and very different from other prevalent sensing modalities such as cameras or lasers. Nevertheless, we believe it is an interesting alternative for scenarios with poor eyesight, limited computing power or memory. We hope that other researchers will join us in the quest of exploiting audio for navigation, and we hope that the tools that we make publicly available — both the hardware and software stack — lower the entry barrier for new researchers. 

1 The audio extension deck works in a “plug-and-play” fashion like all other extension decks of the Crazyflie. It has been tested in combination with the flow deck, for stable flight in the absence of a more advanced localization system. The deck performs frequency analysis on incoming raw audio data from the 4 microphones, and sends the relevant information over to the Crazyflie drone where it is converted to the CRTP protocol on a custom driver and sent to the base station for further processing in the ROS2 stack.

References

[1] Galambos, Robert. “The Avoidance of Obstacles by Flying Bats: Spallanzani’s Ideas (1794) and Later Theories.” Isis 34, no. 2 (1942): 132–40. https://doi.org/10.1086/347764.

[2] Fenton, M. Brock, Alan D. Grinnell, Arthur N. Popper, and Richard R. Fay, eds. “Bat Bioacoustics.” In Springer Handbook of Auditory Research, 1992. https://doi.org/10.1007/978-1-4939-3527-7.

[3] Greif, Stefan, and Björn M Siemers. “Innate Recognition of Water Bodies in Echolocating Bats.” Nature Communications 1, no. 106 (2010): 1–6. https://doi.org/10.1038/ncomms1110.

[4] F. Dümbgen, A. Hoffet, M. Kolundžija, A. Scholefield and M. Vetterli, “Blind as a Bat: Audible Echolocation on Small Robots,” in IEEE Robotics and Automation Letters (Early Access), 2022. https://doi.org/10.1109/LRA.2022.3194669.

This week’s guest blogpost is from Rik Bouwmeester from the Micro Air Vehicle lab, Faculty of Aerospace Engineering at the Delft University of Technology.

Tiny quadcopters like the Crazyflie can be operated in narrow, cluttered environments and in proximity to humans, making them the perfect candidate for search-and-rescue operations, monitoring of crops in a greenhouse, or performing inspections where other flying robots cannot reach. All these applications benefit from autonomy, allowing deployment without proximity to a base station or human operator and permitting swarming behavior.

Achieving autonomous navigation on nano quadcopters is challenging given the highly constrained payload and computational power of the platform. Most attention has been given to monocular solutions; the camera is a lightweight and energy-efficient passive sensor that captures rich information of the environment. One of the most important monocular visual cues is optical flow, which has been exploited on MAVs with higher payload for obstacle avoidance [1], depth estimation [2] and several bio-inspired methods for autonomous navigation [3–7].

Optical flow describes the apparent visual variations caused by relative motion between an observer and their surroundings. This rich visual cue contains entangled information about velocity and depth. However, calculating optical flow is expensive. The field of optical flow estimation is, and has been for a couple of years, dominated by convolutional neural networks (CNNs). Despite efforts to find architectures of reduced size and latency [8-10], these methods are still highly computationally expensive, running at several to tens of FPS on modern desktop GPUs and requiring millions of parameters, rendering them incompatible with edge hardware.

To this end, we present “NanoFlowNet: Real-Time Dense Optical Flow on a Nano Quadcopter”, submitted to an international robotics conference, which introduces NanoFlowNet, a CNN architecture designed for real-time, fully on-board, dense optical flow estimation on the AI-deck.

CNN architecture

We adopt the semantic segmentation CNN STDC-Seg [11] and modify it for optical flow estimation. The resulting CNN architecture may be considered “real-time” on desktop hardware, but for deployment on edge devices such as a nano quadcopter the network must be shrunk significantly. We improve the latency of the architecture in three ways.

First, we redesign the key convolutional module of the architecture, the Short-Term Dense Concatenate (STDC) module. By reordering the operations within the strided variant of the module, we save, depending on the location of the module within the architecture, from over 10% to over 50% of the MAC operations per module, while increasing the number of output filters with a large receptive field size. A large receptive field size is desirable for optical flow estimation.

Second, inspired by MobileNets [12], we globally replace ‘regular’ convolutions with depthwise separable convolutions. Depthwise separable convolutions factorize a convolution into a depthwise and a pointwise convolution, effectively reducing the computational expense at a cost in representational capacity.
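For reference, a depthwise separable convolution block is straightforward to express in PyTorch; this is the generic textbook version, not the exact module used in NanoFlowNet:

```python
import torch.nn as nn

def depthwise_separable(in_ch, out_ch, stride=1):
    """3x3 depthwise conv (one filter per input channel, groups=in_ch)
    followed by a 1x1 pointwise conv that mixes channels."""
    return nn.Sequential(
        nn.Conv2d(in_ch, in_ch, kernel_size=3, stride=stride, padding=1,
                  groups=in_ch, bias=False),
        nn.BatchNorm2d(in_ch),
        nn.ReLU(inplace=True),
        nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

block = depthwise_separable(32, 64)  # drop-in replacement for a 3x3 conv layer
```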

Third, we reduce the input dimensionality. We train and run the network on grayscale input images, reducing the on-board memory required for storing images by two thirds. Any memory saved in the AI-deck’s L2 memory can be handed to AutoTiler for storing the CNN architecture, speeding up on-board execution. Since we need an even bigger speed-up, we run the CNN on board at a reduced input resolution of 160×112 pixels. Besides the speed-up through saved L2 memory, reducing the input resolution makes all operations throughout the network cheaper. We downscale the training data to closely match the target resolution. Both of these changes come at a loss of input information: we miss out on small objects and small displacements that are not captured at the reduced resolution.

To give some intuition of the available memory: Estimating optical flow requires two input images. Storing two color input images at full resolution requires (2 x 324x324x3=) 630 kB. The AI-deck has 512 kB of L2 memory available.

Motion boundary detail guidance

Inspired by STDC-Seg, we guide the training of optical flow with a train-time-only auxiliary task to promote the encoding of spatial information in the early layers. Specifically, we introduce a motion boundary prediction task to the net. The motion boundary ground truth can be found in the optical flow datasets. This improves performance by 0.5 EPE on the MPI Sintel clean (train) benchmark, at zero cost to inference latency.

Performance on MPI Sintel

Given the scaling and conversion to grayscale of the input data, our network is not directly comparable with results reported by other works. For comparison, we retrain one of the fastest networks in the literature, FlowNet2-s [13], on the same data. Given the reduction in resolution, we drop the deepest two layers to maintain a reasonable feature size. We name this model FlowNet2-xs.

We benchmark the performance of the architecture on the optical flow dataset MPI Sintel. NanoFlowNet performs better than FlowNet2-xs, despite using less than 10% of the parameters. NanoFlowNet achieves 5.57 FPS on the AI-deck. FlowNet2-xs does not fit on the AI-deck due to the network size. To put the achieved latency of NanoFlowNet in perspective, we execute FlowNet2-xs’ first two convolutions and the final prediction layer on the GAP8. The three-layer architecture achieves 4.96 FPS, which is slower than running the entire NanoFlowNet. On a laptop GPU, the two architectures accomplish similar latency.

| Method | MPI Sintel clean (train) [EPE] | MPI Sintel final (train) [EPE] | Frame rate, GPU [FPS] | Frame rate, GAP8 [FPS] | Parameters |
| --- | --- | --- | --- | --- | --- |
| FlowNet2-xs | 9.054 | 9.458 | 150 | – | 1,978,250 |
| NanoFlowNet | 7.122 | 7.979 | 141 | 5.57 | 170,881 |

Performance on MPI Sintel (train subset). The (average) end-point error (EPE) describes how far off the estimated flow vectors are on average; lower is better.

Obstacle avoidance implementation

We demonstrate the effectiveness of NanoFlowNet by implementing it in a simple, proof-of-concept obstacle avoidance application on an AI-deck equipped Crazyflie. We let the quadcopter fly forward at constant velocity and implement the horizontal balance strategy [14], [15], where the quadcopter balances the optical flow in the left and right half plane by yawing.
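A minimal sketch of such a balance law (not the exact controller from the paper) compares the total flow magnitude in the left and right halves of the flow field and yaws away from the side with more flow:

```python
import numpy as np

def balance_yaw_rate(flow, gain=1.0):
    """flow: dense optical flow, shape (H, W, 2). Returns a normalized yaw-rate command.
    At constant forward speed, larger flow magnitude roughly means closer obstacles."""
    magnitude = np.linalg.norm(flow, axis=2)
    half = flow.shape[1] // 2
    left = magnitude[:, :half].sum()
    right = magnitude[:, half:].sum()
    # Positive output -> yaw right (away from the "busier" left half), and vice versa
    return gain * (left - right) / (left + right + 1e-6)

# Example with the network's output resolution (160x112), here just zeros
yaw_rate = balance_yaw_rate(np.zeros((112, 160, 2)))
```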

We equip a Crazyflie with the Flow deck for positioning only. The total flight platform weighs 34 grams.

We augment the balance strategy by implementing active oscillations (a cyclic up-down movement), resulting in additional optical flow generated across the field of view. This is particularly helpful for avoiding obstacles in the direction of horizontal travel, since no optical flow is generated at the focus of expansion.

The obstacle avoidance implementation is demonstrated in an open and a cluttered environment in ‘the Cyber Zoo’, an indoor flight arena at the faculty of Aerospace Engineering at the Delft University of Technology. The control algorithm is most robust in the open environment, with the quadcopter managing to drain a full battery without crashing. In the cluttered environment, performance is more variable. Especially on occasions where obstacles are close to one another, the quadcopter tends to avoid the first obstacle successfully, only to turn straight into the second and crash into it. Adding a head-on collision detection based on FOE detection and divergence estimation (e.g., [7]) should help avoid obstacles in these cases.

Successful run in a cluttered environment in the Cyber Zoo. The Crazyflie manages to avoid collision until the battery is drained.

All in all, we consider the result a successful demonstration of the optical flow CNN. In future work, we expect to see applications that take more advantage of the resolution of the flow information.

Citation

Bouwmeester, R. J., Paredes-Vallés, F., De Croon, G. C. H. E. (2022). NanoFlowNet: Real-time Dense Optical Flow on a Nano Quadcopter. arXiv. https://doi.org/10.48550/arXiv.2209.06918

References

[1] Gao, P., Zhang, D., Fang, Q., & Jin, S. (2017). Obstacle avoidance for micro quadrotor based on optical flow. Proceedings of the 29th Chinese Control and Decision Conference, CCDC 2017, 4033–4037. https://doi.org/10.1109/CCDC.2017.7979206

[2] Sanket, N. J., Singh, C. D., Ganguly, K., Fermuller, C., & Aloimonos, Y. (2018). GapFlyt: Active vision based minimalist structure-less gap detection for quadrotor flight. IEEE Robotics and Automation Letters, 3(4), 2799–2806. https://doi.org/10.1109/LRA.2018.2843445

[3] Conroy, J., Gremillion, G., Ranganathan, B., & Humbert, J. S. (2009). Implementation of wide-field integration of optic flow for autonomous quadrotor navigation. Autonomous Robots, 27(3), 189–198. https://doi.org/10.1007/s10514-009-9140-0

[4] Zingg, S., Scaramuzza, D., Weiss, S., & Siegwart, R. (2010). MAV navigation through indoor corridors using optical flow. Proceedings – IEEE International Conference on Robotics and Automation, 3361–3368. https://doi.org/10.1109/ROBOT.2010.5509777

[5] De Croon, G. C. H. E. (2016). Monocular distance estimation with optical flow maneuvers and efference copies: A stability-based strategy. Bioinspiration and Biomimetics, 11(1). https://doi.org/10.1088/1748-3190/11/1/016004

[6] Serres, J. R., & Ruffier, F. (2017). Optic flow-based collision-free strategies: From insects to robots. Arthropod Structure and Development, 46(5), 703–717. https://doi.org/10.1016/j.asd.2017.06.003

[7] De Croon, G. C. H. E., De Wagter, C., & Seidl, T. (2021). Enhancing optical-flow-based control by learning visual appearance cues for flying robots. Nature Machine Intelligence, 3(1), 33–41. https://doi.org/10.1038/s42256-020-00279-7

[8] Ranjan, A., & Black, M. J. (2017). Optical flow estimation using a spatial pyramid network. Proceedings – 30th IEEE Conference on Computer Vision and Pattern Recognition, 2720–2729. https://doi.org/10.1109/CVPR.2017.291

[9] Hui, T. W., Tang, X., & Loy, C. C. (2018). LiteFlowNet: A Lightweight Convolutional Neural Network for Optical Flow Estimation. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 8981–8989. https://doi.org/10.1109/CVPR.2018.00936

[10] Sun, D., Yang, X., Liu, M. Y., & Kautz, J. (2017). PWC-Net: CNNs for Optical Flow Using Pyramid, Warping, and Cost Volume. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 8934–8943. https://doi.org/10.1109/CVPR.2018.00931

[11] Fan, M., Lai, S., Huang, J., Wei, X., Chai, Z., Luo, J., & Wei, X. (2021). Rethinking BiSeNet For Real-time Semantic Segmentation. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 9711–9720. https://doi.org/10.1109/CVPR46437.2021.00959

[12] Howard, A. G., Zhu, M., Chen, B., Kalenichenko, D., Wang, W., Weyand, T., Andreetto, M., & Adam, H. (2017). MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications. In arXiv. arXiv. http://arxiv.org/abs/1704.04861

[13] Ilg, E., Mayer, N., Saikia, T., Keuper, M., Dosovitskiy, A., & Brox, T. (2017). FlowNet 2.0: Evolution of optical flow estimation with deep networks. Proceedings – 30th IEEE Conference on Computer Vision and Pattern Recognition, 1647–1655. https://doi.org/10.1109/CVPR.2017.179

[14] Souhila, K., & Karim, A. (2007). Optical flow based robot obstacle avoidance. International Journal of Advanced Robotic Systems, 4(1), 2. https://doi.org/10.5772/5715

[15] Cho, G., Kim, J., & Oh, H. (2019). Vision-based obstacle avoidance strategies for MAVs using optical flows in 3-D textured environments. Sensors, 19(11), 2523. https://doi.org/10.3390/s19112523

This week’s guest blog post is from Hanna Müller, Vlad Niculescu and Tommaso Polonelli, who are working with Luca Benini at the Integrated Systems Lab and Michele Magno at the Center for Project-Based Learning, both at ETH Zürich. Enjoy!

This blog post will give you some insight into our current work towards autonomous flight on nano-drones using a miniaturized multi-zone depth sensor. Here we will mainly talk about obstacle avoidance, as it is our first building block towards fully autonomous navigation. Who knows, maybe in the future, we will have the honor to write another blog post about localization and mapping ;)

A Crazyflie 2.1 with our custom multi-zone ToF deck, a Flow deck and a Vicon marker.

Obstacle avoidance on nano-drones is challenging, as the restricted payload limits the on-board sensors and computational power. Most approaches therefore use lightweight and ultra-low-power monocular cameras (such as the AI-deck) or 1D depth sensors (such as the Multi-ranger deck). However, both of those approaches have drawbacks – the camera images need extensive processing, usually even neural networks, to detect obstacles. Neural networks additionally need training data and are prone to fail in completely new scenarios. The 1D depth sensors can reliably detect obstacles in their field of view (FoV); however, no information about the size or exact position of the obstacle is obtained.


On bigger drones, lidars or radars are usually used, but unfortunately, due to the limited weight and power budget, those cannot be carried and used on nano-drones. However, in 2021 STMicroelectronics introduced a new multi-zone Time-of-Flight (ToF) sensor – with a maximal resolution of 8×8 zones, a range of up to 4 m (according to the datasheet), a small form factor and a low power consumption of only 286 mW (typical), it is ideal for use on nano-drones.


In the picture on top, you can see the Crazyflie 2.1 with our custom ToF deck (open-sourced at https://github.com/ETH-PBL/Matrix_ToF_Drones). We described this deck for the first time in [1], together with a sensor characterization. From this, we saw that we could use the sensor in different light conditions and on differently colored obstacles, but from 2 m onwards the measurements started to become incomplete in all scenarios. However, as the sensor can flag invalid measurements (due to interference or obstacles being out of range), we can still rely on the information it provides. In [2], we presented the system and some steps towards obstacle avoidance in a demo abstract, as you can see in the video below:

The next thing we did was to collect a dataset – we flew with different combinations of decks (Flow deck v2, AI-deck, our custom multi-zone ToF deck), sometimes even tracked by a Vicon system. Those recordings amount to an extensive dataset with depth images, RGB images, internal state estimates, and position and attitude ground truth.


We then fed the recorded data into a Python simulation to develop an obstacle avoidance algorithm. We focused on the ToF data only (we are not fusing it with the camera in this project; we just provide the data for future work). We aimed for a very efficient solution – because we want it to run on board, on the STM32F405, with low latency and without occupying too many resources. Our algorithm is very lightweight but highly effective – we divide the FoV into different zones, according to how dangerous obstacles in those areas are, and then use a decision tree to decide on a steering angle and velocity.
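Since our code is not yet released, here is only an illustrative sketch of the idea of zoning an 8×8 ToF frame and mapping it to a steering decision; the zone layout and thresholds are assumptions, not the values from the paper:

```python
import numpy as np

def avoid(depth_mm, valid):
    """depth_mm, valid: 8x8 arrays from the multi-zone ToF sensor.
    Returns (steering angle in degrees, forward velocity in m/s)."""
    d = np.where(valid, depth_mm, np.inf)        # ignore zones flagged as invalid
    left   = d[2:6, 0:3].min()                   # central rows, left columns
    center = d[2:6, 3:5].min()                   # central rows, middle columns
    right  = d[2:6, 5:8].min()                   # central rows, right columns
    if center > 1500 and min(left, right) > 800: # path clear: fly straight
        return 0.0, 0.5
    if left > right:                             # obstacle closer on the right: steer left
        return -30.0, 0.2
    return 30.0, 0.2                             # otherwise steer right

yaw_deg, vel = avoid(np.full((8, 8), 4000), np.ones((8, 8), dtype=bool))
```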


Using only 0.31% of the computational power and with 210 μs latency, we reached our goal of developing an efficient obstacle avoidance algorithm. Our system is also low-power: lifting the additional sensor with all accompanying electronics, as well as supplying it, takes less than 10% of the whole drone’s power. On average, our system reaches a flight time of around 7 minutes. We refer to our preprint [3] for details on our various tests – they include flights covering distances of up to 212 m with 100% reliability, and high agility at low speed in an office environment.

As our paper is currently submitted but not yet accepted, our code and dataset are not yet released – however, the hardware design is already accessible: https://github.com/ETH-PBL/Matrix_ToF_Drones

[1] V. Niculescu, H. Müller, I. Ostovar, T. Polonelli, M. Magno and L. Benini, “Towards a Multi-Pixel Time-of-Flight Indoor Navigation System for Nano-Drone Applications,” 2022 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), 2022, pp. 1-6, doi: 10.1109/I2MTC48687.2022.9806701.
[2] I. Ostovar, V. Niculescu, H. Müller, T. Polonelli, M. Magno and L. Benini, “Demo Abstract: Towards Reliable Obstacle Avoidance for Nano-UAVs,” 2022 21st ACM/IEEE International Conference on Information Processing in Sensor Networks (IPSN), 2022, pp. 501-502, doi: 10.1109/IPSN54338.2022.00051.
[3] H.Müller, V. Niculescu, T. Polonelli, M. Magno and L. Benini “Robust and Efficient Depth-based Obstacle Avoidance for Autonomous Miniaturized UAVs”, submitted to IEEE, preprint: https://arxiv.org/abs/2208.12624

This week’s guest blogpost is from Xinyu Cai from the research group of ShaoHui Foong, located in the Engineering Product Development faculty at the Singapore University of Technology and Design. Please check out their YouTube channel. Enjoy!

Unmanned Aerial Vehicles (UAVs) have garnered much attention from both researchers and engineers in recent decades. Aerial robots are generally classified into three main categories: fixed wing, rotary wing and flapping wing.

Fixed-wing aircraft are among the most common aerial vehicles, as they have relatively higher power efficiency and payload capacity than the other types, thanks to their big and highly customizable wings. But this also leads to a bigger footprint and usually a lack of Vertical Take-Off and Landing (VTOL) capability. Rotary wings generally include helicopters and multirotors (such as quadrotors), and they have recently become increasingly popular in our daily lives. Easily achieving great performance in attitude and position control, rotary wings are widely applied in many fields. Flapping-wing robots take inspiration from small flapping insects (such as the Harvard RoboBee) or birds (such as the Purdue Hummingbird Robot).

Fig: A simple prototype of SAM from SUTD with Crazyflie Bolt.

Monocopters are largely inspired by the falling motion of maple seeds, and they are much simpler to build than their counterparts. They keep a relatively small footprint and achieve decent control performance even though they are highly underactuated. The Single Actuator Monocopter (SAM) is able to VTOL, perform 3D trajectory tracking and maintain high hovering efficiency. With those advantages, rapid developments have been made in recent years, such as the Foldable Single Actuator Monocopter (F-SAM) and the Modular Single Actuator Monocopter (M-SAM) from Engineering Product Development (EPD) at the Singapore University of Technology and Design (SUTD).

Taking inspiration from nature – Samara inspired monocopter

A descending samara, or maple seed, is able to passively enter an auto-rotation motion and stabilize its flight attitude, helping to slow its descent and travel further for better survival of the species. This natural behavior attracts interest from scientists and researchers. From previous studies, we learned that this passive attitude stability is mainly guaranteed by the mass distribution (center of mass) and the wing geometry (center of pressure), as well as the rotation motion itself.

A maple seed inspired Single Actuator Monocopter (SAM).

The SAM is designed to be very close in its mechanical make-up to its natural sibling, having a large single-wing structure and a smaller, denser ‘seed’ structure. A single motor with a propeller is installed on the leading edge, parallel to the wing surface. Compared with the flight dynamics of the original maple seed, SAM experiences extra torques and force caused by the spinning propeller, including the reaction torque and thrust directly from the propeller, as well as an extra torque caused by the precession motion. As a result, the balance of the combined forces and torques allows SAM to settle into a new equilibrium condition while still retaining the passive attitude stability.

Development of monocopters

Research on monocopters goes back a long way; a few examples of different airframes illustrate their development. The Robotic Samara [1], created in 2010, has a motor to provide the rotational force, a servo to control the collective pitch of the wing, a winged body fabricated from carbon fiber, and a LiPo battery. In the following year, the Samarai MAV [2] was developed, following the mass distribution of a natural maple seed; control is achieved with a servo that regulates the wing flap. In 2020, a single actuator monocopter with a simplified airframe was introduced [3]. Its main structure is laminated balsa wood, while the trailing edge of the wing is foam for better mass distribution. By exploiting the passive attitude stability, only one actuator is needed to control the position in 3D space. Building on this design, F-SAM [4] and M-SAM [5] were developed in 2021 and 2022, respectively.

SAM with foldable wing structure (F-SAM).

A Modular SAM (M-SAM) with Crazyflie Bolt

Thanks to its easy implementation and reliable performance, we use the Crazyflie Bolt as the flight controller for M-SAM. As in many robotic systems, the ground station is connected to a motion capture system (position and attitude feedback, used both for control and as ground truth) and a joystick (control references generated directly by the user), and it sends the filtered state feedback and control references, or control signals directly, to the flight controller. This is realized with the Crazyradio PA and the crazyflie-lib-python environment. Simple modifications to the original firmware map the control references to motor commands (a customized flight controller).

A diagram showing how the Crazyflie Bolt is used in the M-SAM project.
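As a rough illustration (not the actual M-SAM ground-station code), the sketch below shows how commands can be streamed to a Bolt over the Crazyradio PA with crazyflie-lib-python. The radio URI is a placeholder, and the generic roll/pitch/yaw-rate/thrust setpoint is only a stand-in for the custom control references that the modified M-SAM firmware expects.

import time
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M/E7E7E7E7E7'  # placeholder radio address of the Bolt

def stream_commands(scf):
    cf = scf.cf
    cf.commander.send_setpoint(0, 0, 0, 0)          # unlock the commander
    for _ in range(100):                             # ~2 s of commands at 50 Hz
        # roll, pitch, yaw rate, thrust; in M-SAM the modified firmware
        # reinterprets these fields as its own control references
        cf.commander.send_setpoint(0.0, 0.0, 0.0, 20000)
        time.sleep(0.02)
    cf.commander.send_stop_setpoint()

if __name__ == '__main__':
    cflib.crtp.init_drivers()
    with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
        stream_commands(scf)

In the actual setup, the values sent over this link would be the filtered state feedback and control references described above, decoded by the customized flight controller on the Bolt.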

Another advantage of using the Crazyflie Bolt in the M-SAM project is its open-source swarm library. Within the swarm environment, SAMs can fly in both singular and cooperative configurations. With simple human assistance, two SAMs can be assembled into the cooperative configuration using a pair of magnetic connectors. The mid-air separation back into the singular configuration is triggered passively by increasing the rotation speed until the centrifugal force overcomes the magnetic force.

Modular Single Actuator Monocopters (M-SAM), which can fly in both singular and cooperative configurations.
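A similarly minimal sketch of the swarm side, again with placeholder URIs and an assumed thrust ramp standing in for the real spin-up logic, shows how the same command can be sent to both SAM modules through the cflib Swarm class:

import time
import cflib.crtp
from cflib.crazyflie.swarm import CachedCfFactory, Swarm

# Placeholder radio addresses of the two Bolt-based SAM modules
URIS = {
    'radio://0/80/2M/E7E7E7E701',
    'radio://0/80/2M/E7E7E7E702',
}

def spin_up(scf):
    # Ramp the thrust field of the setpoint; in M-SAM a higher rotation speed
    # raises the centrifugal force until the magnetic joint lets go (the exact
    # command mapping is handled by the customized firmware).
    cf = scf.cf
    cf.commander.send_setpoint(0, 0, 0, 0)              # unlock the commander
    for thrust in range(20000, 40000, 2000):
        cf.commander.send_setpoint(0.0, 0.0, 0.0, thrust)
        time.sleep(0.1)
    cf.commander.send_stop_setpoint()

if __name__ == '__main__':
    cflib.crtp.init_drivers()
    with Swarm(URIS, factory=CachedCfFactory(rw_cache='./cache')) as swarm:
        swarm.parallel_safe(spin_up)                     # same ramp on both modules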

Potential applications

What kinds of applications can be achieved with the monocopter platform? On the one hand, many applications are ruled out by the constant self-rotation. On the other hand, the passively rotating body offers advantages in some special scenarios. For example, SAM is an ideal platform for LIDAR applications, which usually require a rotating motion to sense the surrounding environment. Moreover, thanks to its simple mechanical design and low manufacturing cost, SAM can be designed for single use, such as lightweight air deployment or missions into unknown or dangerous environments.

An example [6] of the potential applications of a rotating robot with a camera.

References

  • [1] Ulrich, Evan R., Darryll J. Pines, and J. Sean Humbert. “From falling to flying: the path to powered flight of a robotic samara nano air vehicle.” Bioinspiration & biomimetics 5, no. 4 (2010): 045009.
  • [2] Fregene, Kingsley, David Sharp, Cortney Bolden, Jennifer King, Craig Stoneking, and Steve Jameson. “Autonomous guidance and control of a biomimetic single-wing MAV.” In AUVSI Unmanned Systems Conference, pp. 1-12. Arlington, VA: Assoc. for Unmanned Vehicle Systems International, 2011.
  • [3] Win, Luke Soe Thura, Shane Kyi Hla Win, Danial Sufiyan, Gim Song Soh, and Shaohui Foong. “Achieving efficient controlled flight with a single actuator.” In 2020 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), pp. 1625-1631. IEEE, 2020.
  • [4] Win, Shane Kyi Hla, Luke Soe Thura Win, Danial Sufiyan, and Shaohui Foong. “Design and control of the first foldable single-actuator rotary wing micro aerial vehicle.” Bioinspiration & Biomimetics 16, no. 6 (2021): 066019.
  • [5] X. Cai, S. K. H. Win, L. S. T. Win, D. Sufiyan and S. Foong, “Cooperative Modular Single Actuator Monocopters Capable of Controlled Passive Separation,” 2022 International Conference on Robotics and Automation (ICRA), 2022, pp. 1989-1995, doi: 10.1109/ICRA46639.2022.9812182.
  • [6] Bai, Songnan, Qingning He, and Pakpong Chirarattananon. “A bioinspired revolving-wing drone with passive attitude stability and efficient hovering flight.” Science Robotics 7, no. 66 (2022): eabg5913.

This week we have a guest blog post from Jiawei Xu and David Saldaña from the Swarmslab at Lehigh University. Enjoy!

Limits of flying vehicles

Advancements in technology have made quadrotor drones more accessible and easier to integrate into a wide variety of applications. Compared to traditional fixed-wing aircraft, quadrotors are more flexible to design and better suited to maneuvers such as static hovering. Some examples of quadrotor applications include photographers mounting cameras on them to take bird’s-eye-view images and delivery companies using them to deliver packages. However, while more versatile than other aerial platforms, quadrotors are still limited in their capability by several factors.

First, quadrotors are limited by their lift capacity, i.e., their strength. For example, a Crazyflie 2.1 is able to fly and carry a light payload such as an AI deck, but it cannot carry a GoPro camera. A lifter quadrotor equipped with more powerful components can transport a heavier payload, but it also consumes more energy and requires more free space to operate. The difference in strength between individual quadrotors creates a dilemma when choosing which drone components are best suited for a task.

Second, a traditional quadrotor’s translational motion is coupled with its roll and pitch. Let’s take a closer look at the Crazyflie 2.1, which uses a traditional quadrotor design. Its four motors are all oriented in the same direction, along the positive z-axis of the drone frame, which makes it impossible to move horizontally without tilting. Control policies that convert a desired motion direction into tilt angles are well studied, proven to work, and implemented on a variety of platforms [1][2]; still, if we were to place a glass of milk on top of a quadrotor and send it from the kitchen to the bedroom, we should expect milk stains on the floor. This lack of independent control of rotation and translation is another primary reason why multi-rotor drones lack versatility.

Fig 1. A Crazyflie has four propellers generating parallel thrust forces. Credit: https://robots.ros.org/crazyflie/
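To make this coupling concrete, the standard rigid-body translation model for a quadrotor (a textbook relation, not specific to this post) is

m \ddot{p} = -m g e_3 + (f_1 + f_2 + f_3 + f_4) R e_3,

where p is the position, R is the attitude, e_3 = (0, 0, 1), and f_i are the rotor thrusts. The thrust only enters along R e_3, the body z-axis, so any horizontal acceleration requires tilting that axis away from vertical.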

Improving strength

These versatility problems are caused by the hardware of a multi-rotor drone being designed for a specific set of tasks. If we push the boundary of these preset tasks, the requirements on the strength and controllability of the multi-rotor drone eventually become impossible to satisfy. However, there is one inspiration we take from nature to improve the versatility in strength of multi-rotor drones: modularity! Ants are weak individual insects that are not versatile enough to deal with complex tasks on their own. However, when a group of ants needs to cross a natural boundary, they swarm together to build capable structures like bridges and boats. In our previous work, ModQuad [3], we created modules that can fly by themselves and lift light payloads. As more ModQuad modules assemble into larger structures, they provide an increasing amount of lift force. The system shows that we can combine weak modules to scale the carrying capacity of the structure. To carry a small payload like a pin-hole camera, a single module can accomplish the task; to lift a heavier object, we only need to assemble as many modules as the required lift demands.
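As a back-of-the-envelope illustration of this lift-scaling argument (with made-up thrust and weight numbers, not ModQuad specifications), the number of modules needed for a given payload is simply the payload weight divided by the thrust margin each module has left after lifting itself:

import math

# Hypothetical numbers, only to illustrate the scaling argument:
MODULE_THRUST_N = 0.60      # max thrust of one module (assumed)
MODULE_WEIGHT_N = 0.35      # weight of one module (assumed)
MARGIN_N = MODULE_THRUST_N - MODULE_WEIGHT_N

def modules_needed(payload_weight_n, usable_fraction=0.8):
    """Modules required to hover with a payload, keeping some thrust headroom."""
    usable_margin = MARGIN_N * usable_fraction
    return math.ceil(payload_weight_n / usable_margin)

print(modules_needed(0.1))   # a pin-hole-camera-sized payload -> 1 module
print(modules_needed(0.8))   # a heavier object -> 4 modules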

Improving controllability

On a traditional quadrotor, each propeller is oriented vertically, so the vehicle cannot generate force in the horizontal direction. By attaching modules side by side in a ModQuad structure, we only align more rotors in parallel, which still does not contribute any horizontal force. That is how we came up with the idea of H-ModQuad: a versatile multi-rotor drone that can move in an arbitrary direction at an arbitrary attitude. By tilting the rotors of quadrotor modules and docking different types of modules together, we obtain a structure whose rotors do not all point in the same direction, so some of them can generate force along the horizontal direction.

H-ModQuad Design

H-ModQuad has two major characteristics, modularity and heterogeneity, indicated by the “Mod” and the “H-” in its name. Modularity means that the vehicle (which we call a structure) is composed of multiple smaller modules that are able to fly by themselves. Heterogeneity means that a structure can contain modules of different types.

As mentioned before, insects like ants use modularity to enhance the group’s versatility. Aside from the sheer number of individuals in a swarm, which lets the group adapt to tasks of different scales, the individuals in a colony specialize in different tasks and come in different types, such as the queen, the female workers, and the males. This differentiation of types helps the colony adapt to tasks with different physical requirements. We take this inspiration to develop two types of modules.

In our related papers [4][5], we introduce two types of modules: R-modules and T-modules.

Fig 2. Major components of the H-ModQuad “T-module” we are using in our project. We use the Bitcraze Crazyflie Bolt as the central control board.

An example T-module is shown in the figure above. The rotors in a T-module are tilted around the arms that connect them to the central board. Each diagonal pair of rotors is tilted in opposite directions, and each adjacent pair tilts either in the same direction or in opposite directions. We arrange the tilting of the rotors so that, when all the propellers generate the same thrust, the module is torque-balanced. The advantage of the T-module is that it can generate more torque around the vertical axis, and a single module can also generate forces in all horizontal directions.
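To see how tilted rotors translate into extra yaw torque and horizontal force, here is a generic numpy sketch that builds the map from rotor thrusts to the body wrench for an illustrative tilted-rotor module. The arm length, tilt angle, and drag coefficient are assumed values, not the real H-ModQuad parameters:

import numpy as np

def rot_about(axis, angle):
    """Rotation matrix about a unit axis by an angle (Rodrigues' formula)."""
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def wrench_matrix(positions, tilt_axes, tilt_angles, spin_dirs, c_q=0.01):
    """6xN map from rotor thrusts to the body wrench [force; torque].
    c_q is an assumed drag-to-thrust coefficient of the propellers."""
    cols = []
    for p, ax, ang, s in zip(positions, tilt_axes, tilt_angles, spin_dirs):
        z = rot_about(ax, ang) @ np.array([0.0, 0.0, 1.0])   # thrust direction
        force = z
        torque = np.cross(p, z) + s * c_q * z                # moment arm + drag torque
        cols.append(np.concatenate([force, torque]))
    return np.column_stack(cols)

# Illustrative geometry (assumed, not the real H-ModQuad parameters):
# X configuration, 0.1 m arms, rotors tilted 20 degrees about their own arms.
L, a = 0.1, np.deg2rad(20)
positions = [np.array([ L,  L, 0.0]), np.array([-L,  L, 0.0]),
             np.array([-L, -L, 0.0]), np.array([ L, -L, 0.0])]
arm_axes  = [p / np.linalg.norm(p) for p in positions]
tilts     = [a, -a, a, -a]        # diagonal rotors end up tilted in opposite directions
spins     = [1, -1, 1, -1]        # alternating propeller spin directions

A = wrench_matrix(positions, arm_axes, tilts, spins)
f = np.full(4, 0.3)               # equal thrust on all four rotors
print(np.round(A @ f, 4))         # pure vertical force, zero net torque
print(np.linalg.matrix_rank(A))   # independent wrench directions of one module (4)

With equal thrusts the net wrench is a purely vertical force with zero torque, matching the torque-balanced arrangement described above, while differential thrusts produce horizontal forces and yaw torque. Stacking the columns of several modules (with their positions expressed in the structure frame) and checking for rank 6 is one way to verify that an assembled structure, like the ones shown further below, is fully actuated.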

An R-module has all its propellers oriented in the same direction, which is not along the z-axis of the module. In this configuration, when multiple modules are assembled together, rotors from different modules point in different directions within the overall structure. The picture below shows a fully actuated structure composed of R-modules. The advantage of R-modules is that the rotor thrusts inside a module all point in the same direction, which is more efficient when hovering.

Structure 1: Composed of four types of R-modules.

Depending on which types of modules we choose and how we arrange them, the assembled structure obtains different actuation capabilities. Structure 1 is composed of four R-modules and can translate in horizontal directions efficiently without tilting. The picture in the intro shows a structure composed of four T-modules of two types; it can hover while maintaining a tilt angle of up to 40 degrees.

Control and implementation

We implemented our new geometric controller for H-ModQuad structures on Crazyflie Bolt control boards, building on the Crazyflie firmware. Specifically, aside from tuning the PID parameters, we had to change power_distribution.c and controller_mellinger.c so that the code conforms to the structure model. In addition, we created a new firmware module that embeds the desired states along predefined trajectories. When we send a timestamp for a selected trajectory, the module retrieves the full desired state and passes it to the Mellinger controller. All the modifications we made to the firmware to make the drone behave the way we want can be found in our GitHub repository. We also recommend using the modified crazyflie_ros to establish communication between the base station and the drone.
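The actual changes live in the C firmware, but the idea of the trajectory module can be sketched in a few lines of Python: the ground station sends only a timestamp, and an on-board lookup turns it into the full desired state for the Mellinger-style controller. The circular trajectory and the state layout below are illustrative assumptions, not the authors’ implementation:

import numpy as np
from dataclasses import dataclass

@dataclass
class DesiredState:
    pos: np.ndarray   # desired position [m]
    vel: np.ndarray   # desired velocity [m/s]
    acc: np.ndarray   # desired acceleration [m/s^2]
    yaw: float        # desired yaw [rad]

def circle_trajectory(t, radius=0.5, omega=1.0, height=1.0):
    """Evaluate a predefined circular trajectory at timestamp t (in seconds)."""
    c, s = np.cos(omega * t), np.sin(omega * t)
    pos = np.array([radius * c, radius * s, height])
    vel = np.array([-radius * omega * s, radius * omega * c, 0.0])
    acc = np.array([-radius * omega ** 2 * c, -radius * omega ** 2 * s, 0.0])
    return DesiredState(pos, vel, acc, yaw=0.0)

# The ground station sends only a timestamp; a firmware module would do this
# lookup on board and hand the full desired state to the geometric controller.
setpoint = circle_trajectory(t=2.5)
print(setpoint.pos, setpoint.vel, setpoint.acc)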

Videos

Challenges and Conclusion

Different from the original Crazyflie 2.x, the Bolt allows the use of brushless motors, which are much more powerful. We had to design a frame using carbon fiber rods and 3D-printed connecting parts so that the chassis is sturdy enough to hold the control board, the ESCs, and the motors. It took some time to find the sweet spot in the combination of motor model, propeller size, batteries, and so on. Communicating with four modules at the same time also caused us some problems. The now-archived ROS library, crazyflie_ros, sometimes drops random packets when working with multiple Crazyflie drones, leading to stuttering behavior of the structure in flight. That is one of the reasons we decided to migrate our code base to the newer Crazyswarm library instead. The success of our design, implementation, and experiments with the H-ModQuads is proof that we can indeed use modularity to improve the versatility of multi-rotor flying vehicles. As a next step, we are planning to integrate tool modules into the H-ModQuads to show how we can further increase the versatility of the drones so that they can deal with real-world tasks.

References

[1] D. Mellinger and V. Kumar, “Minimum snap trajectory generation and control for quadrotors,” in 2011 IEEE International Conference on Robotics and Automation, 2011, pp. 2520–2525.

[2] T. Lee, M. Leok, and N. H. McClamroch, “Geometric tracking control of a quadrotor UAV on SE(3),” in 49th IEEE Conference on Decision and Control (CDC), 2010, pp. 5420–5425.

[3] D. Saldaña, B. Gabrich, G. Li, M. Yim and V. Kumar, “ModQuad: The Flying Modular Structure that Self-Assembles in Midair,” 2018 IEEE International Conference on Robotics and Automation (ICRA), 2018, pp. 691-698, doi: 10.1109/ICRA.2018.8461014.

[4] J. Xu, D. S. D’Antonio, and D. Saldaña, “Modular multi-rotors: From quadrotors to fully-actuated aerial vehicles,” arXiv preprint arXiv:2202.00788, 2022.

[5] J. Xu, D. S. D’Antonio and D. Saldaña, “H-ModQuad: Modular Multi-Rotors with 4, 5, and 6 Controllable DOF,” 2021 IEEE International Conference on Robotics and Automation (ICRA), 2021, pp. 190-196, doi: 10.1109/ICRA48506.2021.9561016.