
It’s time for a new compilation video about how the Crazyflie is used in research! The last one already featured a lot of awesome work, but a lot has happened since then, both in research and at Bitcraze.

As usual, the hardest part of making these videos is choosing the work we want to feature – if every cool video of the Crazyflie were in there, it would last for hours! So this is just a selection of the most videogenic projects we’ve seen. You can find a more extensive list of our products used in research here.

We’ve seen a lot of projects that used the modularity of the Crazyflie to create awesome new features, like a catenary robot, wall tracking or landing upside down. The Crazyflie board was even made into a revolving wing drone. New sensors were used to sniff out gas leaks (the Sniffy Bug, as seen in this blogpost) or to allow autonomous navigation. Swarms are also a research topic where we see a lot of the Crazyflie, this time for collision avoidance or path planning. We also see more and more simulators, which are used for huge swarms or physics tests.

Once again, we were surprised and awed by all the awesome things that the community did with the Crazyflie. Hopefully this will inspire others to think of new things to do as well. We hope that we can continue helping you make your ideas fly, and don’t hesitate to share the awesome projects you’re working on with us!

Here is a list of all the research that has been included in the video:

And, without further ado, here it is:

My name is Hanna, and I just started as an intern at Bitcraze. However, it is not my first time working with a drone or even the Crazyflie, so I’ll tell you a bit about how I ended up here.

The first time I used a drone, actually even a Crazyflie, was in a semester thesis at ETH Zurich in 2017, where my task was to extend a Crazyflie with a Parallel Ultra Low-Power (PULP) System-on-Chip (SoC) connected to a camera and external memory. This was the first prototype of the AI-deck you can buy here nowadays (as used here) :)

My next drone adventure was an internship at a company building tethered drones for firefighters – a much bigger system than the Crazyflie. I was in charge of the update system, so more on the firmware side this time. It was a very interesting experience, but I swore never to build a system with more than three microcontrollers in it again.

This, and a liking for tiny and restricted embedded systems, brought me back to smaller drones. I did my master’s thesis back at ETH on developing a PULP-based nano-drone (nano-drones are just tiny drones that fit approximately in the palm of your hand and use only around 10 watts of power – the category Crazyflies fit in) and some onboard intelligence for it. As a starting point, we used the Crazyflie, both for the hardware and the software. Porting the firmware to a processor that, at the time, had only a very basic operating system turned out to be a very hard task. Still, eventually I knew almost every last detail of the Crazyflie firmware, and it actually flew.

However, this needed some more time than the master’s thesis allowed – in the meantime, I started to pursue a PhD at ETH Zurich. I am working towards autonomous miniaturized drones – so besides the part with the tiny PULP-based drone I already told you about, I also work on the “autonomous” part. Contrary to many other labs, our focus is not only on novel algorithms; we also work with novel sensors and processors. Two very interesting recent developments for us are a multi-zone Time-of-Flight sensor and the novel GAP9 processor, both of which fit on a Crazyflie in terms of power, size and weight. This enables new possibilities in obstacle avoidance, localization, mapping and many more. Last year my colleagues and I posted a blog post about our newest advances in obstacle avoidance (here, with videos!). More recently, we worked on onboard localization, using novel multi-zone Time-of-Flight sensors and the very new GAP9 processor to execute Monte Carlo localization onboard a Crazyflie (arxiv).
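For readers unfamiliar with Monte Carlo localization: the idea is to keep a cloud of pose hypotheses (“particles”) that are moved with the odometry, weighted by how well the predicted sensor reading matches the actual one, and then resampled. Below is a minimal Python sketch of one such step. The `raycast` map helper is hypothetical, and the real implementation is of course optimized code running on the GAP9, not this:

```python
import numpy as np

def mcl_step(particles, odom_delta, tof_distance, raycast, sigma=0.05):
    """One predict-weight-resample step of Monte Carlo localization.

    particles:    (N, 3) array of [x, y, yaw] pose hypotheses
    odom_delta:   [dx, dy, dyaw] motion estimate since the last step
    tof_distance: measured distance (m) from one ToF zone
    raycast(p):   expected distance in the known map from pose p (hypothetical)
    """
    n = len(particles)
    # Predict: move every hypothesis with the odometry, plus some noise
    particles = particles + odom_delta + np.random.normal(0.0, sigma, particles.shape)
    # Weight: hypotheses whose expected ToF reading matches the measurement survive
    expected = np.array([raycast(p) for p in particles])
    weights = np.exp(-0.5 * ((expected - tof_distance) / sigma) ** 2)
    weights /= weights.sum()
    # Resample: draw a new particle set in proportion to the weights
    return particles[np.random.choice(n, size=n, p=weights)]
```

With a multi-zone sensor, the weighting step simply compares all zones at once, which constrains the pose much better than a single-beam ranger would.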

On the left you see an example of a multi-zone Time-of-Flight image (the background is a picture from the AI-deck), from here. On the right you see our prototype for localization in action – from our DATE23 paper (arxiv).

For me, localizing with a given map is a fascinating topic and one of the reasons I ended up in Sweden. It is one of the most basic skills of robots or even humans to navigate from A to B as fast as possible, and the basis of my favourite sport. The sport is called “orienteering” and is about running as fast as possible to some checkpoints on a map, usually through a forest. It is a very common sport in Sweden, which is the reason I started learning Swedish some years ago. So when the opportunity to go to Malmö for some months to join Bitcraze presented itself, I was happy to take it – not only because I like the company philosophy, but also because I just like to run around in Swedish forests :)

Now I am looking forward to my time here. I hope to learn lots about drones, firmware, new sensors, production, testing and company organization, and to meet a lot of new, nice people!

Greetings from Malmö – it can be a bit cold and rainy, but the sea and landscape are beautiful!

Hanna

This week’s guest blogpost is from Frederike Dümbgen, presenting the latest work from her PhD project at the Laboratory of Audiovisual Communications (LCAV), EPFL. She is currently a Postdoc at the University of Toronto. Enjoy!

Bats navigate using sound. As a matter of fact, the ears of a bat are so much better developed than their eyes that bats cope better with being blindfolded than with having their ears covered. It was precisely this experiment that helped the discovery of echolocation, which is the principle bats use to navigate [1]. Broadly speaking, in echolocation, bats emit ultrasonic chirps and listen for their echoes to perceive their surroundings. Since its discovery in the 18th century, astonishing facts about this navigation system have been revealed — for instance, bats vary their chirps depending on the task at hand: a chirp that’s good for locating prey might not be good for detecting obstacles, and vice versa [2]. Depending on the characteristics of the reflected echoes, bats can even classify certain objects — this ability helps them find, for instance, water sources [3]. Wouldn’t it be amazing to harvest these findings to build novel navigation systems for autonomous agents such as drones or cars?

Figure 1: Meet “Crazybat”: the Crazyflie equipped with our custom audio deck including 4 microphones, a buzzer, and a microcontroller. Together, they can be used for bat-like echolocation. The design files and firmware of the audio extension deck are openly available, as is a ROS2-based software stack for audio-based navigation. We hope that fellow researchers can use this as a starting point for further pushing the limits of audio-based navigation in robotics. More details can be found in [4].

The quest for the answer to this question led us — a group of researchers from the École Polytechnique Fédérale de Lausanne (EPFL) — to design the first audio extension deck for the Crazyflie drone, effectively turning it into a “Crazybat” (Figure 1). The Crazybat has four microphones, a simple piezo buzzer, and an additional microprocessor used to extract relevant information from audio data, to be sent to the main processor. All of these additional capabilities are provided by the audio extension deck, for which both the firmware and hardware design files are openly available.1

Video 1: Proof of concept of distance/angle estimation in a semi-static setup. The drone is moved using a stepper motor. More details can be found in [4].

In our paper on the system [4], we show how to use chirps to detect nearby obstacles such as glass walls. Difficult to detect using lasers or cameras, glass walls are excellent sound reflectors and thus a good candidate for audio-based navigation. We show in a first semi-static feasibility study that we can locate a glass wall with centimeter accuracy, even in the presence of loud propeller noise (Video 1). When moving to a flying drone and different kinds of reflectors, the problem becomes significantly more challenging: motion jitter, varying propeller noise and tight real-time constraints make it much harder to solve. Nevertheless, first experiments suggest that sound-based wall detection and avoidance is possible (Figure 2 and Video 2).

Video 2: The “Crazybat” drone actively avoiding obstacles based on sound.
Figure 2: Qualitative results of sound-based wall localization on the flying “Crazybat” drone. More details can be found in [4].

The principle we use to make this work is acoustic interference. The sound “bounces off” the wall, and the reflected and direct sound interfere either constructively or destructively, depending on the frequency and the distance to the wall. Using this same principle across the four microphones, both the angle and the distance of the closest wall can be estimated. This is, however, not the only way to navigate using sound; in fact, our software stack, available as an open-source package for ROS2, also allows the Crazybat to extract the phase differences of incoming sound at the four microphones, which can be used to determine the location of an external sound source. We believe that a truly intelligent Crazybat would be able to switch between different operating modes depending on the conditions, just like bats change their chirps depending on the task at hand.
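To give a feeling for the interference principle, here is a small Python sketch (with made-up numbers, not code from our stack) of how the received magnitude varies with frequency for a given reflected-path length; the spacing of the destructive dips encodes the distance to the wall:

```python
import numpy as np

C = 343.0  # speed of sound in air, m/s

def magnitude_vs_frequency(freqs, extra_path, reflection_gain=0.5):
    """Magnitude of direct + wall-reflected sound at the microphone.

    The reflected path is `extra_path` meters longer than the direct one,
    so the two copies arrive with a frequency-dependent phase difference.
    """
    phase = 2 * np.pi * freqs * extra_path / C
    return np.abs(1 + reflection_gain * np.exp(-1j * phase))

freqs = np.linspace(1000, 5000, 400)   # chirp sweep range (illustrative)
mags = magnitude_vs_frequency(freqs, extra_path=0.4)

# Destructive dips repeat every C / extra_path Hz, so measuring their spacing
# reveals the path difference; with buzzer and microphone close together,
# the wall sits at roughly half the extra path length.
print(f"expected dip spacing: {C / 0.4:.0f} Hz")
```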

Note that the ROS2 software stack is not limited to the Crazybat — we have isolated the hardware-dependent components so that the audio-based navigation algorithms can be ported to any platform. As an example, we include results on the small wheeled e-puck2 robot in [4], which shows better performance than the Crazybat thanks to the absence of propeller noise and motion jitter.

This research project has taught us many things, above all an even greater admiration for the abilities of bats! Dealing with sound is pretty hard and very different from other prevalent sensing modalities such as cameras or lasers. Nevertheless, we believe it is an interesting alternative for scenarios with poor eyesight, limited computing power or memory. We hope that other researchers will join us in the quest to exploit audio for navigation, and that the tools we make publicly available — both the hardware and the software stack — lower the entry barrier for new researchers.

1 The audio extension deck works in a “plug-and-play” fashion like all other Crazyflie extension decks. It has been tested in combination with the Flow deck, for stable flight in the absence of a more advanced localization system. The deck performs frequency analysis on the incoming raw audio data from the 4 microphones and sends the relevant information to the Crazyflie, where a custom driver converts it to CRTP packets and sends it to the base station for further processing in the ROS2 stack.

References

[1] Galambos, Robert. “The Avoidance of Obstacles by Flying Bats: Spallanzani’s Ideas (1794) and Later Theories.” Isis 34, no. 2 (1942): 132–40. https://doi.org/10.1086/347764.

[2] Fenton, M. Brock, Alan D. Grinnell, Arthur N. Popper, and Richard R. Fay, eds. “Bat Bioacoustics.” In Springer Handbook of Auditory Research, 1992. https://doi.org/10.1007/978-1-4939-3527-7.

[3] Greif, Stefan, and Björn M Siemers. “Innate Recognition of Water Bodies in Echolocating Bats.” Nature Communications 1, no. 106 (2010): 1–6. https://doi.org/10.1038/ncomms1110.

[4] F. Dümbgen, A. Hoffet, M. Kolundžija, A. Scholefield and M. Vetterli, “Blind as a Bat: Audible Echolocation on Small Robots,” in IEEE Robotics and Automation Letters (Early Access), 2022. https://doi.org/10.1109/LRA.2022.3194669.

It’s the end of the year, and as usual, it’s time to be a little nostalgic and look back at what happened at Bitcraze during the last 12 months.

Community

2022 marked the easing out of the pandemic, and we finally got the opportunity to attend onsite, physical conferences for the first time since 2020.

First, Kimberly spent some time in the spring visiting some of our users in labs across Europe (we called it the Grand Tour). Then we visited IMAV in the Netherlands, where we saw an amazing competition involving the AI deck. The Crazyflie was also featured in a hackathon in Stockholm in June.

But the conferences we’d been longing for the most, and that took a good chunk of our time, were IROS and ROSCon in Japan. Preparations were intense, and for the first time, all of us were gone for one week! Our intern Marios worked on the demo during the summer, and we presented a fully autonomous demo. We were really glad to spend time in this beautiful country to show our stuff, meet people and discover new ways researchers use the Crazyflie.

We also had our very first Mini BAM, with Flapper Drones and CollMot. Worth noting: Mark Rober used the Crazyflie as a glitter dispenser in his latest video, in which the drone was designed to fly (without a positioning system!) out of a box where it sat charging the whole time.

Guest blog posts

And since we had more opportunities to meet our customers, we also had some interesting guest posts on our blog!

Software

We worked on 5 releases this year!

We finally got the AI deck out of early access, with new, improved infrastructure. We even got a nice example of using the AI deck for CRTP over WiFi (via CPX)!

We also spent some time on our positioning systems. One big win at the beginning of the year was adding the possibility to use more than 2 base stations with Lighthouse. We also improved the Lighthouse geometry estimation. But Lighthouse was not the only one to receive our love: we worked on scaling up the Loco positioning system, which was nicely demonstrated in the New Year’s video.

Kimberly created a nice simulation model for the Crazyflie, now officially available in Gazebo. We also switched to Kbuild. And the development of Crazyswarm2 and the implementation of ROS2 took (and is still taking) some time.

Hardware

We got new motors and propellers for increased thrust; they are now available in the store! For the first time, we will also have a product made and designed by a third party, namely the Nimble+ designed by Flapper Drones. I heard that the Christmas elves are working hard to get it to us soon!

We also had some upgrades on the Lighthouse, SD-card and Big-Quad decks.

These last couple of months, we also dedicated a lot of time to a new Crazyradio and a new communication architecture.

Documentation

After 10 years of loyal service, we retired the forum in favor of GitHub Discussions. We also improved the client: GUI, Lighthouse and Bolt improvements, and some debug tools.

Bitcraze

A lot changed here too! Jonas left and Arnaud took his parental leave, so, two people short, we felt quite understaffed… That’s why we started looking for new Bitcrazers to join the team.

Thankfully, some people joined us, though temporarily: Marios worked here during the summer, and Victor joined part-time to help out too.

As usual, it’s always nice to see all the things we’ve done in the span of one year, and we’re happy with the progress we’ve made in 2022!

This year, the traditional Christmas video was overtaken by a big project that we had at the end of November: creating a test show with the help of CollMot.

First, a little context: CollMot is a show company based in Hungary that we’ve partnered with on a regular basis, brainstorming about show drones and discussing the possibilities for indoor drone shows in general. They developed Skybrush, an open-source software for controlling swarms. We had wanted to work with them for a long time.

So, when the opportunity came to rent an old train hall that we visit often (because it’s right next to our office and hosts good street food), we jumped on it. The place itself is huge, with massive pillars, pits for train maintenance, a high ceiling with metal beams and a really funky industrial look. The idea was to do a technology test and find out whether we could scale the Loco positioning system up to a larger space. This was also the perfect time to invite the guys at CollMot over for some exploring and hacking.

The train hall

The Loco system

We added the TDoA3 Long Range mode recently, and experiments in our test lab indicated that the Loco positioning system should work in a bigger space with up to 20 anchors, but we had not actually tried it in a larger space.

The maximum radio range between anchors is probably up to around 40 meters in the Long Range mode, but we decided to set up a system that was only around 25×25 meters, with 9 anchors in the ceiling and 9 anchors on the floor, placed in 3-by-3 matrices. The reason we did not go bigger is that the height of the space is around 7-8 meters, and we did not want to end up with a system that is too wide in relation to its height, as this would reduce Z accuracy. This setup gave us 4 cells of 12×12×7 meters, which should be OK.
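For reference, the anchor layout is easy to express in a few lines of Python (the coordinates are illustrative, not the exact surveyed positions):

```python
# Two 3-by-3 grids of Loco anchors, one on the floor and one in the ceiling.
import itertools

SIDE = 25.0    # side length of the covered area, meters
HEIGHT = 7.0   # approximate ceiling height, meters

anchors = [
    (x, y, z)
    for x, y in itertools.product([0.0, SIDE / 2, SIDE], repeat=2)
    for z in (0.0, HEIGHT)
]
print(len(anchors), "anchors")  # 18 anchors, forming 4 cells of roughly 12x12x7 m
```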

Finding a solution to get the anchors up to the 8-meter ceiling – and to get them down again easily – was also a head-scratcher, but with some ingenuity (and meat hooks!) we managed to create a system. We only had the hall for 2 days before filming at night, and setting up the anchors in the ceiling took a big chunk out of the first day.

Drone hardware

We used 20 Crazyflie 2.1s equipped with the Loco deck, LED rings, the thrust upgrade kit and Tattu 350 mAh batteries. We soldered the pin headers to the Loco decks for better rigidity, but also because it adds a bit more height adjustability for the 350 mAh battery, which is a bit thicker than the stock battery. To make the LED ring more visible from the sides, we created a diffuser that we 3D-printed in white PLA. The full assembly weighed in at 41 grams. With the LED ring lit up almost all of the time, we concluded that the show flight should not be longer than 3-4 minutes (with some flight-time margin).

The show

CollMot, on their end, designed the whole show using Skyscript and Skybrush Studio. The aim was to have relatively simple and easily changeable formations, to be able to test a lot of different things, like the large area, speed and synchronicity. They joined us on the second day to implement the choreography and share their knowledge about drone shows.

We got some time afterwards to discuss a lot of things, and to enjoy some nice beers and dinner after a job well done. We even had time on the third day, before dismantling everything, to experiment a lot more in this huge space and gather some interesting data.

What did we learn?

Initially we had problems with positioning: we got outliers and sometimes lost tracking. Eventually we managed to trace the problems to the outlier filter. The filter was written a long time ago, and the current implementation was optimized for 8 anchors in a smaller space, which did not really work in this setup. After some tweaking the problem was solved, but we need to improve the filter for generic support of different system setups in the future.

Another problem we observed is that the Z estimate tends to get an offset that “sticks” and is not corrected over time. We do not really understand this yet, and it will require more investigation.

The outlier filter was the only major problem we had to solve; otherwise the Loco system mainly performed as expected, and we are very happy with the result! The changes to the firmware are available in this slightly hackish branch.

We also spent some time testing maximum velocities. For horizontal flight, the Crazyflies started losing positioning above 3 m/s. They could probably go much faster, but the outlier filter started having problems at higher speeds. Also, the overshoot became larger the faster we flew, which most likely could be solved with better controller tuning. For vertical flight, 3 m/s was also the maximum, limited by the deceleration when coming down. Some improvements can be made here.

The conclusion is that many things work really well, but there are still some optimizations and improvements that could be made to make the system even more robust and accurate.

The video

But enough talking – here is the never-seen-before New Year’s Eve video:

And if you’re curious to see behind the scenes:

Thanks to CollMot for their presence and valuable expertise, and InDiscourse for arranging the video!

And with the final blogpost of 2022 and this amazing video, it’s time to wish you a nice New Year’s Eve and a happy beginning of 2023!

Hey, Victor here!

I’ve been flying FPV drones for some time, and while I usually fly bigger drones (3-5 inch props), I have always wanted to put an analog camera on the Crazyflie to fly it in FPV. So, a few weeks ago, I put together a simple FPV deck using off-the-shelf components! The deck simply consists of a camera, a VTX and a DC-DC converter, soldered onto a prototype deck.

The deck is very simple and consists of only four components, and the total price (as of writing) is approximately $50:

  1. Prototyping deck
  2. Camera: RunCam Atom 10x10mm 800TVL FPV Camera
  3. VTX: TBS Unify Pro Nano 5G8
  4. DC-DC converter: 5V boost converter (necessary since both the camera and the VTX require 5V)

I did the wiring as follows:

I soldered the components onto the prototype deck and used some hot glue to attach the camera, as well as on and around the antenna to prevent it from breaking off when crashing. The deck weighs a total of 8.5 grams including connection pins.

I used the newly released upgrade kit on the Crazyflie, which makes manual flying easier since the motors and propellers make the drone a lot faster and more responsive. The upgrade kit also increases the lift capacity of the drone, which is nice, so the extra weight of the camera deck doesn’t become a problem.

Radio Controller

When flying FPV race drones you typically want a nice radio controller, and there are many options to choose from. I recently got myself a RadioMaster Zorro Radio Controller – a 4-in-1 multi-protocol radio that supports a whole variety of RC protocols, including popular ones such as FrSky, FlySky and many more. You can run the popular OpenTX or EdgeTX firmware on it, and the controller is equipped with multiple RF chips, one of which is the nRF24L01. This means that we can control the Crazyflie with the controller! While I expected to need several hacks to make this work, thanks to the awesome Bitcraze community someone had already written Crazyflie support for the controller.

Below are the steps I took to control the Crazyflie using a RadioMaster Zorro 4-in-1 controller. In short, we need two different firmwares: 1) firmware for the remote controller (like the controller OS), and 2) firmware for the internal RF module. Please note that the details of these steps might change in the future, but hopefully they can still be helpful.

  1. Download the latest OpenTX or EdgeTX firmware.
  2. Clone the repository for the internal RF module: DIY Multiprotocol TX Module.
  3. Locate the file Multiprotocol/CFlie_nrf24l01.ino in the repository and set the address of the Crazyflie that you want to connect to in the method CFLIE_initialize_rx_tx_addr().
  4. Ensure that the #define CFLIE_NRF24L01_INO is uncommented in the file Multiprotocol/_Config.h.
  5. Download Arduino IDE in order to build the code for the internal RF module.
  6. Open Arduino IDE from the Multiprotocol directory and build the code by Sketch -> Export Compiled Binary. This might take some time since the firmware is quite big. The binary can then be found in Multiprotocol/build/XXX.bin.
  7. Plug in the SD card of the remote controller or connect it to the computer using USB-C and start the controller as a storage device.
  8. Transfer the two firmware binaries to the firmware directory of the radio controller. Unplug the radio controller and install the EdgeTX/OpenTX binary as the radio firmware, and the Multiprotocol binary for the internal RF module.
  9. Create a new model and select the CFLIE protocol.

You should now be ready to fly! So turn on your Crazyflie and ensure that it’s on the address that you assigned in the CFLIE_initialize_rx_tx_addr() method in step 3. The radio should automatically find the correct channel, so you shouldn’t have to worry about selecting the right one.

Conclusions

I think the deck turned out really nice and it’s super cool to fly the Crazyflie in FPV! :) Some notes to consider:

  1. It’s possible to fly with the FPV deck with the normal motors and propellers of the Crazyflie, but with the thrust upgrade kit flying is easier and significantly more enjoyable, since you can go a lot faster.
  2. Ensure that the battery is fully charged before flying.
  3. There’s no support for On-Screen Display (OSD) on this deck, but it would be a cool thing to test in the future. I believe that most flight controllers that support onboard OSD use the MAX7456 or AT7456E chip, but there are probably more ways to do it.
  4. The hot glue loosens up slightly from the heat dissipation of the VTX. I added some extra glue and it seems to hold quite well, even after multiple crashes.
  5. There are modules that contain the camera and the VTX in the same package, which might be a good/better option for the Crazyflie than buying them separately and soldering them together.

Please let me know if you’ve found any mistakes in the text above or if you have any other cool ideas or hacks about FPV for the Crazyflie! :)

Cheers,
Victor

IROS in Kyoto is over and all Bitcrazers are finally back in Sweden again. We had a really good time in Japan and enjoyed all the interesting discussions we had with all of you, thanks!

In this blog post we will describe the demo we were running in the booth and talk a bit about all the cool tech that was used. If you want to reproduce it at home, or just take a look for inspiration, the code is available on GitHub in the iros-2022 branch of our experimental firmware repo. There is also a page on our website for IROS 2022 with some more information.

The demo has similarities with our previous demo (see IROS 2019), but has been upgraded to be a fully autonomous and decentralized swarm with 9 Crazyflies buzzing around in a cage, going back to charging pads for wireless charging when their batteries run low. The demo supports multiple Crazyflies flying at the same time, avoiding collisions without a central authority: all decision making is done in each Crazyflie, fully decentralized.

The hardware is off-the-shelf products available in our store (links here). The software is obviously written specifically for the demo, but we wanted to use the building blocks already available in the system, so the demo code is mainly “glue” to connect them.

The cage/flying space

The flying space was box-shaped, 3×2 meters in footprint and 2.5 meters high. We enclosed it in our lightweight travel cage, made from aluminium pipes and a light net. It is a pretty small space to fly multiple Crazyflies in at the same time, but it worked! The main problems with such a small space are down-wash from other Crazyflies and having enough room to avoid collisions. 3 Crazyflies worked pretty well; had the space been larger, it would have been possible to fly all nine.

Localization

Localization was handled by the Lighthouse positioning system. We used two base stations and the Lighthouse deck on each Crazyflie, which provides the Crazyflies with their current position with high accuracy.

Since the position is computed in the Crazyflie, using only data from onboard sensors, no external communication is needed for the localization system. The only exception was that we uploaded the physical geometry of the system when setting up the cage.

Path planning

When a Crazyflie is flying in the demo, the standard mode of operation is to fly a randomized pattern of straight lines. From time to time (randomized), the Crazyflie can also choose to fly the spiral that we have used in earlier demos (see the IROS 2019 demo, for instance).

When the battery is running out, the Crazyflie goes back to the charging pad to charge. The position is sampled before take-off, and this coordinate is used as the landing point to find the charging pad. Once landed, the Crazyflie verifies that the battery is being charged. If the battery is not charging, the Crazyflie assumes it missed the charging pad and takes off again to adjust its position.
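The retry logic is straightforward; here is a Python-flavored sketch of the idea (the real demo implements this as a state machine in the onboard C app, and the `cf` handle with its methods is hypothetical):

```python
def land_and_verify(cf, pad_position, max_attempts=3):
    """Land on the remembered pad position and verify that charging started."""
    for attempt in range(max_attempts):
        cf.goto(pad_position)   # fly back to where we took off from
        cf.land()
        if cf.is_charging():    # is current flowing in through the Qi deck?
            return True
        # We probably missed the pad: take off again and retry the approach
        cf.take_off()
    return False
```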

Charging

The Crazyflies were equipped with the Qi charging deck for wireless charging. The charging pads are 3D-printed with a slope, so the Crazyflie slides into position even if the landing is not perfect. In the center of each pad, a standard Qi charger from IKEA is mounted to provide power.

To fly continuously, the system’s charging rate must be higher than what is consumed by the flying Crazyflies. With a system of nine Crazyflies charging through Qi chargers, it is possible to keep one Crazyflie flying – just. To get some margin we increased the charging speed a bit, the downside being that the Crazyflies get warm and the batteries wear out faster.
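A quick back-of-the-envelope check shows why the margin is so thin. All numbers below are illustrative assumptions, not measurements from the demo:

```python
# With one drone in the air at a time, each drone flies 1/9 of the time and
# charges 8/9 of the time, so a battery must refill within 8 flights' time.
flight_time_min = 6.0    # assumed flight time on one battery, LED ring on
charge_time_min = 45.0   # assumed Qi charge time for a 350 mAh battery
drones = 9

required_flight_min = charge_time_min / (drones - 1)
print(f"each flight must last at least {required_flight_min:.1f} min -> "
      f"{'sustainable' if flight_time_min >= required_flight_min else 'not sustainable'}")
```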

Collision avoidance

We use the built-in collision avoidance system contributed by James Alan Preiss at the University of Southern California. Thanks James, it works like a charm!

There is no planning ahead; instead, each Crazyflie must know where the other Crazyflies are located. Based on this information, they avoid each other and choose a new path to reach their target position. For this to work, each Crazyflie continuously broadcasts its position to the other Crazyflies using the peer-to-peer framework.

Swarm control and collaboration

As mentioned earlier, there is no central authority that decides which Crazyflie should take off or go to a specific position; instead, this functionality is handled in each Crazyflie. To give each Crazyflie a rough idea of the system state, each one broadcasts its position and state (landed, flying, etc.) to the others. If a Crazyflie realizes that too few drones are flying, it simply takes off to fix the problem; if it sees that too many are flying, it goes back to its charging pad. To avoid all Crazyflies taking off or landing at the same time, a randomized hold-back time is used before the action is executed. This does not fully prevent two individuals from taking off at the same time, but it makes it less likely, and eventually the correct number of drones will be flying.
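Sketched in Python (the actual demo is C firmware, so this only illustrates the rule), the hold-back logic looks something like this:

```python
import random
import time

def maybe_take_off(my_state, swarm_state, target_flying, take_off):
    """Decentralized take-off rule with a randomized hold-back (sketch).

    swarm_state is this Crazyflie's picture of its peers, built from the
    broadcast traffic; take_off is the action to run if we decide to go.
    """
    def num_flying():
        return sum(1 for s in swarm_state.values() if s == "flying")

    if my_state == "landed" and num_flying() < target_flying:
        time.sleep(random.uniform(0.0, 3.0))  # randomized hold-back
        # Re-check: a peer may have taken off while we were waiting
        if num_flying() < target_flying:
            take_off()

# Example: this Crazyflie sees two peers flying while the target is three
maybe_take_off("landed", {"cf2": "flying", "cf3": "flying"}, 3,
               take_off=lambda: print("taking off!"))
```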

The number of drones that should fly at the same time is a system-wide parameter that can be set from any peer in the system. To make sure they all agree on the value, a simple mechanism based on the age of the data is used: the value and its age are included in the broadcast data. When a Crazyflie receives the data, it compares the age of the received value with the age of the value it already has, and replaces its own copy only if the received one is younger.
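The merge rule itself is tiny; here it is sketched in Python (the field names are ours, not the firmware’s):

```python
from dataclasses import dataclass

@dataclass
class SharedParam:
    value: int    # desired number of flying Crazyflies
    age_ms: int   # time since the value was last set, carried in the broadcast

def merge(local: SharedParam, received: SharedParam) -> SharedParam:
    # Keep whichever copy is younger, i.e. was set more recently
    return received if received.age_ms < local.age_ms else local

print(merge(SharedParam(value=3, age_ms=12000), SharedParam(value=5, age_ms=800)))
# -> SharedParam(value=5, age_ms=800): the newly set value wins everywhere
```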

Sniffer

A tenth Crazyflie is used in the demo as a sniffer. It is essentially a non-flying member of the swarm that listens to the broadcast traffic and feeds data to a GUI that displays the state of the system. It can also be used to inject a new value for the desired number of flying Crazyflies.

Implementation and how to run it

The code is mainly implemented as an app in the Crazyflie firmware, using the app layer. The main part is a state machine that keeps track of what to do next, with some other modules handling communication and trajectories.

The code is available in the iros-2022 branch of the crazyflie-firmware-experimental repository, in the examples/demos/decentralized_swarm folder.

The examples/demos/decentralized_swarm/src/common_files/choose_app.h file controls whether the code is compiled for a swarm member or for the sniffer.

All Crazyflies should have the same radio channel and the same address, except for the last byte. Swarm members must use addresses ending in 01 to 09, while the sniffer must use the address ending in 00.
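Expressed as cflib-style URIs (the channel and base address below are just examples), the convention looks like this:

```python
# Only the last address byte differs between swarm members and the sniffer.
CHANNEL = 80
BASE = 0xE7E7E7E700

sniffer_uri = f"radio://0/{CHANNEL}/2M/{BASE:010X}"
member_uris = [f"radio://0/{CHANNEL}/2M/{BASE + i:010X}" for i in range(1, 10)]

print(sniffer_uri)      # radio://0/80/2M/E7E7E7E700
print(member_uris[0])   # radio://0/80/2M/E7E7E7E701
```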

The demo is based on the work that Marios did for a decentralized swarm this summer. Thanks Marios!

I have already talked about it here and there, but the day has finally come: the whole company is in Japan!
Kimberly travelled first, to account for jetlag, meet with some people, and attend ROSCon.

That was last week, and she got the opportunity to learn a lot, meet people from the ROS community, and give an exciting talk.

Kimberly’s talk at ROSCon (photo by Ramón Roche)
Happy to be in Japan (photo by Ramón Roche)

The rest of the company travelled last week, with all the needed equipment divided among our suitcases.

Our suitcases at the office, to gather the materials before going

We chose to rent a traditional machiya while there, where we can all stay together and enjoy life in the center of Kyoto.

Us chilling out in the Bitcraze mansion

Our first day here was meant to account for jetlag, but we managed to see some of the amazing sights of Kyoto – and to enjoy the most praised Japanese food, much appreciated after a long walk among the torii gates of the Fushimi Inari shrine.

Us after climbing to the top of Mt Inari – with the beautiful path of torii gates

But it was soon time to start working, and yesterday we worked really hard on setting up everything to have a nice demo at IROS.

After some head scratching, emergency taping and hacking, we managed to get the autonomous demo that Marios implemented last summer flying – just before the event hall closed. We also got time to explore the Kyoto International Conference Center, a beautiful venue with a Japanese garden and a futuristic look – as imagined in the 70s.

Some views from the Kyoto Conference Center

We have invited those of you attending IROS to come and see us for a tech meet-up. It’s today, and it would be a really nice opportunity for us to finally chat in person with our users! Since there are a lot of aerial-systems talks, we realize it may be difficult to come during the sessions, so the tech meet-up can begin during the break, at 15.40.

Next up this week is the safe nanocopter competition. Kimberly will actually deliver the prize for it; we can’t wait to see what this competition will show – and how fun it is to remote-control the Crazyflies that are at the University of Toronto Institute for Aerospace Studies!

Of course, we will share some news on social media – and we will have a blogpost in a few weeks to debrief on the whole trip.

As you’ll understand, maintaining the day-to-day of the company is a little trickier this week, but we are still monitoring email and GitHub Discussions, and shipping orders. You should just expect a longer processing time, as we’re quite busy – either at the booth or… at karaoke! (No, there will be no videos of us singing.)

As you probably noticed already, this summer I experimented with ROS2 and connected the Crazyflie with the Multi-ranger to several mapping and navigation nodes (see this and this blogpost). I started with an experimental repo on my personal GitHub account called crazyflie_ros2_experimental, where I managed to do some mapping and navigation already. In August we started porting most of this functionality to the Crazyswarm2 project, which is what this blogpost is mostly about.

Crazyswarm goes ROS2

Most of you are already familiar with Crazyswarm for ROS1, a project that Wolfgang Hönig and James Preiss have maintained since its creation in 2017 at the University of Southern California. Many have used and referred to this work since then: the paper has been cited more than 260 times, and of all the Crazyflie papers at the latest ICRA and IROS conferences, 50% used Crazyswarm as their communication middleware. If you haven’t heard about Crazyswarm yet, please check out the nice BAMdays talk Wolfgang gave last year.

Unfortunately, ROS1 will not be around forever: it will be phased out in 2025 and will not be supported for Ubuntu 22.04 and up. Therefore Wolfgang, now at the Intelligent Multi-robot Coordination Lab at TU Berlin, has already started the ROS2 port of Crazyswarm, namely Crazyswarm2. Here the same principles of the C++-based Crazyflie server and the python wrapper have been implemented, along with the simple position-based simulation and teleop nodes. Mind that the name Crazyswarm2 is just the project name, kept for historic reasons; the package itself can be used for individual Crazyflies as well. That is why the packages themselves will be called crazyflie_*.
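To give an idea of what the python wrapper enables, here is a minimal script modeled on the original Crazyswarm API; the Crazyswarm2 interface may differ in detail, so treat this as a sketch and check the project documentation:

```python
# Take off, hover and land with every connected Crazyflie (sketch).
from crazyflie_py import Crazyswarm  # Crazyswarm2's python package

def main():
    swarm = Crazyswarm()
    timeHelper = swarm.timeHelper

    swarm.allcfs.takeoff(targetHeight=0.5, duration=2.0)
    timeHelper.sleep(3.0)   # hover for a moment
    swarm.allcfs.land(targetHeight=0.02, duration=2.0)

if __name__ == "__main__":
    main()
```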

Porting the Summer Hack project to Crazyswarm2

The crazyflie_ros2_experimental repo was fun to hack around in, as it was (as the name suggests) experimental and I didn’t need to worry about releases, bug fixes, etc. The problem with developing only there, however, is that the further you go, the more work it becomes to make it official. That is when Wolfgang and I sat down and started talking about porting what I had done over the summer into Crazyswarm2. This is also a good opportunity to get more involved with the project, especially with so many Crazyfliers using ROS as well.

The first step was to write a second crazyflie_server node that relies on the python CFlib. This means that many of the variables I used to hardcode in the experimental node needed to be defined within the ROS2 parameter structure. The crazyflies.yaml file is where anything relevant for the server (like the URIs and parameters) needs to be defined. Both the C++ backend server and the CFlib backend server use the same parameters. The functionality of the two servers is also pretty similar, except that logging is only possible in the CFlib version and uploading/following trajectories is only possible in the C++ version. An overview will be provided soon on the Crazyswarm2 documentation website.
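As a rough idea, an entry in crazyflies.yaml might look like the sketch below; the field names here are illustrative, so check the Crazyswarm2 documentation for the actual schema:

```yaml
# Hypothetical sketch of a crazyflies.yaml entry
robots:
  cf1:
    enabled: true
    uri: radio://0/80/2M/E7E7E7E701
    initial_position: [0.0, 0.0, 0.0]
```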

The second step was to make the crazyflie_server (CFlib) node suitable for connecting to the external packages that I worked with during the hack project. To that end, there are some special logging modes that enable the server to output not only topics based on logging, but also Pose/Odometry/LaserScan messages along with transforms. This allowed the SLAM toolbox to use the data from the Crazyflie itself to create a map, of which you can see an example in this tutorial.

Moreover, for navigation it was important that incoming Twist messages, whether from a keyboard or from a navigation toolkit, are handled properly. Most of these packages assume a 2D non-holonomic robot, but a quadcopter like the Crazyflie needs to first take off, stay in the air and land. Therefore, in the examples, a separate node (vel_mux.py) was written that receives incoming Twist messages, first makes the Crazyflie take off via the high-level commander, and then keeps sending hover commands to keep it in the air until a land service is called.
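Here is a stripped-down sketch of that idea as an rclpy node. The topic name and the take-off call are placeholders, and the real vel_mux.py in the Crazyswarm2 examples is more complete:

```python
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist

class VelMuxSketch(Node):
    def __init__(self):
        super().__init__('vel_mux_sketch')
        self.create_subscription(Twist, 'cmd_vel', self.on_twist, 10)
        self.airborne = False
        self.latest = Twist()
        # Keep streaming setpoints even when no new Twist arrives; a hovering
        # Crazyflie needs a continuous stream of commands to stay in the air.
        self.create_timer(0.1, self.send_setpoint)

    def on_twist(self, msg):
        if not self.airborne:
            # Placeholder: here the real node calls the take-off service of
            # the crazyflie_server (high-level commander) and waits for it.
            self.get_logger().info('first Twist received: taking off')
            self.airborne = True
        self.latest = msg

    def send_setpoint(self):
        if self.airborne:
            # Placeholder: forward self.latest as a hover setpoint here.
            self.get_logger().debug(
                f'hover vx={self.latest.linear.x:.2f} vy={self.latest.linear.y:.2f}')

def main():
    rclpy.init()
    rclpy.spin(VelMuxSketch())

if __name__ == '__main__':
    main()
```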

What’s next?

As you probably noticed, the project is still under development, but it is now in a state where we feel comfortable presenting it at the upcoming ROSCon :) We also want to include a more official simulation package, especially now that the Crazyflie has recently become part of the official release of Webots 2022b; we are currently waiting for webots_ros2 to be released in the Ubuntu packages. Moreover, the idea is to provide multiple simulation backends so that, based on the requirements of the topic (swarms, vision-based, etc.), the user can select the simulation most useful for their situation. We would also like to even out the missing features (trajectory handling, logging) in the CFlib and C++ backends of the crazyflie_server, so that they can be used interchangeably. And I saw that the experimental simple mapper node has been featured on social media, so perhaps we should convert that to Crazyswarm2 as well :)

Once we have gotten most of the above-mentioned issues out of the way, it will be time to start discussing the official release of a ROS2 Crazyflie package, with its source code residing in the Crazyswarm2 repository. In the meantime, it would be awesome if anybody who is interested in ROS2, or who wants to upgrade their Crazyswarm(1) packages to ROS2 soon, gave the package a whirl. The more people that try it out and report bugs or propose fixes, the more stable it becomes and the closer it will come to an official release! Please join us and start any discussions on the Crazyswarm2 GitHub repository.

Last week we went on a nice trip to Delft, the Netherlands, to attend the International Micro Air Vehicle Conference and Competition (IMAV 2022), this time organized by the MAVLab of TU Delft. Barbara, Kristoffer and I (Kim) went there by train, in line with our CO2 policy, although the Dutch train strikes did make it a bit difficult for us! Luckily we all made it in one piece, and we had a great time, so we will tell you about our experiences… with a lot of videos!

First Conference day

For the conference days we were placed in the main aula building, so that everybody could drop by during the coffee breaks, right next to one of our collaborators, Matěj Karásek from Flapper Drones (also see this blog post)! In the big lecture hall, paper talks were going on, along with interesting keynote speeches by Yiannis Aloimonos from the University of Maryland and Antonio Franchi from the University of Twente.

In between the talks and coffee breaks, we took the opportunity to hack around with tiny demos, for which the IMAV competition is quite a good occasion. Here you see a video of 4 Crazyflies flying around a Flapper drone, with all platforms using the Lighthouse positioning system.

The Nanoquadcopter challenge

On the evening of the first day, the first competition took place, namely the nanoquadcopter challenge! In this challenge, the goal was to autonomously fly a Crazyflie with an AI deck and Flow deck as far as possible through an obstacle field. 8 teams participated, and although most did offboard processing of the AI deck’s camera stream, the PULP team (first place) and Equipe Skyrats (third place) did all the processing onboard. The most exciting run was by the brave CVAR-UPM team, who managed to pass through 4 gates while avoiding obstacles, for which they won a Special Achievement Award.

During the challenge, Barbara also gave a presentation about the Crazyflie, while Kristoffer built up the Lighthouse positioning system in the background, in a record-breaking 5 minutes, to show a little demo. After the challenge, there were bites and drinks where we could talk with all the participating teams.

Here is an overview video of the competition. There was also an excellent stream during the event: if you would like to see all the runs in detail, plus presentations by the teams, you have a full 3 hours of content, complemented by exciting commentary from Christophe De Wagter and Guido de Croon of the MAVLab. Thanks to all the teams for participating and giving such a nice show :)

The overview video
The full stream of the nanoquadcopter challenge

The Greenhouse Challenge

On Wednesday, we were brought to Tomatoworld, a special greenhouse for technology development in horticulture. This is where the Greenhouse challenge, the second indoor drone competition, took place. The teams had to use their drone of choice to navigate through rows of tomato plants and find the sick variant. Unfortunately, we could not be as up close and personal as with the nanoquadcopter challenge, but yet again there was a great streaming service available, so we were able to follow every step of the way, along with some great presentations by Flapper Drones and PATS drones, among others. By the latter we were actually challenged to an autonomous drone fight! Their PATS-x system is made to detect pest insects that are harmful to greenhouse crops, so they wanted to see if they could catch a Crazyflie. You can see in the video here that they managed to do just that, and although the Crazyflie survived, we are pretty sure that a real fly or moth wouldn’t have. Luckily it was a friendly match, so we all had fun!

PATS versus Crazyflie battle

Here is an overview of the Greenhouse challenge. At the end, you can also see a special demo by the PULP team, successfully trying out their obstacle-avoiding Crazyflie in between the tomato plants. Very impressive!

Last days and final notes

Due to the planned (but later cancelled) train strikes in the Netherlands, not all of us were able to attend the full event, unfortunately. In the end, Barbara and I were able to experience the outdoor challenge, where much bigger drones had to carry packages into a large field outdoors. I myself was able to catch the first part of the last conference day, which included a keynote by Richard Bomphrey (Royal Veterinary College), whose lab contributed to the mosquito-inspired Crazyflie flight paper published in Science 2 years ago.

We were happy to be at IMAV this year, which marks our first conference attendance as Bitcraze since the pandemic. It was quite amazing to see the teams trying to overcome the challenges of these competitions, especially the nanoquadcopter challenge. We would like to thank Guido de Croon and Christophe De Wagter of the MAVLab again for inviting us!

IMAV website: https://2022.imavs.org/

Crazyflie IMAV papers:

  • ‘Handling Pitch Variations for Visual Perception in MAVs: Synthetic Augmentation and State Fusion’ Cereda et al. (2022) [pdf]
  • ‘Seeing with sound; surface detection and avoidance by sensing self-generated noise’, Wilshin et al. (2022)