
This week we have a guest blog post from CollMot about their work to integrate the Crazyflie with Skybrush. We are happy that they have used the app API that we wrote about a couple of weeks ago, to implement the required firmware extensions!


Bitcraze and CollMot have joined forces to release an indoor drone show management solution using CollMot’s new Skybrush software and Crazyflie firmware and hardware.

CollMot is a drone show provider company from Hungary, founded by a team of researchers with decade-long expertise in drone swarm science. CollMot has been offering outdoor drone shows since 2015. Our new product, Skybrush, allows users to handle their own fleet-level drone missions, and drone shows in particular, as smoothly as possible. In joint development with the Bitcraze team, we are very excited to extend Skybrush to support indoor drone shows and other fleet missions using the Crazyflie system.

The swarm-oriented mindset guiding the integration process is scalability. This includes scalability of communication, error handling, reliability and logistics. Each of these aspects is detailed below through some examples of the challenges we needed to solve together. We hope that, besides an application-specific extension of the Crazyflie for entertainment purposes, the base system has also gained many new features during this great cooperative process. But let's dig into the tech details a bit more…

UWB in large spaces with many drones

We have set up a relatively large area (10x20x6 m) with the Loco Positioning System using 8 anchors in a more or less cubic arrangement. Using TWR mode for swarms was out of the question, as it requires each tag (drone) to communicate with the anchors individually, which does not scale with fleet size. Initial tests with the UWB system in TDoA2 mode were not very satisfying in terms of accuracy and reliability, but as we went deeper into the details we found the two main sources of the inaccuracies:

  1. Two of the anchors had been positioned on the vertical flat faces of some stairs, with a solid material connection between them that caused many reflections, so the relative distance measurement between these two anchors was bi-stable. Once we realized that, we raised them a bit and attached them to columns with an air gap in between, which solved the reflection issue.
  2. The outlier filter of the TDoA2 mode was not optimal: a single bad packet generated consecutive outliers that opened up the filter too quickly. This has since been fixed in the Crazyflie firmware, after a long and painful investigation that ended with changing a single number from 2 to 3. This is how the reward system works in software development :)

In the end, UWB did its job quite nicely in both TDoA2 and TDoA3 modes, with stable accuracy at the 10-20 cm level across such a large area, so we could move on to tuning the controller of the Crazyflie 2.1 a bit.

Crazyflies with Loco and LED decks

As we prepared the Crazyflie drones for shows, we had the Loco deck attached on top and the LED deck attached to the bottom of the drones, with an extra light bulb to spread the light smoothly. This setup resulted in a total weight of 37 g. The basic challenge with the controller was that this weight turned out to be too much for the Crazyflie 2.1 system. Hover was at around 60-70% throttle on average; furthermore, there was a substantial difference in the throttle levels needed for the individual motors (some in the 70-80% range). The tiny drones did a great job in horizontal motion, but as soon as they needed to go up or down with a vertical speed above around 0.5 m/s, one of their ESCs saturated, the system became unstable and the drone crashed. Interestingly enough, the crash always started with a wobble exactly along the X axis, which at first led us to think that there was an issue with the positioning system instead of the ESCs. There are two possible solutions to this major problem:

  1. use less payload, i.e. lighter drones
  2. use stronger motors

Partially as a consequence of these experiments, the Bitcraze team is now experimenting with new, stronger models that will also be optimized for show use cases. We can’t wait to test them!

Optimal controller for high speeds and accurate trajectory following

In general, we are not yet very satisfied with any of the implemented controllers when using the UWB system for the show use case. This use case is special, as trajectory following needs to be as accurate as possible both in space and time, to avoid collisions and to produce nice synchronized formations, while the maximal speed, both horizontal and vertical, has to be as high as possible to increase the wow effect for the audience.

  • The PID controller has no cutoffs on its outputs, and with the occasionally large positioning errors of the UWB system the controller outputs become far too large. If the gains are reduced, the motion becomes sluggish and the path is not followed accurately in time.
  • The Mellinger and INDI controllers work well only with positioning systems of much better accuracy.

We have stuck with the PID controller so far and added velocity feed-forward terms, cutoffs on the output and some nonlinearity for large errors, as sketched below. This helped a bit, but the solution is not fully satisfying. Hopefully these modifications can be included in the main firmware soon. However, a really good controller for UWB is still an open question; any suggestions are welcome!
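To give an idea of what these modifications look like, here is a minimal single-axis sketch. It is not the actual firmware code: all names and constants are illustrative, and the real controller works on all axes and is integrated with the existing firmware PID infrastructure.

```c
/* Illustrative single-axis position PID with the modifications described
 * above: velocity feed-forward, output cutoff and a soft nonlinearity
 * that attenuates very large (likely erroneous) position errors. */
typedef struct {
  float kp, ki, kd;    /* classic PID gains */
  float kff;           /* velocity feed-forward gain */
  float iState;        /* integrator state */
  float prevError;     /* previous error, for the derivative term */
  float outMax;        /* hard cutoff on the controller output */
  float errSoftLimit;  /* errors beyond this limit are attenuated */
} axisPid_t;

static float clampf(float v, float lo, float hi) {
  return (v < lo) ? lo : (v > hi) ? hi : v;
}

/* Attenuate very large errors so a single bad UWB sample cannot
 * produce a violent correction. */
static float softenError(float e, float limit) {
  if (e >  limit) return  limit + 0.25f * (e - limit);
  if (e < -limit) return -limit + 0.25f * (e + limit);
  return e;
}

float axisPidUpdate(axisPid_t* pid, float setpoint, float measured,
                    float setpointVelocity, float dt) {
  float error = softenError(setpoint - measured, pid->errSoftLimit);

  pid->iState += error * dt;
  float derivative = (error - pid->prevError) / dt;
  pid->prevError = error;

  float out = pid->kp * error
            + pid->ki * pid->iState
            + pid->kd * derivative
            + pid->kff * setpointVelocity;  /* feed-forward from the trajectory */

  return clampf(out, -pid->outMax, pid->outMax);  /* output cutoff */
}
```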

Show specific improvements in the firmware

We implemented code that uploads the show content to the drones smoothly, performs automatic preflight checks, displays status on the LED deck to give visual feedback for many drones simultaneously, starts the show on time in synchrony with all swarm members, and handles the execution of the light program and the show trajectory.

These modifications currently live in our own fork of the Crazyflie firmware and will soon be rewritten as a show app, thanks to this promising new possibility in the code framework; a skeleton of such an app is shown below. As soon as the Skybrush and Crazyflie systems are stable enough to be released together, we will publish the related app code, which helps automate show logistics for every user.
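For readers who have not looked at the app layer yet: an app is a piece of code compiled into the firmware that gets its own entry point running in a FreeRTOS task. The sketch below roughly follows the structure of the public hello-world app example; the show-specific logic is only indicated as placeholder comments, since our actual show app is not published yet.

```c
#include "app.h"       /* Crazyflie app layer entry point */

#include "FreeRTOS.h"
#include "task.h"

#define DEBUG_MODULE "SHOWAPP"
#include "debug.h"

/* Skeleton of a show app. The real show logic (trajectory upload,
 * preflight checks, LED feedback, synchronized start) is only
 * sketched as comments here. */
void appMain() {
  DEBUG_PRINT("Show app started\n");

  /* 1. Wait for the show trajectory and light program to be uploaded. */
  /* 2. Run automatic preflight checks and report status on the LED deck. */
  /* 3. Wait for the synchronized start time. */
  /* 4. Execute the trajectory and the light program. */

  while (1) {
    vTaskDelay(M2T(100)); /* main loop tick, 100 ms */
  }
}
```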

Summary

To sum it up, we are very enthusiastic about the Crazyflie system and the great team behind the scenes, with their very friendly, open and cooperative support. The current state of the Crazyflie + Skybrush integration is as follows:

  • new hardware iterations based on the Bolt system, supporting longer and more dynamic flights, are on their way;
  • a very stable, UWB-compatible controller is still an open question, but the current possibilities are satisfactory for initial tests with light flight dynamics;
  • a new Crazyflie app for the drone show use case is basically ready to be launched together with the release of Skybrush in the near future.

If you are interested in Skybrush or have any questions related to this integration process, drop us an email or comment below.

Only a week left until we stand in our ICRA booth in Montreal and give you a glimpse of what we do here at Bitcraze. As we have written about earlier, we are aiming to run a fully automated demo. We have been fine-tuning it over the last couple of days, and if nothing unpredictable breaks it, we think it is going to be very enjoyable. For those interested in the juicy details, check out this informative ICRA 2019 page, but if you are going to visit, maybe wait a bit so you don’t get spoiled.

Apart from the demo we are also going to show our products as well as some new things we are working on. The brand new things include:

AI-deck, Active marker deck and Lighthouse-4 deck
  • AI-deck: This is a collaborative product between GreenWaves Technologies, ETH Zurich and Bitcraze. It is based on the PULP-shield that the Integrated and System Laboratory has designed. You can read more about it in this blog post. The difference compared to the PULP-shield is that we have added an ESP32, the NINA-W102 module, so that video can be streamed over WiFi. We hope this will ease development and enable more use cases.
  • Active marker deck: Another collaboration, but this time with Qualisys. This will make tracking with their motion capture cameras easier and better. There are some more details in this blog post. Qualisys will have the booth just next to ours, where it will be possible to see it in a live demo!
  • Lighthouse-4 deck: Using the Vive lighthouse positioning system, this deck adds sub-millimeter precision to the Crazyflie. This is the deck used in the demo and it could become the star of the show.

Adding to the above we will of course also display our recently released products:

  • Crazyflie 2.1: The Crazyflie 2.1 is an improvement of the Crazyflie 2.0 that keeps backwards compatibility.
    • Better radio performance and external antenna support: With a new radio power amplifier we’ve improved the link quality and added support for dual antennas (on-board chip antenna and external antenna via u.FL connector)
    • Better power button: We’ve gotten feedback that the power button breaks too easily, so we’ve replaced it with a more solid alternative.
    • Improved battery cable fastening: To avoid the cables weakening over time, they are now run through a strain relief.
    • Improved sensors: To improve flight performance we’ve switched out the IMU and pressure sensor. The new Crazyflie uses the drone-specialized sensor combo BMI088 and BMP388 by Bosch Sensortec.
  • Flow deck v2: The Flow deck v2 has been upgraded with the new ST VL53L1x, which increases the range to up to 4 meters.
  • Z-ranger deck v2: The Z-ranger v2 deck has been upgraded with the new ST VL53L1x, which increases the range to up to 4 meters.
  • Multi-ranger deck: The Multi-ranger deck adds VL53L1x sensors in all directions for mapping and obstacle avoidance.
  • MoCap marker deck: The motion capture deck with support for easy attachment of passive markers for motion capture camera tracking.
  • Roadrunner: The Roadrunner is released as early access. The hardware is basically a Crazyflie 2.1 without motors and with up to 12 V input power, which enables other robots or systems to use the Loco positioning system.

You can find us in booth 101 at ICRA 2019 (in Montreal, Canada), May 20 – 22. Drop by and say hi, check out the products & demo and tell us what you are working on. We love to hear about all the interesting projects that are going on. See you there!

Hi everyone, here at the Integrated and System Laboratory of the ETH Zürich, we have been working on an exciting project: PULP-DroNet.
Our vision is to enable artificial intelligence-based autonomous navigation on small size flying robots, like the Crazyflie 2.0 (CF) nano-drone.
In this post, we will give you the basic ideas behind making the CF fly fully autonomously, relying only on onboard computational resources: that means no human operator, no ad-hoc external signals, and no remote base-station!
Our prototype can follow a street or a corridor and at the same time avoid collisions with unexpected obstacles even when flying at high speed.


PULP-DroNet is based on the Parallel Ultra Low Power (PULP) project envisioned by the ETH Zürich and the University of Bologna.
In the PULP project, we aim to develop an open-source, scalable hardware and software platform to enable energy-efficient complex computation where the available power envelope is only a few milliwatts, such as advanced Internet-of-Things nodes, smart sensors — and of course, nano-UAVs. In particular, we address the computational demands of applications that require flexible and advanced processing of data streams generated by sensors such as cameras, which is beyond the capabilities of typical microcontrollers. The PULP project has its roots in the RISC-V instruction set architecture, an innovative, open-source academic and research architecture that is an alternative to ARM.

The first step to make the CF autonomous was the design and development of what we called the PULP-Shield, a small form factor pluggable deck for the CF, featuring two off-chip memories (Flash and RAM), a QVGA ultra-low-power grey-scale camera and the PULP GAP8 System-on-Chip (SoC). The GAP8, produced by GreenWaves Technologies, is the first commercially available embodiment of our PULP vision. This SoC features nine general purpose RISC-V-based cores organised in an on-chip microcontroller (1 core, called Fabric Ctrl) and a cluster accelerator of 8 cores, with 64 kB of local L1 memory accessible at high bandwidth from the cluster cores. The SoC also hosts 512kB of L2 memory.

Then, as the algorithmic heart of our autonomous navigation engine, we selected an advanced artificial intelligence algorithm based on DroNet, a Convolutional Neural Network (CNN) that was originally developed by our friends at the Robotics and Perception Group (RPG) of the University of Zürich.
To enable the execution of DroNet on our resource-constrained system, we developed a complete methodology to map computationally-intense deep neural networks on the PULP-Shield and the GAP8 SoC.
The network outputs two pieces of information, a probability of collision and a steering angle, which are translated into the quantities used to control the drone: forward velocity and angular yaw rate, respectively. The layout of the network is the following:
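As a rough illustration of how those two outputs can be turned into smooth flight commands, here is a hedged sketch; the constants, the low-pass filtering and the names are illustrative only and are not the exact values used in PULP-DroNet.

```c
/* Illustrative mapping from the two CNN outputs to drone commands.
 * Constants and the low-pass filtering are placeholders. */
typedef struct {
  float forwardVelocity;  /* m/s */
  float yawRate;          /* rad/s */
} droneCommand_t;

#define V_MAX     1.5f  /* maximum forward speed (illustrative) */
#define YAW_GAIN  1.0f  /* steering-angle to yaw-rate gain (illustrative) */
#define ALPHA     0.3f  /* low-pass filter coefficient */

void dronetOutputsToCommand(float collisionProb, float steeringAngle,
                            droneCommand_t* cmd) {
  /* Slow down as the predicted probability of collision grows. */
  float vDesired = (1.0f - collisionProb) * V_MAX;
  /* Turn in the direction suggested by the steering output. */
  float yawDesired = YAW_GAIN * steeringAngle;

  /* Low-pass filter both commands to keep the flight smooth
   * between consecutive CNN inferences. */
  cmd->forwardVelocity = (1.0f - ALPHA) * cmd->forwardVelocity + ALPHA * vDesired;
  cmd->yawRate         = (1.0f - ALPHA) * cmd->yawRate         + ALPHA * yawDesired;
}
```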

Therefore, our mission was to deploy all the required computation onboard our PULP-Shield mounted on the CF, enabling fully autonomous navigation. To put the problem into perspective, in the original work by the RPG, the DroNet CNN enabled autonomous navigation of big-size drones (e.g., the Parrot Bebop). In that use case, computational power and memory were not a problem, thanks to the streaming of images to a remote base-station, typically a laptop consuming 30-100 W or more. So our mission required running a similar workload within 1/1000 of the original power.
To make this work, we combined fixed-point arithmetic (instead of “traditional” floating point), some minimal modifications to the original topology, and optimised memory and computation usage. This allowed us to squeeze DroNet into the ultra-small power budget available onboard. Our most energy-efficient configuration delivers 6 frames per second (fps) within only 64 mW (including all the electronics on the PULP-Shield), and when we push the PULP platform to its limit, we achieve an impressive 18 fps within just 3.5% of the total CF’s power envelope — the original DroNet was running at 20 fps on an Intel i7.
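As a tiny, generic illustration of what switching to fixed-point arithmetic means (this is not code from the PULP-DroNet release), a multiplication of two Q15 fixed-point values can be written like this:

```c
#include <stdint.h>

/* Generic Q15 fixed-point multiply, for illustration only:
 * a Q15 value stores x as round(x * 2^15) in a signed 16-bit integer,
 * so products fit in 32 bits and only integer operations are needed. */
typedef int16_t q15_t;

static inline q15_t q15_mul(q15_t a, q15_t b) {
  int32_t p = (int32_t)a * (int32_t)b;  /* Q30 intermediate result */
  p >>= 15;                             /* back to Q15 */
  if (p >  32767) p =  32767;           /* saturate to the int16 range */
  if (p < -32768) p = -32768;
  return (q15_t)p;
}
```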

Do you want to check it out for yourself? All our hardware and software designs, including our code, schematics, datasets, and trained networks, have been released and made available for everyone as open source and open hardware on GitHub. We look forward to other enthusiasts’ contributions, both in hardware enhancements and in software (e.g., smarter networks), to create a great community of people interested in working together on smart nano-drones.
Last but not least, the piece of information you have all been waiting for. Yes, soon Bitcraze will allow you to enjoy our PULP-Shield; actually, even better, you will get to play with its evolution! Stay tuned, as more information about the “code-name” AI-deck will be released in upcoming posts :-).

If you want to know more about our work:

Questions? Drop us an email (dpalossi at iis.ee.ethz.ch and fconti at iis.ee.ethz.ch)

 


The Maker Faire Berlin is coming up and we are starting to get ready for showtime!

The last couple of weeks have been really busy getting ready for the Maker Faire Berlin. The plan is to show multiple Crazyflies flying autonomously, enabled by the Loco positioning system. To spice up the experience of autonomous flight and to inspire the visitors to imagine future applications, we have made a small light and sound show where the Crazyflie dances to a soundtrack Kristoffer made.

Here is a teaser where we are maybe stretching the limits a bit too far ;-):

Taking the opportunity to exhibit what we do at events like the Maker Faire Berlin is really exciting and we are looking forward to hanging out with cool people and getting feedback about what we do.

So come and visit us at Maker Faire Berlin, Sept 30 to Oct 2 at Station Berlin. You will find us in hall 3, stand 149.

See you there!


Maker Faire Berlin 2016 is coming up and we will be there to show off the latest and greatest from our lab. The plan is to show:

  • the Loco Positioning system and autonomous flight
  • a (small) swarm – we hope to get more than one Crazyflie in the air at the same time, even though the space is very limited
  • there are ideas for some sort of synchronized sound and light show with an autonomous Crazyflie, all controlled from a MIDI sequencer
  • a virtual cage to stop a Crazyflie from escaping when flown manually
  • a preview of the Arduino-based controller to fly the Crazyflie
  • a Loco Positioning tag that can be used to track devices or robots other than the Crazyflie
  • ROS integration with the Crazyflie and Loco Positioning system

We’re not sure we will succeed with them all, but let’s hope for the best. Is there anything special you are interested in? Please let us know!

Maker Faire Berlin is Sept 30 to Oct 2 at Station Berlin. You will find us in hall 3, stand 149.

We love your feedback and are looking forward to meeting you all and talking about your projects, what we do and future ideas.

See you in Berlin!