Category: Software

I’ve spent the last 5 years of my career at Microsoft on the team responsible for HoloLens and Windows Mixed Reality VR headsets. Typically, augmented reality applications deal with creating and manipulating digital content in the context of real-world surroundings. I thought it’d be interesting to explore some applications of using an augmented reality device to manipulate and control physical objects and have them interact with the real world and/or digital content.

Phase 1: Gesture Input

The HoloLens SDK has APIs for consuming hand gestures as input. For the first phase of this project, I modified the existing Windows UAP/UWP client to handle these gestures and convert them to CRTP setpoints. I used the “manipulation gesture” which provides offsets in three dimensions for a tap-and-drag gesture, from the point in space where the initial tap occurred. These three degrees of freedom are mapped to thrust, pitch and roll.
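To make the mapping concrete, here is a minimal sketch in Python using cflib rather than the C# UWP client; the gains, clamping values and radio URI are illustrative assumptions, not the actual client code:

    import cflib.crtp
    from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

    def manipulation_to_setpoint(dx, dy, dz):
        # Map manipulation-gesture offsets (meters from the initial tap)
        # to a (roll, pitch, yawrate, thrust) setpoint. Gains are made up.
        roll = max(-20.0, min(20.0, dx * 40.0))      # degrees
        pitch = max(-20.0, min(20.0, -dz * 40.0))    # degrees
        thrust = int(max(0, min(65535, 38000 + dy * 30000)))
        return roll, pitch, 0.0, thrust              # no yaw control yet

    cflib.crtp.init_drivers()
    with SyncCrazyflie('radio://0/80/2M') as scf:
        scf.cf.commander.send_setpoint(0, 0, 0, 0)   # unlock thrust protection
        r, p, y, t = manipulation_to_setpoint(0.05, 0.10, 0.0)
        scf.cf.commander.send_setpoint(r, p, y, t)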

For the curious, there’s an article on my website with details about the implementation and source code. Here’s a YouTube video where I explain the concept and show a couple of quick demos.

As you can see in the first demo in the video, this works but isn’t entirely useful or practical. The HoloLens accounts for head movements (otherwise moving the head to the left would produce the same offset as moving the hand to the right, requiring the user to keep his or her head very still) but the user must still take care to keep the hand in the field of view of the device’s cameras. Once the gesture is released (or the hand goes out of view) the failsafe engages and the Crazyflie drops to the ground. And of course, lack of yaw control cripples the ability to control the Crazyflie.

Phase 2: Position Hold

Adding a flow deck makes for a more compelling user experience, as seen in the second demo in the video above. The Crazyflie uses the sensors on the flow deck to hold its position. With this functionality, the user is free to move about the room and make shorter “adjustment” hand gestures, instead of needing to hold very still. In this mode, the gesture’s degrees of freedom map to an x/y velocity and a vertical offset from the current z-depth.
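In cflib terms, this phase corresponds to the hover setpoint, which takes lateral velocities and an absolute height. A hedged sketch with invented gains, reusing the scf connection from the sketch above (again, not the actual UWP code):

    def gesture_to_hover_setpoint(dx, dy, dz, current_z):
        # Lateral gesture offsets become velocities; the vertical offset
        # nudges the height target away from the current z-depth.
        vx = -dz * 2.0               # m/s
        vy = -dx * 2.0               # m/s
        z_target = current_z + dy    # meters
        return vx, vy, 0.0, z_target

    vx, vy, yawrate, z = gesture_to_hover_setpoint(0.0, 0.1, -0.05, 0.4)
    scf.cf.commander.send_hover_setpoint(vx, vy, yawrate, z)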

This is a step in the right direction, but it still has limitations. The HoloLens doesn’t know where it is in space relative to the Crazyflie. A gesture in the y axis relative to the device will always result in a movement in the y direction of the Crazyflie, which begins to feel unnatural if the user moves around. Ideally, gestures would cause the Crazyflie to move in the same direction relative to the user, not relative to the ‘front’ of the Crazyflie. Also, there’s still no control over yaw.

The flow deck has some limitations as well: the z-range only goes to 2 meters with any accuracy, and the flow sensor (for lateral stabilization) has a strong dependency on the pattern of the floor below. A flow sensor is a camera that relies on measuring pixel deltas from frame to frame, so if the floor is blank or has a repeating pattern, it can be difficult to hold position properly.
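The geometry behind that limitation is simple: the velocity estimate scales pixel motion by height above the ground, roughly as in this simplified model (which ignores the gyro-based rotation compensation the real sensor performs):

    def flow_to_velocity(dpixels, dt, height_m, focal_px):
        # Apparent pixel motion between frames scales with height above
        # the surface, so poor pixel tracking directly corrupts velocity.
        return (dpixels / focal_px) * height_m / dt

    # Example: 12 pixels of motion in 10 ms at 1 m height, ~400 px focal length
    v = flow_to_velocity(12, 0.010, 1.0, 400.0)  # ~3 m/s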

Despite these limitations, using hand gestures to control the Crazyflie with a flow deck installed is actually quite fun and surprisingly easy.

Phase 3 and Beyond: Future Work & Ideas

I’m currently working on some new features that I hope will open the door for more interesting applications. All of what follows is a work in progress, and not yet implemented or functional. Dream with me!

Shared Coordinate System

The next phase (currently a work in progress) is to get the HoloLens and the Crazyflie into a shared coordinate system. Having spatial awareness between the HoloLens and the Crazyflie opens up some very exciting scenarios:

  • The orientation problem could be improved: transforms could be applied to gestures so that the Crazyflie responds to commands in the user’s frame of reference (‘pushing’ away from one’s self would cause the Crazyflie to fly away from the user, instead of in whatever direction is ‘forward’ from the Crazyflie’s perspective); see the sketch after this list.
  • A ‘follow me’ mode, where the Crazyflie autonomously follows behind a user as he or she moves throughout the space.
  • Ability to walk around and manually set waypoints by selecting points of interest in the environment.
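The first item boils down to a single rotation once the shared coordinate system exists. A minimal sketch, assuming user_yaw is the HoloLens heading in that shared frame:

    import math

    def user_frame_to_world(dx, dy, user_yaw):
        # Rotate a gesture offset from the user's frame of reference into
        # the shared world frame, so 'push away' always moves the
        # Crazyflie away from the user. user_yaw is in radians.
        wx = dx * math.cos(user_yaw) - dy * math.sin(user_yaw)
        wy = dx * math.sin(user_yaw) + dy * math.cos(user_yaw)
        return wx, wy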

The Loco Positioning System is a natural fit here. A setup step (where a spatial anchor or similar is established at the same physical position and orientation as the LPS origin) and a simple transform for scale and orientation (the HoloLens and the Crazyflie define X, Y and Z differently) would allow the HoloLens and the Crazyflie to operate in a shared coordinate system. One could also use the webcam on the HoloLens along with computer vision techniques to track the Crazyflie, but that would require constant line of sight from the HoloLens to the Crazyflie.
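To illustrate the orientation part of that transform: Unity (and therefore HoloLens apps) uses a left-handed, Y-up coordinate system, while the Crazyflie and the LPS use a right-handed, Z-up one, so the conversion is an axis swap with one sign flip, plus a scale factor. The exact axis pairing below is an assumption; in practice it depends on how the spatial anchor is oriented during setup:

    def unity_to_lps(p_unity, scale=1.0):
        # Left-handed Y-up (Unity/HoloLens) to right-handed Z-up (LPS).
        # Assumes the setup step aligned Unity's Z axis with the LPS X axis.
        ux, uy, uz = p_unity
        return (uz * scale, -ux * scale, uy * scale)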

Obstacle Detection/Avoidance

Example surface map produced by HoloLens

The next step after establishing a shared coordinate system is to use the HoloLens for obstacle detection and avoidance. The HoloLens has the ability to map surfaces in real time and position itself in that map (SLAM). Logic could be added to the HoloLens to consume this surface map and adjust pathing/setpoints to avoid obstacles, without spending any of the limited compute/power budget of the Crazyflie itself.

Swarm Control and Manipulation

As a simple extension of the shared coordinate system (and what Bitcraze has been doing with TDoA and swarming lately) the HoloLens could be used to manipulate individual Crazyflies within a swarm through raycasting (the same technique used to gaze at, select and move specific holograms in the digital domain). Or perhaps a swarm could be controlled to move out of the way as a user passes through the swarm, and return to formation afterward.

Augmenting with Digital Content

All scenarios discussed thus far have dealt with using the HoloLens as an input and localization device, but its primary job is to project digital content into the real world. I can think of applications such as:

  • Games
    • Flying around through a digital obstacle course
    • First person shooter or Space Invaders type game (the Crazyflie moves around to avoid the user or fires rendered laser pulses at the user, etc.)
  • Diagnostic/development tools
    • Overlaying some diagnostic information (such as battery life) above the Crazyflie (or each Crazyflie in a swarm)
    • Set or visualize/verify the position of the LPS nodes in space
    • Visualize the position of the Crazyflie as reported by LPS, to observe error or drift in real time

Conclusion

There’s no shortage of interesting applications related to blending augmented reality with the Crazyflie, but there’s quite a bit of work ahead to get there. Keep an eye on the Bitcraze blog or the forums for updates and news on this effort.

I’d love to hear what ideas you have for combining augmented reality devices with physical devices like the Crazyflie. Leave a comment with thoughts, suggestions, or any other relevant work!

We have been flying swarms in our office plenty of times, but there is kind of a limitation to this: our flying space is only around 4 x 4 meters. Flying 8 – 10 Crazyflies in this space is challenging and it is hard to make it look good, as slight position inaccuracies make it look a bit sloppy. To mitigate this we decided to put on a small swarm show using a bigger flying space, and to invite families and friends, just to raise the stakes a bit.

As usual we had limited time to accomplish this, and this time the result should be worth looking at. Well, we have managed to pull off hard things in one day before, so why not this time… The setup is basically a swarm bundle with added LED-rings. Kristoffer took care of the choreography, Tobias set up the drones and Arnaud configured the Loco Positioning system.

Choreography

Kristoffer’s pre-Bitcraze history involves some dancing, and he had already been playing with the idea of creating choreographies with Crazyflies. One part of this was a weekend hack a few months back when he tried to write a swarm sequencer that is a bit more dance oriented. The goal was to be able to run a sequence synchronized to music and define the movements in terms of bars and beats rather than seconds. He also wanted to be able to define a motion to end at a specific position on a beat, as opposed to starting on the beat. As Kristoffer did not have access to a swarm when he wrote the code, he also added a simple simulator to visualize the swarm. The hack was not a complete success at the time, but it turned out to be useful in this case.

The sequences are defined in a YML file as a list of time stamps, positions and, if needed, the color of the LED-ring. After a few hours of work he had at least some sort of choreography with 9 Crazyflies moving around; maybe not a masterpiece from a dance point of view, but time was running out.
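For a feel of the format, an entry in such a file might look like the snippet below. The field names are hypothetical, invented for illustration, not taken from Kristoffer’s actual sequencer:

    # Hypothetical sequence entries, for illustration only
    - time: 16.0                  # in beats (bar 4, beat 1 in 4/4)
      position: [1.0, 0.5, 1.2]   # x, y, z in meters
      color: [255, 0, 128]        # LED-ring RGB
    - time: 18.0
      position: [1.0, -0.5, 1.2]
      color: [0, 0, 255]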

The simulator is super basic but turned out to be very useful anyway (the color of the crosses indicates the color of the LED ring). We actually never flew the full sequence with all drones before the performance, but trusted the simulation to be accurate enough! We did fly most of the sequence with one Crazyflie, to at least make it plausible that we got it all right.

Short snippet from the simulation

Setting up drones

Handling swarms can be tedious and time consuming. Just making sure all drones are assembled, fully operational and charged is a challenge as the numbers increase. Tobias decided to do a manual flight test of every drone: if it flies well manually, it will most likely fly well autonomously. The testing resulted in switching out some motors and props, as vibrations are a crippling factor, especially for Z accuracy. The takeaway from this exercise is to implement better self-testing, so that these problems can be detected automatically and fixed much quicker.

Loco Positioning System

We ran the positioning system with standard firmware in TDoA mode to support multiple Crazyflies simultaneously. The mapped space was around 7 x 5 x 2.5 meters and the anchors were placed more or less in the corners of the flying space box.

The result

The audience (families and friends) was enthusiastic and expectations high! Even though not all drones made it all the way through the show, the spectators seemed to be duly impressed and requested a re-run.

This is a guest blog post written by Fred, the maintainer of the Android Crazyflie client and Java Crazyflie lib.

As a follow-up to last week’s blog post about the different Crazyflie clients, I would like to describe the current status of the Android client in a bit more detail.

After more than a year, version 0.7.0 of the Android client was released last Friday (March 16). Most importantly, this release fixes a very annoying UI bug that appeared on newer Android versions, where the on-screen joystick did not show up when the app was started for the first time. It also adds support for height hold mode when using the Z-ranger or Flow deck, adds a console view (which can be enabled in Settings -> App settings -> Show console) and now shows the link quality for BLE connections. You can read the full changelog on GitHub. You can find the release in the Google Play Store and as an APK on GitHub.

Connection quality and console messages now work on a BLE connection

Apart from the obvious/visible new features and bug fixes, quite a lot has happened “under the hood”. Some parts of the code were cleaned up, refactored, decoupled, etc. This is still a work in progress though.

There is still plenty of stuff to do for future releases, especially in the realm of Bluetooth support. On the short list are:

  • Param/Log support for BLE connections
  • Bootloading over BLE
  • Support for Flow deck sequences

Admittedly, there was almost no documentation for the Android client and some features are hidden (too well). In an effort to change that, I’ve started to document some features on the project’s GitHub wiki.

If you find bugs in the app, want to request a feature or see errors in the documentation, please open a GitHub issue.

If you are interested in the development of the Crazyflie Android client and want to get involved, let me know. The fastest way to get new features added or bugs fixed is to contribute a pull request.

Last but not least, I’d like to thank the Bitcraze team for creating and developing the Crazyflie and amazing new decks. Maintaining the Crazyflie Android client is still one of my favorite pastimes. :)

We thought we could use this Monday blog post to give a small state of the Crazyflie clients. What we call a Crazyflie client is a piece of software that connects to a Crazyflie and allows you to control it and get telemetry back from it. In this post we will concentrate on the single-Crazyflie clients we have on our GitHub page; there exist a lot of libraries and programs to control one or many Crazyflies, and we will write another blog post about them.

Crazyflie PC client

The Crazyflie PC client is what we consider the reference client. It supports connecting to one Crazyflie using the Crazyradio (PA) dongle or a direct USB connection to the Crazyflie 2.0. It supports the full Crazyflie telemetry (i.e. log), parameters (i.e. params) and firmware update. It has support for all the Crazyflie 2.0 decks that require client support. It is updated whenever new functionality is added to the Crazyflie, which makes it actively developed and maintained by the community and Bitcraze. A Bluetooth link has not been prioritized so far, since its multi-platform implementation is non-obvious and Bluetooth introduces some latency and lowers the radio bandwidth compared to the Crazyradio. However, if anyone wants Bluetooth support in the Crazyflie PC client, we welcome contributions :-). The Crazyflie PC client uses crazyflie-lib-python to communicate with the Crazyflie.
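As a taste of what the client does under the hood, here is a minimal crazyflie-lib-python sketch that connects over a Crazyradio, writes a parameter and reads telemetry; the URI and the variable names are examples that depend on your setup:

    import cflib.crtp
    from cflib.crazyflie.log import LogConfig
    from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
    from cflib.crazyflie.syncLogger import SyncLogger

    cflib.crtp.init_drivers()

    log_conf = LogConfig(name='Battery', period_in_ms=100)
    log_conf.add_variable('pm.vbat', 'float')        # battery voltage

    with SyncCrazyflie('radio://0/80/2M') as scf:
        scf.cf.param.set_value('ring.effect', '7')   # params: LED-ring effect
        with SyncLogger(scf, log_conf) as logger:    # log: telemetry stream
            for timestamp, data, logconf in logger:
                print(timestamp, data['pm.vbat'])
                break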

We have three mobile clients on our GitHub. They have various levels of functionality depending on community involvement. Our philosophy is to have the mobile clients at least able to control a Crazyflie; this makes it possible to test Crazyflies without having to set up a computer. We will help and support anyone who is interested in adding functionality to the mobile clients, but we generally do not have time to add much functionality ourselves.

The Android Crazyflie client is currently maintained by Fred from the community. It is the mobile Crazyflie client with the most features. It supports both the Crazyradio and Bluetooth links. Using the Crazyradio, it currently supports the parts of telemetry and parameters required to support a couple of decks, like the LED-ring and buzzer decks, and it supports updating the firmware. Using Bluetooth there is currently no telemetry, parameter or firmware update functionality, and so no deck support. Development is in progress to support more decks and to bring the Bluetooth link to the same level of functionality as the Crazyradio link. The Android client is written in Java, and Fred has developed a Crazyflie Java library that is used in the Android client but can also be used in any other Java program.


Crazyflie Android client

The iOS Crazyflie client works on iPhone and iPad. It supports the Bluetooth link. It does not have any telemetry or parameter support, so no deck control support, but it does support firmware update over Bluetooth. It has mainly been developed by me, with great contributions from the community for, among other things, the port to Swift, the language the client is written in. The Crazyflie and Bluetooth parts of the code could be a good starting point if anyone wanted to make a native Mac Crazyflie client.

Crazyflie iOS client

Finally, we have a prototype of a Windows UWP client developed by theseankelly. It supports Bluetooth Low Energy. It currently does not support any telemetry or parameters. It works both on Windows Phone and on Windows 10 computers, and it is currently the only way to connect to a Crazyflie using Bluetooth from a laptop. The Windows client supports manual control of the Crazyflie using a gamepad, or with gestures using a HoloLens. This original set of functionality makes it both the simplest and the most advanced Crazyflie client :-).

If you are interested in developing for any of these clients, or in making your own, feel free to make a ticket on the relevant GitHub repo or open a thread in the forum. We might not have much time to develop the mobile clients ourselves, but we will always be glad to help and guide anyone who wants to implement software related to the Crazyflie. The Crazyflie clients (running on a computer or phone) and the Crazyflie firmwares (running in the Crazyflie itself) are open source and in active development, which means it is possible to modify both sides; this makes the platform a great target for experimenting and playing around with new ideas :-).


Qualisys is a Motion Capture (mocap) system manufacturer based in Gothenburg, Sweden. Since we are also based in Sweden, Qualisys has been able to visit us a couple of times, and we now have one of their motion capture systems installed at the office. This collaboration gives us access to a mocap system, something we did not have previously. It means that we can better support people using motion capture systems with the Crazyflie.

We are currently implementing support for Qualisys in the Crazyswarm project. Crazyswarm currently supports a couple of motion capture systems, including Vicon and OptiTrack; with the addition of Qualisys, we and everyone with a Qualisys system will be able to fly swarms of Crazyflies in their mocap space.

We are also planning a combined Bitcraze and Qualisys booth at IROS 2018, where we plan to demo flying both with the mocap system and with the Loco Positioning System. We will post updates when we have more details.

We look forward to this collaboration since it will allow us to use and better support motion capture positioning for the Crazyflie.

We have seen a big interest in flying swarms of Crazyflies, and there are many challenges in doing so. The USC ACT Lab has developed Crazyswarm, a collection of software and firmware that makes it possible to fly big swarms of Crazyflies using a motion capture system. This project has been used by USC and other universities to fly the most impressive swarms of Crazyflie 2.0s to date.

Picture from “Downwash-Aware Trajectory Planning for Large Quadrotor Teams” publication using Crazyswarm

We are very happy that we, together with Wolfgang and James, the main developers of Crazyswarm, have started to merge the firmware part into the official Crazyflie firmware. Merging the code will have two great consequences: people will be able to use Crazyswarm with a Crazyflie 2.0 running the stock firmware, and everybody else will be able to use functionality that was developed for Crazyswarm.

There are currently a couple of parts in the works. The state controller has already been merged. There is ongoing discussion on GitHub on how to merge the high-level commander, a commander that allows the Crazyflie to autonomously follow trajectories and execute other high-level commands. Finally, some work will be required to adapt the Kalman filter to make it better suited to accept measurements from a motion capture system. The Crazyflie was not developed as an autonomous platform from the beginning, but it is becoming one, in large part thanks to the great contributions from the community.
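To give an idea of what the high-level commander enables, here is a hedged sketch of the kind of Python API being discussed, based on the Crazyswarm design; the exact names and signatures may differ once the merge lands:

    import time
    import cflib.crtp
    from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

    cflib.crtp.init_drivers()
    with SyncCrazyflie('radio://0/80/2M') as scf:
        hlc = scf.cf.high_level_commander        # assumed binding name
        hlc.takeoff(0.5, 2.0)                    # target height (m), duration (s)
        time.sleep(2.0)
        hlc.go_to(1.0, 0.0, 0.5, 0.0, 3.0)       # x, y, z (m), yaw (rad), duration (s)
        time.sleep(3.0)
        hlc.land(0.0, 2.0)
        time.sleep(2.0)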

A great thanks to James and Wolfgang for their effort in merging Crazyswarm into the Crazyflie code-base!

Out of stock
Unfortunately we miscalculated how much China slows down during Chinese New Year, which has caused some products to go out of stock. One of them is the Crazyradio PA, which in turn puts some bundles out of stock as well. The good news is that the products are in transit to the warehouse and will hopefully be back in stock any day now. Until then, you can use the “Item out of stock – notify me!” functionality to get notified as soon as the product is back in stock.


We just released a new version of the Bitcraze VM, version 2018.01. There is nothing very new in this version; the VM has been rebuilt so that all the projects included in it are now up-to-date. This solves an issue where the Crazyflie client was blocked in the previous revision.

The current VM is running a quite old version of Ubuntu, the 14.04 LTS release. We are planning to refresh the VM by making a new one when Ubuntu 18.04 LTS is released.

Since the Crazyflie 1 days we have been documenting the VM as the standard development environment. This has a couple of advantages:

  • We can distribute a fully setup development environment that has minimal dependencies with the host system
  • If someone has a problem with the VM, there is a big chance we can reproduce and fix it, since everyone is running the same system
  • Everything is pre-setup so it should be fairly quick to get started with the actual firmware or software development

However the VM solution also has drawbacks:

  • It requires installing and somewhat configuring VirtualBox or other virtual machine software
  • It has some performance cost, mostly for USB, as it slows down the communication with the Crazyflie
  • The USB implementation seems to have bugs on Windows, which makes the communication with the Crazyflie buggy. This is currently the biggest problem!

So, the situation is not ideal, and we would love to get some feedback from the community.

There are two very different parts in the system: the lib and client in Python, and the firmwares in C.

  • Starting development on the Python parts, on Windows/Mac/Linux, is fairly straightforward. Basically one has to install Python and git, clone the projects and install the dependencies, and it runs. Different Python IDEs can be used and work pretty much out of the box.
  • Starting development on the embedded C parts can be a bit more challenging. On Linux and Mac it is pretty easy, since it only requires downloading the arm-embedded-gcc compiler and adding it to the path. On Windows things are a bit more complex, because you also need Make, and I haven’t yet figured out the best way to install that. Getting an IDE requires configuring Eclipse CDT.

What do you think about the VM as a development environment? Would you prefer other solutions, such as documentation for each operating system on how to set up a development environment?


We have written a couple of times already about the new TDoA2 algorithm for the Loco Positioning System. A TDoA mode has been experimental since the day we released the LPS, but we are now proud to announce that TDoA is an official positioning mode for the Loco Positioning System and the Crazyflie.

Practically, it means that the Loco Positioning System now has an officially supported mode to locate and fly a swarm of Crazyflie 2.0s.

We have spent the last weeks updating the documentation and the “Getting started” tutorial, and releasing all the affected firmware and software. One of our goals was to make the new TDoA mode as seamless and easy to work with as possible; this meant having everything work without recompiling the Crazyflie firmware or any other part of the system. The Crazyflie now detects the LPS mode automatically, and it is possible to configure the anchor positions and ranging mode remotely from within the Crazyflie client’s LPS tab.

What we have just released is:

If you have 8 anchors and want to convert your Loco Positioning System to TDoA, this can be done very easily by following the new version of the getting started with the Loco Positioning System guide.

If you want more information about the different positioning modes, we have also updated the system description.


This week we have a guest blog post by Ben, enjoy!

I’m Ben Kuperberg, and I’m a digital artist, artist-friendly software developer and orchestra conductor. Being a juggler, I’ve decided to focus some of my work on the intersection between juggling and technology, and I’ve since been working more and more with jugglers, my latest project being “Sphères Curieuses” from Le Cirque Inachevé, created by Antoine Clée. While the whole project is not focused on drones, part of it involves synchronized flight of multiple drones and precise human interaction with those drones. Swarm flight is already out there and some solutions exist, but the context of this project added some challenges.

Most work on drone swarms has been done by research groups or schools. They use high-grade, expensive motion-capture systems able to precisely track the drones and assign their absolute positions. While the quality of the result is undeniable, it’s not fit for stage shows: the setup takes a lot of time, which we can’t always afford when the show is on the road. Moreover, the mocap system is too invasive for the stage if you want to be able to ‘hide’ the technology a bit and let the spectators focus on what the artist wants them to see. Not to mention it costs an arm and a leg, and Antoine needs both to juggle.

So we had to find other ways to track multiple drones. That’s when we found out the amazing team at Bitcraze was working on the TDoA technology, which allows precise-enough tracking of a virtually unlimited number of drones, at reduced cost and with a fast and clean setup.

After some work we managed to have a first rough version of our swarm server, made by Maxime Agor, that allowed connecting to and moving multiple drones using the TDoA system, controlled from a Unity application.

While we were able to present a decent demo with this system, we were facing a major problem of reactivity. When working with artists and technology, reactivity is a key component of creativity, because it can be frustrating and tense to stop every 2 minutes to make changes or fix problems. My first priority was therefore to prepare and design software that would allow me to spend most of the “creation time” on the actual creative aspects and not on the technical parts. It is also essential that the artist performing in front of the audience can entirely focus on the performance and be fully confident in the technology. The last challenge is that, as I focus my work on the creation and not on touring, all my work needs to be easily understood and modified by both the artists and the technicians who will take over my work for the tour.

With all of that in mind, I decided to create a software tool with a high-end user interface called “La Mouche Folle” (“The Crazy Flie” in French) that allows controlling multiple drones, with an overview of all the drones, their battery/charging/alert states, auto-connect / auto-reboot features, external control via OSC, and a Unity client to view and actually decide how to move the drones. All my work is open source, so you can find the software on GitHub.

There is only a Windows release for now, but it should compile just fine on OSX and Linux. The software is made with JUCE and depends on OrganicUI and libusb. Feel free to contact me if you want more information on the software. Many thanks to Wolfgang Hoenig for the support and the great work on the Crazyflie C++ library I’m relying on.

So this is the basic setup of our project, but we needed more than that to control the drones. We wanted to be able to control them in the most natural way possible. We quickly decided to go with glove-based solutions, and have been working with Specktr to get our hands – pun intended – on developer versions of their glove. The glove is good but can’t give us the absolute position of the hand, so we added HTC Vive trackers with the Lighthouse technology and were then able to get both natural hand control and sub-millimeter precision of the tracked hand.

Then it was a matter of connecting everything together: for other projects for Theoriz Studio, I had already developed MrTracker (used in the MixedReality project), which acts as a middleware between the Vive trackers and Unity.

I used Chataigne to easily connect and route the Specktr glove data to Unity as well, so we would have maximum flexibility to switch hardware or technology without breaking the whole setup if we needed to.

A video of the final result

In the past years, I’ve come to work on a lot of different projects with different teams, which I like very much, because each project leads to discovering new people, new ways of working and new challenges to overcome. I’m having a great time working on this project, and especially sharing everything with the people at Bitcraze and the community; everyone has been so cool and nice. I plan to go to the Bitcraze studio to work with them for a few weeks, and I’m sure it’ll be a great experience!

It is now the first day of 2018 and a good day to look back at 2017. It’s been a busy year as always, and we have had a lot of fun during the year. One of the first things that pops up is that things take so much longer than we think. Luckily we are working with open source, and progress does not only depend on us, as we have awesome help from the community. We are already really excited about what’s coming in 2018 and look forward to working together with so many great people!

Community

The Crazyflie 2.0 is still gaining attention and is becoming more and more popular among universities around the world. We see interest from researchers working with autonomous systems, control theory, multi-agent systems, swarm flight, robotics and all kinds of other research fields, which is really great. This means that a lot of exciting work has been contributed by the community, so here is a small summary of what has happened in the community during the year.

In the beginning of the year, the Multi-Agent Autonomous Systems Lab at Intel Labs shared how the Crazyflie 2.0 is used in their research on trajectory planning in cluttered environments. We wrote a blog post about this if you want to learn more about their work. The Crazyflie also showed up on the catwalk of Berlin Fashion Week as part of fashion designer Maartje Dijkstra’s futuristic creation “TranSwarm Entities”, a dress made out of 3D prints accompanied by autonomously flying Crazyflies.

For the third year, Bitcraze visited FOSDEM. We had a good time and got to hang out with community members like Fred, who did a great presentation about what’s new in the Crazyflie galaxy. During the conference we took the opportunity to present the Loco Positioning System and demo autonomous flight with the Crazyflie controlled by the Loco Positioning System. In the demo we flew with the non-linear controller from Mike Hammer, using trajectory generation from Marcus Greiff.

We have had a few interesting blog post contributions from major universities during the year, including a guest post written by researchers at Carnegie Mellon University, who are using the Crazyflie 2.0 drone to create an adaptive multi-robot system. Similar work has been done by researchers at the Computer Science and Artificial Intelligence Lab at MIT, where they have been studying coordination of multiple robots, developing multi-robot path planning for a swarm of robots that can both fly and drive.

We have also had two interesting guest blog posts from the GRASP Laboratory at the University of Pennsylvania: “A Flying Gripper based on Modular Robots” and “ModQuad – Self-Assemble Flying Structures”. Inspired by swarm behavior in nature, for instance how ants solve collective tasks, both projects explore how multiple Crazyflies can work together to perform different missions.

During the fall, Fred took the time to pay us a visit at the office in Sweden and work together with us. He is making great progress on the Java Crazyflie lib that is going to be used in the Android client as well as in PC clients. It will make it possible to connect to and use a Crazyflie from any Java program; there has already been some successful experimentation using it from Processing.

Some other great news is that, thanks to Sean Kelly, the Crazyflie 2.0 is now officially supported by the Betaflight flight controller firmware. Betaflight is a flight controller firmware used a lot in the FPV and drone racing communities.

Thanks to denis on the forum, there is now support for the Crazyflie 2.0 in the PX4 flight controller firmware. PX4 is a comprehensive flight controller firmware used in research and by industry.

Finally, the Crazyswarm project by Wolfgang Hoenig and James A. Preiss from the USC ACT Lab was presented at ICRA 2017. It is a framework that makes it possible to fly swarms of Crazyflie 2.0s using a motion capture system. There is currently work being done on merging the Crazyswarm project into the Crazyflie master branch, which will make it even easier to fly a swarm of Crazyflies. In the meantime, the project is well documented and can be used by anyone who has a couple of Crazyflies and a motion capture system.

Hardware

During 2017 we released four new products, beginning with the Micro SD-card deck, which among other things makes high-speed logging possible. Then came the Z-ranger, which enables a height-hold flight mode up to 1 m above the ground. We like to call it drone surfing, as that is very much what it feels like when flying. We ended the year by releasing two boards, the Flow deck and the Flow breakout, in collaboration with PixArt, containing their new PMW3901 optical flow sensor. The Flow deck enables scriptable flight, which is very exciting. That led us to release the STEM drone bundle, which we hope will inspire people to learn more about flying robotics.

Hardware prototypes, our favorite sub-category, are something we have plenty of lying around the office. To name a few: a possible Crazyradio 2, the Loco positioning tag, the Crazyflie RZR, the Glow deck and an obstacle avoidance/SLAM deck. It takes a long time to make a finished product… Hopefully we will see more of these during 2018!

Software

At the same time as we released the Flow deck, we also released the latest official Crazyflie 2.0 firmware and client (2017.06). This enables autonomous capabilities as soon as the Flow deck is attached, by automatically turning on the corresponding functionality. Just before that, the Loco Positioning System was brought out of early access, with improved documentation and a simplified setup. Since then, a lot of work has been put into making a release of TDoA and improving overall ease of use. With TDoA2 and automatic anchor estimation starting to work pretty well, we should not be far from a new official release!

We would like to end 2017 with a big thank you to our users and community with this compilation video. Make sure to pump up the volume!

video link