If you are one of the lucky ones getting a Crazyflie for Christmas, we are happy to tell you that there is a fresh new “getting started” guide to help you get going :-).

Before going on holiday, Kristoffer and I published an updated version of the “getting started” part of the website, which we are very happy about. Besides making a new edition of the “Assembling” part, we have also added “Installing on a smartphone”, “Installing on a computer” and “Flying”.

We hope that these new additions to the “Getting started” section will be a big help for everybody who just got a Crazyflie for Christmas and feels unsure about how to start. It is also another way to help people who might be uncertain about buying a Crazyflie to find out if it is right for them.

If you have any comments or suggestions about the new “Getting started” guide, please feel free to contribute; we are always open to ideas for improvements and tweaks.

It’s been a hectic time here at Bitcraze before Christmas, with new decks coming out and the ongoing re-design of the website, among other things. So we are all taking some time off during the holidays, but we will be answering email and support issues. It might take a bit longer though, since we will be occupied with drinking Swedish glögg and French wine and stuffing ourselves with chocolate.

After a hectic week we’re finally ready to put some new decks into production! A couple of months ago we selected 4 deck prototypes to try to bring to production before Christmas: WiFi, GPS, BigQuad and the Buzzer. After working hard on them during the last months, we’re now ready to release the Buzzer and BigQuad decks. Last week we ordered the first batches, and the product pages and descriptions are being written this week. We’ll push out more information about the boards as it becomes available, so stay tuned!

Below are a few quick shots of the latest prototypes:

So what happened to the GPS and the WiFi decks? The latest prototypes are working, but there are still some minor issues. So instead of moving to production with the current design, we’re doing one last prototype iteration and launching those boards early next year.

On a related note, we’ve been working hard together with Seeedstudio to get some more Crazyflie 2.0s into stock before Christmas. Not so surprisingly, we’re not the only ones rushing to produce. But thanks to lots of effort on Seeedstudio’s side, the Crazyflie 2.0 will be back in stock in a couple of days!

Lately we have been busy finalizing new decks. We have a pretty long list of what we want to release, and the first four to come are the BigQuad deck, the Buzzer deck, the WiFi (ESP8266) deck and the GPS (GlobalTop) deck. Before going further, a disclaimer: we have ordered the final prototypes of these decks, so the probability that we will release them is pretty high, though it is still possible that we hit a big bug and some of them get delayed.

The BigQuad deck was covered in a previous post. It is a very simple deck: only connectors. It can be used to connect brushless motor ESCs to the Crazyflie in order to control a bigger quad. We have also added connectors for controlling the Crazyflie from a standard receiver (SPPM input), and for GPS, an active buzzer, battery telemetry and I2C sensors. The main use case we see for this deck is to be able to develop with the Crazyflie and then go outside and fly a bigger quad without having to port the code to another platform.

Firmware-wise we are developing support for ESCs and SPPM input.

The Buzzer deck is the second simplest: we have ‘just’ mounted a buzzer on a deck and written the driver for it. As usual with production nothing is easy, and selecting the buzzer was surprisingly hard. We wanted a low-profile buzzer so that other decks can be put on top of it. We ordered 20-ish different buzzers from DigiKey and tested all of them to select the best one:

The Buzzer driver will be able to play some music as well as other sounds. One use case we envision for the Buzzer deck is to be able to find the Crazyflie if it has crashed out of sight.

The GPS deck is an old story: we started working on a GPS deck in the summer of 2014 and we even planned to release it at the same time as the Crazyflie 2.0. Unfortunately we had lots of problems with the antenna not working properly when attached to the Crazyflie. After a lot of experimentation, spread over a year, we finally ended up with a design that works: an integrated GPS receiver with a patch antenna:

We found the patch antenna to be much less sensitive to the Crazyflie 2.0 ground plane than the previously tested chip antenna. As for the software, we will implement enough code to decode the NMEA strings from the GPS and make the data available via the log subsystem. We have a prototype of a new GPS tab in the client using a webview and OpenStreetMap; more on that in a later post.
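
To give an idea of what the NMEA decoding involves, here is a minimal Python sketch that pulls the position out of a GGA sentence. The firmware will do the equivalent in C, and the log variable names are not decided yet, so the function and field handling below are purely illustrative:

def nmea_to_decimal(value, hemisphere):
    # NMEA encodes latitude as ddmm.mmmm and longitude as dddmm.mmmm
    degrees_len = 2 if hemisphere in "NS" else 3
    decimal = int(value[:degrees_len]) + float(value[degrees_len:]) / 60.0
    return -decimal if hemisphere in "SW" else decimal

def parse_gga(sentence):
    # Example: "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
    fields = sentence.split("*")[0].split(",")
    if not fields[0].endswith("GGA"):
        return None
    return {
        "time": fields[1],
        "lat": nmea_to_decimal(fields[2], fields[3]),
        "lon": nmea_to_decimal(fields[4], fields[5]),
        "fix": int(fields[6]),
        "sats": int(fields[7]),
        "alt_m": float(fields[9]),
    }

print(parse_gga("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"))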

Finally, we have mounted an ESP8266 WiFi module on a deck and the Crazyflie 2.0 becomes WiFi enabled :-):

So far we are planning on loading the NodeMCU Lua firmware onto the ESP8266, which will make it easy to develop WiFi connectivity for the Crazyflie. Note that the final board will be based on a different ESP8266 module with a chip antenna.

We will post more in-depth information about those new decks in the following weeks. We will also communicate the release date as soon as we know it.

Last week I was at Lua Workshop 2015 in Stockholm. It was a very interesting conference with lots of interesting people. I also had the opportunity to see the office of King, the host of the workshop, and it gave me a lot of ideas for fun stuff and toys we could have in our office :-)

On a side note, we are organizing a presentation at our office in Malmö on the 22nd of October: Mandy from Seeedstudio is visiting us and will talk about manufacturing in China. If you want to come you can register.

Now, back to Lua. Lua is a dynamic programming language that is small, fast and meant to be embedded within other programs. Currently it is used a lot in video games and a bit on servers. It has also been used in deeply embedded systems through the eLua project; for example, Seeed sells a Lua-preloaded ESP8266 WiFi module. One of our plans for the Crazyflie 2.0 is to be able to write deck drivers in Lua.

With the Crazyflie 2.0 we are aiming to make a research-grade flying platform more accessible and versatile, hence the expansion capabilities (with decks) and the new API we are writing for it. Lua would fit well with this goal. It would make it possible to script and test a device driver very quickly. As a bonus, since Lua is safe (i.e. the virtual machine cannot crash the system), there would be no risk of crashing the copter with this kind of driver. The architecture would look something like this:

Though Crazyflie Lua integration has not been prioritized so far, we think it is something that would be interesting to play with in the future. If anyone is interested in testing and helping out, please reach out to us on GitHub or on the forum.

Last Thursday we went to LTH (Lund University), to the Robotics department, to make some measurements on the ultra-wideband (UWB) positioning system we are working on. The idea was to use one of their robots to move a Crazyflie along a well-known path, and at the same time record as much sensor data as possible. This gives us data that we can crunch offline.

We placed four anchors around the robot and mounted our positioning expansion deck on the Crazyflie. The Crazyflie collected the data from the positioning deck as well as from its internal sensors and streamed it over USB to an external computer for storage. We collected the following data:

  • Distance to the anchors (raw measurements)
  • Air pressure at the anchors
  • Air pressure at the Crazyflie
  • Accelerometers
  • Gyros
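
As a rough illustration of how this kind of data can be collected from a Python script, here is a minimal sketch using the cflib logging API. The log variable names for the positioning deck are made up (the real ones depend on the deck firmware), and the USB URI is just one way to connect:

import time
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.log import LogConfig

def data_received(timestamp, data, logconf):
    # In the real setup each sample would be appended to a file for offline crunching
    print(timestamp, data)

def connected(link_uri):
    # "ranging.distance0" is a made-up name for the raw anchor distance;
    # "acc.x" and "gyro.x" are standard Crazyflie log variables
    conf = LogConfig(name="uwb", period_in_ms=10)
    conf.add_variable("ranging.distance0", "float")
    conf.add_variable("acc.x", "float")
    conf.add_variable("gyro.x", "float")
    cf.log.add_config(conf)
    conf.data_received_cb.add_callback(data_received)
    conf.start()

cflib.crtp.init_drivers()
cf = Crazyflie()
cf.connected.add_callback(connected)
cf.open_link("usb://0")    # stream over USB; a radio URI would also work
time.sleep(30)             # record for 30 seconds
cf.close_link()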

The logs from the robot will give us the true position of the Crazyflie as well as the positions of the anchors, and from that we will be able to evaluate the performance of algorithms that use the sensor data to estimate the position.

The dataset will be shared with the researchers at Lund University; they have some interesting ideas they want to try out.

Next step is to crunch the data…

For more information on our UWB positioning project, see Firmware and dwm 1000 nodes.

Last week and this week are busy with preparations for the New York and Berlin Maker Faires. Since we will be in the Seeedstudio booth we don’t have the same space as at the Bay Area Maker Faire, so we had to rebuild our “fly-cage”. The new dimensions are 1.7 x 0.7 x 0.7 meters. This is the volume the Crazyflie 2.0 should be able to fly in for a full charge without touching the sides of the net.

We don’t have any special plans during the faires, except for flying during the day. So if you feel like meeting up, having a beer and getting lost in various technology discussions, leave a comment or drop us a mail.

The autonomous flying rig we used in the Bay Area was based on the Kinect 2 sensor. The new rig only uses a standard webcam, which is cheaper and easier to manage (i.e. we do not need a Windows computer anymore). We attach an augmented reality marker to the top of the Crazyflie and the image processing is mostly done by the ArUco library. ArUco detects the position of the marker in 3D and the position is sent via ZeroMQ to the controller. We use the same controller code as for the Kinect; we just had to tune it a bit better to stay within the smaller space. The controller then sends pitch/roll/yaw to the Crazyflie client, which is set up to use ZMQ as an input device.
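
For anyone curious about the tracking side, here is a rough sketch of what the detection and publishing can look like with OpenCV’s ArUco module and pyzmq. The camera calibration values, marker size, port and message format below are placeholders, not the ones used in our actual code:

import cv2
import cv2.aruco as aruco
import numpy as np
import zmq

# Placeholder calibration; a real setup uses values from a camera calibration
camera_matrix = np.array([[600.0, 0.0, 320.0], [0.0, 600.0, 240.0], [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)
marker_size = 0.08    # marker side length in meters (assumption)

socket = zmq.Context().socket(zmq.PUSH)
socket.bind("tcp://*:7777")    # the controller connects here (port is made up)

dictionary = aruco.Dictionary_get(aruco.DICT_4X4_50)
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = aruco.detectMarkers(gray, dictionary)
    if ids is not None:
        # Estimate the 3D position of the first detected marker and publish it
        rvecs, tvecs = aruco.estimatePoseSingleMarkers(
            corners, marker_size, camera_matrix, dist_coeffs)[:2]
        x, y, z = tvecs[0][0]
        socket.send_json({"x": float(x), "y": float(y), "z": float(z)})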

If you want to build the same cage then here’s a list of the parts:

  • Some kind of net (we used normal fishing net)
  • Fishing line (to tighten the cage)
  • Aluminium beam (for tents)
  • These 3D printed parts
  • Webcam with standard camera attachment (we use Logitech C920)
  • Camera attachment screw

We are in the process of cleaning up the code for the webcam. It will be pushed to GitHub and we will document the build on the wiki.

We have decided to use Travis for continuous integration builds of our open source repositories. Travis automatically builds the code on all branches and pull requests, which gives every developer who wants to contribute to the project the possibility to see that their code passes the build. The current status of the latest build on the main branch is visible through the icon in the readme on GitHub, or on the Bitcraze page on Travis.

The projects we have added to Travis so far are written in C or Python. The C projects, for instance, must be compiled with cross compilers for the processors used in the Crazyflie, which adds some extra complexity. To make life easier for developers we have created a Docker image (bitcraze/builder) with the tools needed. If you use the image when developing, there is no need to install the tools locally, and since the same image is used in the Travis builds you know you will get the same results as the CI server. This also removes the problem of tools with different versions (and results) in the development and build environments.

To use the image you can, for instance, type:

docker run --rm -v ${PWD}:/module bitcraze/builder make

Even though it is awesome to be able to create a well-known build environment through a Docker container, we feel that too much typing is needed to execute a simple make. To solve that problem we are looking at the possibility of creating a toolbelt that will handle this for you. More information on that later on; for now developers will have to find their own solutions through scripting, aliasing or other means.

Obviously you need Docker to use this image. If you have not tried it out yet, take a look at www.docker.com.

We are aiming for automated testing of our code, and even though we have a lot of work to do, we have taken the first baby step. For the moment, firmware projects are simply compiled and linked to ensure that the code is coherent. Projects that support both Crazyflie 1 and 2 are built in both flavours to avoid problems for developers that might only use one of the platforms. The Python client project is only checked for PEP8 compliance, but we are looking into how to unit test it. Any input from the community is welcome!
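
To give an idea of the kind of unit test we have in mind, here is a minimal sketch using Python’s unittest module. The helper being tested is made up for illustration and is not an existing cflib function:

import struct
import unittest

def pack_setpoint(roll, pitch, yaw, thrust):
    # Hypothetical helper packing a setpoint as three little-endian floats
    # followed by an unsigned short
    return struct.pack("<fffH", roll, pitch, yaw, thrust)

class PackSetpointTest(unittest.TestCase):
    def test_packet_length(self):
        self.assertEqual(len(pack_setpoint(0.0, 0.0, 0.0, 0)), 14)

    def test_round_trip(self):
        data = pack_setpoint(1.0, -2.0, 0.5, 30000)
        self.assertEqual(struct.unpack("<fffH", data), (1.0, -2.0, 0.5, 30000))

if __name__ == "__main__":
    unittest.main()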

Happy hacking!

During the last week we’ve taken a big step, moving to Python 3! The reason for the move is that Python 3 is becoming broadly adopted and it has more features that we want to make use of. Also 3 > 2. This post will explain a bit of what we did, some of the problems we encountered and the current status. The numbers 2 and 3 will be thrown around a lot in the text, but to be precise we’re talking about versions 2.7+ and 3.4+ (more precisely, it has been tested on 2.7.9 and 3.4.3). The next release of the client will run on Python 3, but if you want to test it now just clone the development branch on GitHub.

Status

If you have developed applications using the API and Python 2 then you might be getting a bit worried right about now. Compatibility with both Python 2 and 3 will be kept for most things, except for the client:

This will be compatible with both 2 and 3:

  • The Crazyflie Python API (everything in lib/cflib)
  • The examples for the Crazyflie Python API (everything in examples)
  • The ZMQ server using the Crazyflie Python API (bin/cfzmq)
  • The Crazyflie command-line bootloader (bin/cfloader)

But the main clients will only have Python 3 compatibility:

  • The Crazyflie Python client (bin/cfclient)
  • The Crazyflie Python headless client (bin/cfheadless)

API Examples

While doing the porting we’ve also added more examples to cover more of the Crazyflie Python API. In order to keep 2/3 compatibility for the API it’s important to be able to test it easily with the different versions. We have unit tests on the TODO list, but until then we’ve been using the API examples for testing. All the examples should run with both Python 2 and 3. Having more examples showing how to use the API is also a good thing in itself…

Porting and compatibility

The approach we used was to first run the 2to3 utility to automatically do as much as possible of the porting. After that we had to fix the rest of the errors manually and also maintain the dual 2/3 compatibility of the API.

In our previous implementation we made use of strings to store the binary data that we were sending/receiving, but because of incompatibilities between Python 2 and 3 this didn’t fit very well. To make things neat for the API we found a container for bytes that works with both Python 2 and 3: the bytearray. Even though we use the same type, there are still some subtle differences in usage between the versions. After doing some testing we found ways where the syntax is the same for Python 2/3.

First of all, bytearrays can be created from a string, tuple or list. When indexed with the [] operator they give you the integer value of each byte.

>>> d = bytearray([i for i in range(10)])
>>> d
bytearray(b'\x00\x01\x02\x03\x04\x05\x06\x07\x08\t')
>>> d[5]
5

The main point is getting something meaningful out of the bytearray when doing the communication. Here are a few examples:

Unpacking a byte, an integer and a word from the first 7 bytes (little endian):

>>> import struct
>>> struct.unpack("<BIH", d[:7])
(0, 67305985, 1541)

Getting a string from a subset of the data can be done by using decode with the character set to use for decoding. We use ISO-8859-1 since the Crazyflie does not support Unicode (yet?).

>>> d = bytearray([i for i in range(97,100)])
>>> d
bytearray(b'abc')
>>> d.decode("ISO-8859-1")
'abc'

You can also easily get a tuple or a list:

>>> list(d)
[97, 98, 99]
>>> tuple(d)
(97, 98, 99)

And you can also concatenate:

>>> d + d
bytearray(b'abcabc')

And find a byte:

>>> d.find(bytearray((98, )))
1

But there are also a few things we couldn’t get to work in a good way, where we have to check which version we’re running and execute different code, like the queue import that has changed name:

if sys.version_info < (3,):
    import Queue as queue
else:
    import queue

Another case we haven’t solved in a version-independent way is creating a bytearray from a string, so there it’s also:

if sys.version_info < (3,):
    self._data = bytearray(data)
else:
    self._data = bytearray(data.encode('ISO-8859-1'))

As for the client code, which was ported to Python 3 without keeping backwards compatibility, there weren’t any big issues. The biggest change was the PyQt4 API, where a few things have improved when placing custom Python data in GUI objects. Before, QVariant was used for this: you would create a QVariant object that wrapped the Python object, and to get the data out of the QVariant again you had to explicitly say what type it had by calling the correct function (like toInt()). Now this is a lot smoother: QVariant has been skipped and you just use the Python type directly.
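
As a small illustration of the difference (the combo box and the payload below are made up, not taken from the client code), storing Python data on a widget item now looks like this:

import sys
from PyQt4.QtGui import QApplication, QComboBox

app = QApplication(sys.argv)
combo = QComboBox()

# With the new behaviour, plain Python objects are stored directly as item
# data; previously this required wrapping in QVariant(...) and unwrapping
# with calls like itemData(0).toInt()
combo.addItem("Solid color", {"effect": 1})
print(combo.itemData(0))    # -> {'effect': 1}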

For more information have a look here where we found a lot of useful tips. Don’t hesitate to leave a comment if you think we could have done things differently or if you have any tips!

What’s not working

There are still a few things we’re not sure how to fix, and we have to look into them a bit more. These are:

  • There don’t seem to be any Python 3 bindings for the Leap Motion. According to this it’s possible to build the bindings yourself.
  • The Python 3 bindings to Marble for the GPS tab haven’t been investigated yet

PEP-8

On a side note, we’ve started using Travis CI (more on this next week) and will start creating unit tests for the Crazyflie Python API. As a first step we’re running PEP-8 checks on all the code. This will be done automatically for all commits and pull requests.

 

This weekend we went to the maker weekend at Hx in Helsingborg and showed off the Crazyflie 2.0 flying with the Kinect. It’s an awesome demo for fairs since it flies by itself and looks pretty good. Below are a few photos from the event.

But this time we ran into some issues with the set-up. When we first developed this we were running the image acquisition and processing on Linux using libfreenect2, but we later switched to Windows. The reason is these three lines, which map the depth measurements to the camera measurements and give a set of “world coordinates”. This is needed since the left/right distances wouldn’t be correct without taking the distance from the camera into account. Without it, a left/right movement close to the camera would give a much larger response than one further away from the camera.

So to solve the issue above we moved everything to Windows, which kind of solved it. But we started experiencing lag in the regulation, which originated from lag in the image processing. After doing some more digging we drew the conclusion that there is a USB issue when using the Crazyradio at the same time as the Kinect v2: once every couple of minutes the FPS of the video drops really low (which results in CPU usage going down as well), but disconnecting the Crazyflie (i.e. not using the Crazyradio) seems to solve it. To work around this problem we currently have a set-up with two laptops. Since we’re using ZMQ to communicate between the applications, it’s a quick operation to split them up onto multiple hosts: the Windows laptop runs the image acquisition and processing, and the Linux laptop runs the control loop and the Crazyflie client that sends the commands to the Crazyflie.
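
That is one of the nice properties of ZMQ: moving a component to another machine is mostly a matter of changing the address it connects to. A minimal sketch of the idea (the port and message format are placeholders, not the ones our code uses):

import time
import zmq

context = zmq.Context()

# Image-processing side (the Windows laptop): publish marker positions
pub = context.socket(zmq.PUB)
pub.bind("tcp://*:5555")

# Control-loop side (the Linux laptop): subscribe to the position stream.
# When splitting onto two machines, only this address needs to change,
# e.g. "tcp://192.168.0.10:5555" instead of localhost.
sub = context.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:5555")
sub.setsockopt_string(zmq.SUBSCRIBE, "")

time.sleep(0.5)    # give the subscription time to propagate
pub.send_json({"x": 0.1, "y": 0.0, "z": 1.2})
print(sub.recv_json())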

When we were first developing this there were lots of things happening in the libfreenect2 repo, so this might be implemented by now. Does anyone have any tips for this? We would love to be able to run the system fully on Linux with only one laptop :-)

During the upcoming months we will be attending both the New York Maker Faire and the Berlin Maker Faire, hanging around the Seeedstudio booth. So if you want to see the Crazyflie/Kinect demo live or just hang out and talk to us, drop by!

A new version of the Crazyflie PC Python client has been released, version 2015.08. It’s been a while since the last release of the client so there’s a long list of changes, including lots of fixed bugs. The main new features are:

  • Student/Teacher mode:  It’s possible to use two input devices, where one can take over control from the other. This can be used for teaching or for working with a computer auto-pilot (doc)
  • Control the LED-ring from the Flight Tab:  Now it’s possible to turn on/off the headlights directly with a click and to select the LED-ring effect from a drop-down (doc)
  • New LED-tab to set custom patterns and intensity: Enables the user to individually set the color of each LED as well as the intensity of the LED-ring
  • ZMQ access to LED-ring memory and parameters: Write patterns for the LED-ring or set/get parameter values from an external application using JSON and ZMQ (doc)
  • ZMQ input device: Simulate a joystick by sending axis values in JSON via ZMQ. This can be used to implement a computer auto-pilot using for instance the Kinect (doc); a small sketch of such a sender follows this list
  • Switched from PyGame to PySDL2 on Mac OSX/Windows and native input device on Linux
  • WiiMote support
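
Here is a rough idea of what a sender for the ZMQ input device can look like. The port and the exact JSON schema below are assumptions for illustration; the client documentation has the real ones:

import time
import zmq

# Connect to the client's ZMQ input device (address and port are assumptions;
# check the cfclient documentation for the actual values)
sender = zmq.Context().socket(zmq.PUSH)
sender.connect("tcp://127.0.0.1:1024")

# Send neutral axis values at roughly 100 Hz, the way a joystick would.
# The field names are illustrative, not the documented schema.
while True:
    sender.send_json({
        "version": 1,
        "ctrl": {"roll": 0.0, "pitch": 0.0, "yaw": 0.0, "thrust": 0.0}
    })
    time.sleep(0.01)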

The next step after the release is to shape up the code a bit, so we’ve started using Travis for building and continuous integration. The long-term goal is to run Flake8 and unit tests on the code, but we still have a bit to go. We’re working towards this by slowly enabling more and more checks in Travis, fixing one type of error at a time.