Now that we are all back from our summer holiday, we are back to what we set out to do a while ago: fixing issues and stabilizing code. In the last two weeks we have been focusing on fixing existing issues with the Flowdeck and the LPS positioning system. It is still a work in progress, and even though we fixed some problems, we still have some way to go! At least we can give you an update on our work of the last few weeks.
Flow-deck Kalman Improvements
When we started working on the motion commander tutorials (see this blogpost), which are mostly based on flying with the Flowdeck, we were also hit by an error that probably many of you know: the Crazyflie flies over a low-texture area, wobbles, flips and crashes. This won't happen as long as you are flying over high-texture areas (like a children's play mat, for instance), but on the occasions that you are not, it should not crash like it does now. The expected behavior is that the Crazyflie glides away until it flies over something with sufficient texture again (that is the behavior you see when you are flying manually with a controller and just let go of the controls). So we decided to investigate this further.
First we thought that it might have something to do with the rotation compensation by the gyroscopes, which is part of the measurement model of the Flowdeck, since it might be overcompensating or something like that. But if you remove that part, the Crazyflie starts wobbling right away, even over high-texture areas… so that was not it for sure. We still think it causes the actual wobbling itself (compensating flow that is not detected), but we had to dig a bit deeper into the issue.
Eventually we did a couple of measurements. We let the Crazyflie fly an 8 shape over a low- and a high-texture area and logged a couple of important values. These were the detected flow, the ground-truth position, and a couple of quality measures that Pixart's PMW3901 flow sensor provides itself, namely the number of features (motion.squal) and the automatic shutter time (motion.shutter). From the ground-truth position we can compute the ground-truth flow that the Flowdeck is supposed to measure. With that we can see what the standard deviation of the error between measured and ground-truth flow is, and whether there is a relation between the error's standard deviation and the quality values, which resulted in a couple of informative graphs.
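In essence, the offline analysis boils down to something like the following Python sketch (the column names, the log export and the pixel scale are hypothetical placeholders; our actual analysis scripts differ):

```python
import numpy as np
import pandas as pd

PX_PER_M = 400.0  # hypothetical flow-pixels-per-metre scale at the flown height

log = pd.read_csv('flight_log.csv')  # hypothetical export of the logged values
dt = np.gradient(log['t'].to_numpy())
# Derive the flow the sensor *should* have seen from the ground-truth position
flow_gt = np.gradient(log['x_gt'].to_numpy()) / dt * PX_PER_M
err = log['flow_x'].to_numpy() - flow_gt

print('flow error std [px]:', err.std())
print('corr(|err|, shutter):', np.corrcoef(np.abs(err), log['shutter'])[0, 1])
print('corr(|err|, squal):  ', np.corrcoef(np.abs(err), log['squal'])[0, 1])
```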
Three major improvements were added to the code based on these results (a sketch of the combined logic follows the list):
The standard deviation of the flow measurement was increased from 0.25 to 2.0 pixels, since this is a more accurate depiction of the measurement noise that the Kalman filter should expect.
An adaptive standard deviation based on motion.shutter has been implemented (since there is a stronger correlation there than with motion.squal), which can be activated by setting the parameter motion.adaptive to True (1). It is set to False (0) by default, since the increased standard deviation of the first improvement already improved the flight quality significantly.
If the flow sensor indicates that no motion is detected (log variable motion.motion), the firmware now refrains from sending any measurement to the Kalman filter. It also adjusts the time step (dt) between samples based on the last measurement received.
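In pseudo-Python, the combined logic looks roughly like this (the names FLOW_STD_FIXED and SHUTTER_SCALE and the filter interface are illustrative assumptions; the real implementation lives in the C firmware's measurement model):

```python
FLOW_STD_FIXED = 2.0    # pixels; the new, more realistic default noise
SHUTTER_SCALE = 8000.0  # hypothetical scaling of the raw shutter value

def flow_measurement_std(shutter, adaptive):
    if not adaptive:
        return FLOW_STD_FIXED
    # A longer automatic shutter time hints at low texture or light,
    # so inflate the measurement standard deviation accordingly.
    return FLOW_STD_FIXED * (1.0 + shutter / SHUTTER_SCALE)

def on_flow_sample(motion_detected, dx, dy, shutter, adaptive, kalman):
    if not motion_detected:
        # No motion reported: push nothing to the Kalman filter and let
        # dt keep accumulating until the next valid sample arrives.
        return
    std = flow_measurement_std(shutter, adaptive)
    kalman.enqueue_flow(dx, dy, std)  # hypothetical filter interface
```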
Now when the Crazyflie flies over low-texture areas with the Flowdeck alone, it will not flip anymore but simply glide away! Check out this closed issue to learn more about the exact implementation; it should be part of the next release.
The LPS and Flowdeck
Kalman filter conflicts
The previous Flowdeck fix also took care of this issue, which caused the Crazyflie to flip in the LPS system as well if it did not detect any flow. This happened because the Kalman filter trusted the flow measurement much more than the UWB distance measurement in the previous firmware version, but not anymore! If the Flowdeck is out of range or can't detect motion, the state estimation will trust the LPS system more. However, once the Flowdeck detects motion, it will help out with the accuracy of the position estimate.
Moreover, it is now possible to make the Crazyflie fly in and out of the LPS system area with the Flowdeck! However, be sure that it flies using velocity commands, since there are situations where the position estimate can skip:
1. The LPS system is off, 2. the Crazyflie takes off with only the Flowdeck, 3. the LPS nodes are turned on.
1. Take off within the LPS system, 2. fly out of the LPS system's reach for a while (the position estimate will drift a bit), 3. fly back into the LPS system with a Flowdeck-induced drift in the position estimate.
As long as you are flying with velocity commands, like with the assist modes of the controller in the CFclient, this should not be a problem.
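In the cflib, flying on velocity commands simply means using the motion commander's velocity interface; a minimal sketch (URI and timings are placeholders):

```python
import time

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.positioning.motion_commander import MotionCommander

URI = 'radio://0/80/2M'  # placeholder, adjust to your setup

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    with MotionCommander(scf, default_height=0.5) as mc:
        # Velocity setpoint: 0.3 m/s forward. A skip in the position
        # estimate only shifts the estimate, not the commanded motion.
        mc.start_linear_motion(0.3, 0.0, 0.0)
        time.sleep(3)
        mc.stop()
```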
Deck compatibility problems
The previous fixes only work with the LPS methods TDOA2 and TDOA3. Unfortunately, there is still some work to be done on the deck incompatibility between the TWR method and the Flowdeck. The deck stops working shortly after the Crazyflie is turned on, and this seems to be related to the SPI bus that is shared by the LPS deck and the Flowdeck. Reading the flow sensor takes some time, which blocks the TWR algorithm for a while, making it miss an event. Since the TWR algorithm relies on a continuous stream of events from the DWM1000 chip, it simply stops working if it does not get that… or at least that is our current theory…
Please check out this issue to follow the ongoing discussion. If you have maybe an idea of what is going on, drop a comment and see if we can work together to iron out this issue once and for all!
Hello everyone, this is Victor and I've spent another internship here at Bitcraze. You can read my blogpost from last year here: https://www.bitcraze.io/2019/08/summer-internship/. I have learned a lot since last year, so it has been fun to put my skills to the test!
This summer I have spent my time improving the cf-client. I've fixed a few bugs, but mostly it has been about making small changes that improve the overall functionality.
Some of the improvements that I’ve worked on:
Flight-control tab: Added logging of the x- and y-axes and regrouped the columns more suitably. Also improved the UI for assist mode.
Flashing dialog: Added support for flashing each of the MCUs individually, as well as choosing which one to flash (previously you could only flash the STM32 or both). Created an automatic firmware-release downloader, so that you don't have to download the files manually.
Log-config: Added grouping of log configurations, which allows you to organize the configs into categories. Also added small conveniences like double-clicking to add or remove configs, etc.
Added sort support for all lists and tables.
Removed traces of Crazyflie 1.0 and support for x-mode, since it is no longer supported by the client.
Removed traces of Python 2.
I hope that the functionalities will help you and make your experience with the client better. If you have any tips for further improvements, you’re more than welcome to leave a comment or contribute yourself. This is my last week for this summer, but I hope to see you all again and until then, fly safe!
As you probably already know, we have been wondering how to best handle our documentation and how to provide information to new Crazyflie starters as easily as possible, as you can read in our blogpost of two weeks ago. In the meantime, we also had a chance to think about the results of the poll we ran when we discussed new ways to meet our users. We had about 30 responses, and it became clear that many of you need more knowledge about working with the Crazyflie. The majority voted for online tutorials, and although it might be difficult to do those during the summer holidays, we have already started to make step-by-step guides for various parts of the Crazyflie ecosystem.
Python Library Tutorials
We have started with step-by-step tutorials for the CFLIB (the Python library of the Crazyflie). Usually we refer to the example pages of the CFLIB; however, we feel that many users copy-paste parts of these scripts for their own purposes without understanding what is actually going on. Therefore, in order to move Crazyflie beginners towards the developer phase, we have made these guides to teach exactly what is going on in each module, step by step.
These tutorials can be found in the Python library documentation. The first tutorial focuses on connecting, logging and parameters. It guides you through connecting to the Crazyflie from a Python script, starting logging configurations in two ways (asynchronous and synchronous), and reading and setting parameters.
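As a taste of what the first tutorial covers, here is a condensed version of the synchronous-logging pattern it builds up (the URI is a placeholder):

```python
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.log import LogConfig
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.crazyflie.syncLogger import SyncLogger

URI = 'radio://0/80/2M'  # placeholder, adjust to your setup

cflib.crtp.init_drivers()

lg = LogConfig(name='Stabilizer', period_in_ms=100)
lg.add_variable('stabilizer.roll', 'float')
lg.add_variable('stabilizer.pitch', 'float')

with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    with SyncLogger(scf, lg) as logger:
        for timestamp, data, _ in logger:
            print(timestamp, data)
            break  # one sample is enough for this demo
```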
The second tutorial is about the motion commander and is a logical continuation of the first. A nice thing is that we also show how to build some protection into your script. You can check whether the Flowdeck is attached and prevent the Crazyflie from taking off altogether if it is not detected. This will be a life saver in your future endeavors, and trust me, I know from experience ;). Afterwards it goes into how to take off, fly forward and go back to the initial position. In the application at the end of the motion commander tutorial, we also use the logging functionality to get the actual estimated position of the Crazyflie. With this information, we show how to write an application that creates a virtual bounding box that the Crazyflie can bounce around in (like the old Windows screensaver).
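The deck-detection safeguard boils down to reading the deck.bcFlow2 parameter before taking off, roughly like this sketch (URI and the wait time are placeholders):

```python
import time

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.positioning.motion_commander import MotionCommander

URI = 'radio://0/80/2M'  # placeholder, adjust to your setup
deck_attached = False

def param_deck_flow(name, value):
    global deck_attached
    deck_attached = int(value) == 1  # 1 means the deck was detected

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    scf.cf.param.add_update_callback(group='deck', name='bcFlow2',
                                     cb=param_deck_flow)
    time.sleep(1)  # give the callback time to fire
    if not deck_attached:
        print('No Flowdeck detected, not taking off!')
    else:
        with MotionCommander(scf, default_height=0.5) as mc:
            mc.forward(0.5)
            mc.back(0.5)
```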
We are planning to finish these step-by-step guides by adding the Multi-ranger to the mix, continuing with the bouncing-ball example. After that we will probably start some tutorials on how to use the swarming functionality, before moving on to the firmware or the client.
Work in Progress
The tutorials are still a work in progress, so let us know on the forum, the Python library GitHub repository, or as a comment on this blogpost if you see anything wrong or if something is not very clear. This will improve the quality further so that other users can benefit as well. Also, once these step-by-step tutorials are finished, we can start working on video-based tutorials as well.
Remember, it is possible to contribute your own fixes (or tutorials) to our repositories if you want to. It’s an open source project after all ;)
Modular robotics generally brings flexibility and versatility to robots. In theory, you could design a modular robot basically any way you want it to be, by simply adding or removing modules from the existing robot. Changing the robot configuration by adding more individuals generally increases the system redundancy, meaning that there are now probably many different ways to achieve a specific goal. From a naive point of view, more modules could in practice imply more robustness due to this redundancy. In fact, the system does get more robust, at the cost of becoming more complex and probably harder to control. Added to that, other issues arise when you take into account that your modular robot is flying, and how physical properties and actuation scale as the number of modules grows.
In the GRASP Laboratory at the University of Pennsylvania, one of our focuses is to allow robots to achieve a specific task. In this work, we present ModQuad-DoF, a modular flying platform that enlarges the configuration space of modular flying structures based on quadrotors (Crazyflies) by applying a new yaw actuation method that relies on the desired roll angles of each flying vehicle. This research project is coordinated by Professor Mark Yim and led by Bruno Gabrich (PhD candidate).
Scaling Modular Robots
Scaling modular robots is a very challenging problem that usually limits the benefits of modularity. The sum of the performance metrics (speed, torque, precision, etc.) of the individual modules usually does not scale at the same rate as the physical properties of the conglomerate. In particular, for ModQuad, saturation of the individual motors would increase as the structures became larger, leading to failure and instability. When conglomerate systems scale up in the number of modules, the moment of inertia of the conglomerate often grows faster than the thrust capability added by each module. For example, the increase in the moment of inertia for a fifth module added to four modules in a line can be approximated by the mass of the module times half the distance to the center squared. This quadratic increase gives us the intuition that the required yaw actuation grows faster than the actuation authority.
Yaw Actuation
An inherent characteristic of quadrotors is that their yaw is controlled by the drag moments from each propeller. For ModQuad, as more modules are docked together, yaw controllability decreases as the structure becomes larger. In a line configuration, the structure's inertia grows quadratically with the distance of each module to the structure's center of mass. On the other hand, the drag moments produced scale only linearly with the number of modules.
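As a back-of-the-envelope version of this argument (generic symbols, not numbers from the paper): for n modules of mass m spaced w apart in a line,

```latex
I_z \;\approx\; \sum_{i=1}^{n} m\,x_i^2 \;\sim\; \frac{m\,w^2\,n^3}{12},
\qquad
\tau_{\mathrm{drag}} \;\sim\; n\,\tau_0,
```

so the achievable yaw acceleration, proportional to \tau_{drag}/I_z, falls off roughly as 1/n^2.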
The new yaw actuation method relies on the fact that each quadrotor is capable of generating an individual roll, enabled by our new cage design. By working in a coordinated manner, each Crazyflie can then generate structure moments from the moment arms provided by the propellers, given its roll and its distance from the structure's center of mass.
Cage Design
The Crazyflie 2.0 is the platform chosen to provide thrust and attitude to the individual modules. The flying vehicle measures 92×92×29 mm and weighs 27 g, while its battery lasts around 4 minutes with the novel design proposed. In this work the cage acts as a pendulum relative to the flying vehicle. The quadrotor is joined to the cage through a one-DOF joint. The cages are made of light-weight materials: ABS for the 3D-printed connectors and joints, and carbon fiber for the rods.
Although the flying vehicle does not necessarily share the same orientation as the cage, the multiple connected cages do preserve the same orientation relative to each other. To allow this behavior, we used Neodymium Iron Boron (NdFeB) magnets as passive actuators to enable rigid cage connections. Docking is only allowed at the back and front faces of the modules, and each of these faces contains four magnets. These passive actuators have dimensions of 6.35 × 6.35 × 0.79 mm with a bonding force of 1 kg.
Structure Flying Performance
Conclusions
ModQuad-DoF is a flying modular robotic structure whose yaw actuation scales with an increased number of modules. ModQuad-DoF has a one-DOF jointed cage design and a novel control method for the flying structure. Our new yaw actuation method was validated by conducting experiments in hovering conditions. We were able to fly two, four and six modules cooperatively in a line with yaw controllability and reduced loss in thrust. In future work we aim to explore the structure's controllability with more robots in a line configuration, and to explore different solutions for the desired roll angles. Possibly, with more modules in the structure, only a few would be required to roll in order to maintain a desired structure yaw. Given that, we could explore the control allocation for each module in a specific structure configuration, depending on its desired behavior. Further, structures that are not constrained to a line will also be tested using the basis of the controller proposed in this work.
It is apparently a recurrent theme within Bitcraze: new people come into the office, claim that the documentation is a bit of a mess, make it their personal mission at the company to fix it (because 'how hard can it be?'), and come close to a mini depression when it turns out that it ain't so easy at all.
And yes, I absolutely fell into that trap too. During my PhD I did not really work on documentation like this (with the exception of papers), so I made quite ambitious plans last year, as you can read in this blogpost. We have already managed to cross a couple of things off: we moved the wiki pages to GitHub and host them on our website, and created datasheets for the products, which should make it possible to close the wiki product pages.
However, we still have not managed to completely close off the wiki, because some pages cannot really be split up or contain information that might not be very future-proof. But there are definitely many things to improve, so we are writing some of our thoughts down.
Beginner – ? – Developer
One of the things we noticed is missing, also judging from your comments on the forum and by mail, is a means to bring Crazyflie starters quickly to the developer phase. There are some tutorials to be found on our website, but the general feeling is that they do not elevate the overall understanding of how everything works. Even the tutorials that cover autonomous flight with a Flowdeck do not go further than giving install instructions and handing over the full Python script, without explaining which element does what.
Of course, there are already user manuals to be found in the GitHub docs; however, those are maybe too big a step and take for granted that the reader knows every 'in between' step. It would be much better, for any level, to have step-by-step guides on how to set things up and what each element's role is in the code. That would probably work much more effectively as a start for beginning developers.
So we have some tutorials in mind that can bring first starters closer to the developer phase:
CFCLIENT: How to work with the logging / parameter framework and the plotting tab
CFCLIENT: How to interpret the debug console output
CFLIB: How to connect to the Crazyflie and read out logs and parameters
CFLIB: How to send setpoints and use the commander framework
CFLIB: How to build up the Multiranger push demo step by step
CF FIRMWARE: How to work with the app layer (adding your own modules or code)
If there are more tutorials that you would like to see, please let us know!
Doc closer to the code
The general consensus here is that we would like to have the documentation as close to the code as possible. We have at least taken a step in the right direction by importing the docs into the GitHub repos. This means that with every new feature added, the person responsible can add documentation for it directly in the same commit/pull request.
Moreover, if the description is part of a function's or class's docstrings, it is as close as it can get! The contributor does not need to change a separate markdown file but can change the information directly. On top of that, documentation can be auto-generated for us, as you can see here from one of our try-outs with Sphinx and our crazyflie-lib-python repo:
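For example, a docstring like the one in this hypothetical helper is all Sphinx autodoc needs to generate a documentation page (ring.effect is a real parameter; the helper itself is made up for illustration):

```python
def set_led_ring_effect(cf, effect):
    """Set the LED ring effect of a connected Crazyflie.

    :param cf: a connected Crazyflie instance
    :param effect: integer id of the effect (see the ring.effect parameter)
    """
    cf.param.set_value('ring.effect', str(effect))
```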
Maybe for a beginner such documentation would not be a great start, but for a more experienced developer it could be very useful. My personal problem with most auto-generated documentation is that I find it difficult to read and to find the functions that I need. However, it would be possible to change the layout to make it a bit more readable, since we will host it on our website. And since we mostly use C and Python in our repos, the most logical tools would be Doxygen and Sphinx. There are probably other possibilities out there, but if we want to integrate this into our framework, we would like to go with tools that are future-proof.
The whole picture
The problem with autodoc is that it mostly shows the nitty-gritty details of a library or firmware, and users tend to get lost and cannot see the whole picture. Also, we maintain a lot of libraries and firmwares (as you can see here in this list), depending on which hardware they apply to. This means that we have separate documentation pages for almost all of them.
And then comes the decision of where to place information. For instance, the CRTP (Crazy RealTime Protocol) is documented in the crazyflie-firmware documentation, since that is indeed where it is implemented, but CRTP does not only affect the Crazyflie firmware. It goes from the STM32F4 to the nRF51 to the Crazyradio, through USB on your computer, to the cflib, which is the backbone of the CFclient. This is a topic that users would like an overview of if they want to develop something with CRTP.
Step-by-step guides are maybe still too detailed to explain the whole picture, so perhaps we should present this overview in some other way. Maybe in an online lecture or a more lesson-like medium?
There are a lot of resources out there (like Write the Docs) with tips on how to maintain information sources for users, but of course we need a documentation structure that is useful and readable for many types of users, and maintainable from our side. Let us know if you want to share any insight from your own experiences!
It has been about a month since the AI-deck became available in Early Access, and quite a few of you now own an AI-deck yourselves. There is a new development we would like to share: we previously thought that we had selected a gray-scale image sensor. However, it came to our attention that the camera actually contains a color image sensor, which, on second viewing of the video presented in this blogpost, is pretty obvious in hindsight (thanks PULP project ETH Zurich for letting us know!).
This came as a little surprise, but a color camera also adds some new possibilities, like making the Crazyflie follow an orange ball, or training CNNs to incorporate color in their classification as well. The only thing is that it requires an extra preprocessing task in order to retrieve the color image, which is explained in the next section.
Demosaicing
Essentially all CMOS image sensors are gray-scale by definition. In order to retrieve color from a scene, manufacturers add a Bayer filter on top of the image sensor, so that each pixel only receives red, green or blue light. A color filter array does not need to be RGB, it can use all kinds of colors, but here we will only talk about the Bayer filter. If the pattern of the filter is known, the pixels related to a certain color are interpolated with each other in order to fill in the gaps in between. This process is called demosaicing, and it creates the RGB channels that are converted into a color image.
Currently we have only implemented a simple nearest-neighbor interpolation scheme for demosaicing, which is fine for demonstration purposes but is not the best technique out there. Such a simple interpolation is not very 'edge and detail' aware and can therefore cause artifacts, like Moiré effects. Anyway, we are still experimenting with how to get a better image and how to translate that to all the examples in the AI-deck example repository (see this issue if you would like to follow or take part in the discussion).
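To give an idea of how simple such a scheme can be, here is a NumPy sketch of a half-resolution variant of the idea, assuming a BGGR Bayer pattern and even image dimensions (check the sensor datasheet for the actual pattern):

```python
import numpy as np

def demosaic_simple(raw):
    """raw: 2-D uint8 array straight off the sensor (BGGR assumed)."""
    rgb = np.zeros((raw.shape[0] // 2, raw.shape[1] // 2, 3), dtype=np.uint8)
    # Each 2x2 Bayer cell becomes one RGB pixel. Nothing is interpolated
    # across cells, which is exactly the kind of shortcut that produces
    # the Moire artifacts mentioned above.
    rgb[:, :, 2] = raw[0::2, 0::2]                        # blue
    green = (raw[0::2, 1::2].astype(np.uint16) +
             raw[1::2, 0::2]) // 2                        # average both greens
    rgb[:, :, 1] = green.astype(np.uint8)
    rgb[:, :, 0] = raw[1::2, 1::2]                        # red
    return rgb
```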
Technically, once we have the color image, it can be converted to a gray-scale image, which can be used for the examples as is. However, there is a reduction in quality, since the full pixel resolution was not used to obtain the image. We are currently discussing whether it would be useful to get the gray-scale version of this camera and make that available as well, so let us know if you would be interested!
Feedback and Early Access
Like we said before, there are now quite a few of you out there that have an AI-deck in your possession. As it is in Early Access, the software part is still in full development. However, since we have not received any negative feedback from you, we believe that everything is fine and peachy!
Just kidding ;) we know that the AI-deck is quite a challenging deck to work with and we know for sure that many of you probably have questions or have something to say about working with it. Buying an Early Access product also comes with a little bit of responsibility. The more feedback we get from you guys, the more we can tailor the software and support to help you and others, thereby advancing the product forward and getting it out of the early access phase.
So please let us know if you are having any trouble starting up, by posting a thread on the forum (we have a special AI-deck group!), or if there are any issues with the examples or the documentation of the AI-deck repo. We, and also our collaborators at Greenwaves Technologies (makers of the GAP8 chip), are more than happy to help out. That is what we are here for :)
The Crazyflie 2.1 was the perfect robotics platform for an introduction to autonomous robotics at the University of Washington winter quarter 2020. Our Bio-inspired Robotics graduate course completed a series of Crazyflie projects throughout the 10 weeks that built our skills in:
Python
Robot Operating System (ROS)
assembling custom sensors
writing new drivers
designing and testing control algorithms
troubleshooting and independent learning
The course was offered by UW Mechanical Engineering’s Autonomous Insect Robotics Laboratory, headed by Dr. Sawyer B. Fuller. The course was supported by PhD candidate Melanie Anderson, who has done fantastic research with her Crazyflie-based Smellicopter. The final project was an opportunity to turn a Crazyflie quadcopter into a bio-inspired autonomous robot. Our three person team of UW robotics grad students included Nishant Elkunchwar, Krishna Balasubramanian, and Jessica Noe.
Light Seeking Run-and-Tumble Algorithm Inspired by Bacterial Chemotaxis
The goal for our team’s Crazyflie was to seek and identify a light source. We chose a run-and-tumble algorithm inspired by bacterial chemotaxis. For a quick explanation of bacterial chemotaxis, please see Andrea Schmidt’s explanation of chemotaxis on Dr. Mehran Kardar’s MIT teaching page. She provides a helpful animation here.
In both bacterial chemotaxis and our run-and-tumble algorithm, there is a body (the bacteria or the robot) that can:
move under its own power.
detect the magnitude of something in the environment (e.g. a chemical put off by a food source, or light intensity).
determine whether the magnitude is greater or less than it was a short time before.
This method works best if the environment contains a strong gradient from low concentration to high concentration that the bacteria or robot can follow towards a high concentration source.
The details of the run-and-tumble algorithm are shown in a finite state machine diagram below. The simple summary is that the Crazyflie takes off, begins moving forward, and if the light intensity is getting larger it continues to “Run” in the same direction. If the light intensity is getting smaller, it will “Tumble” to a random direction. Additional layers of decision making are included to determine if the Crazyflie must “Avoid Obstacle”, or if the source has been reached and the Crazyflie quadcopter should “Stop”.
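In code, the decision at each loop iteration can be reduced to a small function like this sketch (the thresholds are placeholders, not our tuned values):

```python
import random

OBSTACLE_THRESHOLD_M = 0.5  # trigger distance for "Avoid Obstacle"
STOP_INTENSITY = 50000.0    # placeholder lux threshold near the source

def next_action(light_now, light_prev, min_range_m):
    """Return the next state of the run-and-tumble state machine."""
    if light_now > STOP_INTENSITY:
        return 'Stop', None
    if min_range_m < OBSTACLE_THRESHOLD_M:
        return 'Avoid Obstacle', None
    if light_now >= light_prev:
        return 'Run', None  # intensity rising: keep the current heading
    return 'Tumble', random.uniform(0.0, 360.0)  # pick a random new heading
```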
Crazyflie Hardware
To implement the run-and-tumble algorithm autonomously on the Crazyflie, we needed a Crazyflie quadcopter and these additional sensors:
Bitcraze Flow deck (the "Optic Flow deck" referred to below)
Bitcraze Multi-ranger deck
Bitcraze Prototype deck with a BH-1750 light intensity sensor
The Optic Flow deck was a key sensor in achieving autonomous flight. This sensor package determines the Crazyflie’s height above the surface and tracks its horizontal motion from the starting position along the x-direction and y-direction coordinates. With the Optic Flow installed, the Crazyflie is capable of autonomously maintaining a constant height above the surface. It can also move forward, back, left, and right a set distance or at a set speed. Several other pre-programmed movement behaviors can also be chosen. This Bitcraze blog post has more information on how the Flow deck works and this post by Chuan-en Lin on Nanonets.com provides more in-depth information if you would like to read more.
The Bitcraze Multi-ranger deck provided the sensor data for obstacle avoidance. The Multi-ranger detects the distance from the Crazyflie to the nearest object in five directions: forward, backward, right, left, and above. Our threshold to trigger the “Avoid Obstacle” behavior is detecting an obstacle within 0.5 meters of the Crazyflie quadcopter.
The Prototype deck was a quick, simple way to connect the BH-1750 light intensity sensor to the pins of the Crazyflie to physically integrate the sensor with the quadcopter hardware. This diagram shows how the header positions connect to the rows of pads in the center of the deck. We soldered a header into the center of the deck, then soldered connections between the pads to form continuous connections from our header pin to the correct Crazyflie header pin on the left or right edges of the Prototype deck. The Bitcraze Wiki provides a pin map for the Crazyflie quadcopter and information about the power supply pins. A nice overview of the BH-1750 sensor is found on Components101.com; it shows the pin map and the 4.7 kOhm pull-up resistor that needs to be placed on the I2C line.
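For reference, the BH-1750's basic I2C conversation is short enough to sketch in a few lines; here it is in Python with smbus2, as you might test it from a Raspberry Pi (the actual Crazyflie driver is C firmware; the 1.2 divisor comes from the BH-1750 datasheet):

```python
import time
from smbus2 import SMBus, i2c_msg

BH1750_ADDR = 0x23         # default address with the ADDR pin low
CONT_HIGH_RES_MODE = 0x10  # continuous high-resolution measurement opcode

with SMBus(1) as bus:
    bus.i2c_rdwr(i2c_msg.write(BH1750_ADDR, [CONT_HIGH_RES_MODE]))
    time.sleep(0.18)  # wait for the first conversion to finish
    read = i2c_msg.read(BH1750_ADDR, 2)
    bus.i2c_rdwr(read)
    msb, lsb = list(read)
    lux = ((msb << 8) | lsb) / 1.2  # datasheet conversion to lux
    print(f'{lux:.1f} lx')
```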
It was easy to connect the decks to the Crazyflie because Bitcraze clearly marks “Front”, “Up” and “Down” to help you orient each deck relative to the Crazyflie. See the Bitcraze documentation on expansion decks for more details. Once the decks are properly attached, the Crazyflie can automatically detect that the Flow and Multi-Ranger decks are installed, and all of the built-in functions related to these decks are immediately available for use without reflashing the Crazyflie with updated firmware. (We appreciated this awesome feature!)
Crazyflie Firmware and ROS Control Software
Bitcraze provides a downloadable virtual machine (VM) to help users quickly start developing their own code for the Crazyflie. Our team used a VM that was modified by UW graduate students Melanie Anderson and Joseph Sullivan to make it easier to write ROS control code in the Python coding language to control one or more Crazyflie quadcopters. This was helpful to our team because we were all familiar with Python from previous work. The standard Bitcraze VM is available on Bitcraze’s Github page. The Modified VM constructed by Joseph and Melanie is available through Melanie’s Github page. Available on Joseph’s Github page is the “rospy_crazyflie” code that can be combined with existing installs of ROS and Bitcraze’s Python API if users do not want to use the VM options.
“crazyflie-firmware” – a set of files written in C that can be uploaded to the Crazyflie quadcopter to overwrite the default firmware
In the Bitcraze VM, this folder is located at “/home/bitcraze/projects/crazyflie-firmware”
In the Modified VM, this folder is located at “Home/crazyflie-firmware”
“crazyflie-lib-python” (in the Bitcraze VM) or “rospy_crazyflie” (in the Modified VM) – a set of ROS files that allows high-level control of the quadcopter’s actions
In the Bitcraze VM, “crazyflie-lib-python” is located at “/home/bitcraze/projects/crazyflie-lib-python”
In the Modified VM, navigate to “Home/catkin_ws/src” which contains two main sets of files:
“Home/catkin_ws/src/crazyflie-lib-python” – a copy of the Bitcraze “crazyflie-lib-python”
“Home/catkin_ws/src/rospy_crazyflie” – the modified version of “crazyflie-lib-python” that includes additional ROS and Python functionality, and example scripts created by Joseph and Melanie
In the Modified VM, we edited the “crazyflie-firmware” files to include code for our light intensity sensor, and we edited “rospy_crazyflie” to add functions to the ROS software that runs on the Crazyflie. Having the VM environment saved our team a huge amount of time and frustration – we did not have to download a basic virtual machine, then update software versions, find libraries, and track down fixes for incompatible software. We could just start writing new code for the Crazyflie.
The Modified VM for the Crazyflie takes advantage of the Robot Operating System (ROS) architecture. The example script provided within the Modified VM helped us quickly become familiar with basic ROS concepts like nodes, topics, message types, publishing, and subscribing. We were able to understand and write our own nodes that published information to different topics and write nodes that subscribed to the topics to receive and use the information to control the Crazyflie.
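A minimal rospy node of the kind we wrote looks roughly like this (the topic name, rate and the stubbed sensor read are placeholders for our actual project code):

```python
import rospy
from std_msgs.msg import Float32

def read_light_sensor():
    return 123.4  # stub: in our project this value came from the Crazyflie

def light_callback(msg):
    rospy.loginfo('light intensity: %.1f', msg.data)

rospy.init_node('light_publisher')
pub = rospy.Publisher('/light_intensity', Float32, queue_size=10)
rospy.Subscriber('/light_intensity', Float32, light_callback)

rate = rospy.Rate(10)  # publish at 10 Hz
while not rospy.is_shutdown():
    pub.publish(Float32(data=read_light_sensor()))
    rate.sleep()
```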
A major challenge of our project was writing a new driver that could be added to the Crazyflie firmware to tell the Crazyflie system that we had connected an additional sensor to the Crazyflie’s I2C bus. Our team referenced open-source Arduino drivers to understand how the BH-1750 connects to an Arduino I2C bus. We also looked at the open-source drivers written by Bitcraze for the Multi-ranger deck to see how it connects to the Crazyflie I2C bus. By looking at all of these open-source examples and studying how to use I2C communication protocols, our team member Nishant Elkunchwar was able to write a driver that allowed the Crazyflie to recognize the BH-1750 signal and convert it to a sensor value to be used within the Crazyflie’s ROS-based operating system. That driver is available on Nishant’s Github. The driver needed to be placed into the appropriate folder: “…\crazyflie-firmware\src\deck\drivers\src”.
The second change to the crazyflie-firmware is to add a “config.mk” file in the folder “…\crazyflie-firmware\tools\make”. Information about the “config.mk” file is available in the Bitcraze documentation on configuring the build.
The final change to the crazyflie-firmware is to update the make file “MakeFile” in the location “…\crazyflie-firmware”. The “MakeFile” changes include adding one line to the section “# Deck API” and two lines to the section “# Decks”. Information about compiling the MakeFile is available in the Bitcraze documentation about flashing the quadcopter.
Making additions to the ROS control architecture
The ROS control architecture includes messages. We needed to define 3 new types of messages for our new ROS control files. In the folder “…\catkin_ws\src\rospy_crazyflie\msg\msg” we added one file for each new message type. We also updated “CMakeLists.txt” to add the name of our message files in the section “add_message_files( )”.
The second part of our ROS control was a set of scripts written in Python. These included our run-and-tumble algorithm control code, publisher scripts, and a plotter script. These are all available in the project’s Github.
Characterizing the Light Sensor
At this point, the light intensity sensor was successfully integrated into the Crazyflie quadcopter. The new code was written and the Crazyflie quadcopter was reflashed with the new firmware. We had completed our initial troubleshooting, and the next step was to characterize the light intensity in our experimental setup.
This characterization was done by flying the Crazyflie at a fixed distance above the floor in tightly spaced rows along the x and y horizontal directions. The resulting plot (below) shows that the light intensity increases exponentially as the Crazyflie moves towards the light source.
The light characterization allowed us to determine an intensity threshold that is only reached near the light source. If this threshold is met, the algorithm's “Stop” action is triggered, and the Crazyflie lands.
Testing the Run-and-Tumble Algorithm
With the light intensity characterization complete, we were able to test and revise our run-and-tumble algorithm. At each loop of the algorithm, one of the four actions is chosen: “Run”, “Tumble”, “Avoid Obstacle”, or “Stop”. The plot below shows a typical path with the action that was taken at each loop iteration.
Flight Tests of the Run-and-Tumble Algorithm
In final testing, we performed 4 trial runs with 100% success locating the light source. Our test area was approximately 100 square feet and included 1 light source and 2 obstacles. The average search time was 1 minute 41 seconds.
Lessons Learned
This was one of the best courses I’ve taken at the University of Washington. It was one of the first classes where a robot could be incorporated, and playing with the Crazyflie was pure fun. Another positive aspect was that the course had the feel of a boot camp for learning how to build, control, test, and improve autonomous robots. This was only possible because Bitcraze’s small, indoor quadcopter with optic flow capability made it possible to safely operate several quadcopters simultaneously in our small classroom as we learned.
This development project was really interesting (aka difficult…) and we went down a few rabbit holes as we tried to level up our knowledge and skills. Our prior experience with Python helped us read the custom example scripts provided in our course for the ROS control program, but we had quite a bit to learn about the ROS architecture before we could write our own control scripts.
Nishant made an extensive study of I2C protocols as he wrote the new driver for the BH-1750 sensor. One of the biggest lessons I learned in this project was that writing drivers to integrate a sensor to a microcontroller is hard. By contrast, using the Bitcraze decks was so easy it almost felt like cheating. (In the nicest way!)
On the hardware side, the one big problem we encountered during development was accidentally breaking the 0.5 mm headers on the Crazyflie quadcopter and the decks. The male headers were not long enough to extend from the Flow deck all the way up through the Prototype deck at the top, so we tried to solder extensions onto the pins. Unfortunately, I did not check the Bitcraze pin width and I just soldered on the pins we all had in our tool kits: the 0.1 inch (2.54 mm) wide pins that we use with our Arduinos and BeagleBones. These too-large pins damaged the female headers on the decks, and we lost connectivity on those pins. Fortunately, we were able to repair our decks by soldering on replacement female headers from the Bitcraze store. I wish now that the long pin headers were available back then.
In summary, this course was an inspiring experience and helped our team learn a lot in a very short time. After ten weeks working with the Crazyflie, I can strongly recommend the Crazyflie for robotics classes and boot camps.
It has been a few months since the Covid-19 crisis started, but it feels like almost a year ago that we all decided to stay and work from home. Considering the circumstances, we have managed to handle ourselves pretty well. We set up our home labs in our kitchens and/or living rooms and managed to do a lot of development. Even though this situation did not come easy, as you can see from our experiences here, we were able to pull ourselves through it in one piece. Now we also have to consider that Covid-19 is here to stay, and we need to deal with the complications until at least a vaccine is finished and distributed. Until then, we might have to think about alternatives for how we do things, including how we go to events and meet/talk to you all!
Every year we try to go to at least two conferences, with last year being a particularly busy year in which we went to three big events (ICRA 2019, IMAV 2019 and IROS 2019). Before going to those conferences, we usually try to crunch and make an awesome demo. This also enables us to add new features to the firmware or fix problems that we find during the crunch. Moreover, we really like to meet our users face-to-face, so that we can hear how you use the Crazyflie in your research or classroom!
Since going to conferences and in-person events will be difficult this year, and maybe the next, we have been thinking about events that we could organize as an alternative. We have a couple of options in mind on which we would like your opinion. For instance, we could do a remote tutorial or lecture, like we did here for EPFL. Or maybe we can organize an online seminar where we invite users to give a talk about their work (I personally took part in a VR seminar in Mozilla Hubs, which was pretty awesome). We can also consider inviting users to an online meetup to talk about the direction of the Crazyflie and its firmware. Another idea that we had recently is to organize an online Crazyflie competition, where users can control the Crazyflie remotely or upload custom firmware, so that it can fly autonomously through an obstacle field.
We set up a poll of these ideas, so we can know what you guys like best! Also please comment below if you have further ideas about this or start a thread on the forum!
The summer has reached Sweden, but this summer is not an ordinary one. Coming out of a pandemic spring, and not really knowing what the world is going to be like next, is not a situation we are used to. The Covid situation in Sweden is looking a lot better now, but we still have to be cautious and travel around as little as possible. That means we will have a “homecation”, or “hemester” as we say in Swedish.
We will recharge our batteries, get new inspiration and try to embrace the “homecation”. During this time the pace at the Bitcraze office will be a bit slower, but we will continue to ship products, answer emails, help you in the forum, etc. as usual all summer. The remaining time will be spent mainly on cleaning up, as we normally do during summer. This includes bug fixing, documentation improvements, finishing small things that never get done, etc. To summarize: improvements in general.
The major event of this release is the use of the Core Coupled Memory (CCM) in the Crazyflie. The CCM is a 64k RAM memory bank, and by moving memory blocks from the standard RAM to the CCM, we have freed up 64k of RAM! The 128k of RAM was almost full, so an extra 50% is good news.
One might ask why this has not been done earlier, and the answer is that the CCM has some special properties that have to be taken into account. It is RAM, just like the “normal” RAM, but it is connected to a different internal bus in the STM MCU. The most notable difference is that it cannot be used in DMA operations, which are commonly used when accessing sensors; if a pointer to CCM memory is passed to a sensor driver, things will go bad. To make it clear where memory is located, we have introduced a macro to be used when explicitly moving a memory area to the CCM; otherwise it will end up in normal RAM.
Hopefully the chosen design will have very little impact on the “normal” firmware programmer. We have moved a bunch of memory blocks to the CCM that are “safe”, and most programmers can happily forget about the CCM and just enjoy the new 64k of available RAM!
Battery temperature
In release 2020.02 we introduced a battery temperature check that stops charging the battery if it is too warm. Lithium batteries like to be charged within 0–45 degrees Celsius. To do this we used the temperature sensor within the nRF51822, which is mounted just under the battery. It however turned out that the temperature measurement is way too biased, and as a result charging stopped too early. So in this release we did more measurements and increased the allowed charging range.