
Santa is soon to be knocking on the door, hopefully with one or two exciting toys (with blinking LEDs) for us geeky people! There will not be a Christmas video as the Bitcraze gift this year; instead we’re wrapping up a new release that we hope will add to the Christmas fun!

We have been working on a secret project though and there might be a video for next week’s blog post showing what we have been up to…

The 2022.12 release

We are happy to announce that a new official release is out, 2022.12! We have mainly fixed bugs and stability issues but also added some new features, please see details below.

Crazyflie STM firmware (2022.12)

One of the main events in this release is that the Flapper Nimble+ has received official support through the flapper platform: it can now be flashed through the client like any other member of the Crazyflie family. A new controller, based on the work of Brescianini, has been added. The Kalman estimator and the Lighthouse system have been tweaked to work better with the increased data volumes generated by two or more base stations. Some improvements for brushless motors have been added. Finally, there have been general bug and stability fixes, including improvements to flashing of the AI-deck.

Please see the release notes for a list of all changes.

Crazyflie NRF firmware (2022.12)

The NRF firmware release mainly contains changes to support the new STM firmware.

Please see the release notes for a list of all changes.

Crazyflie lib python (0.1.21)

A blocking method has been added for uploading trajectories to the high-level commander, so the various Uploader classes in the examples are no longer needed. There are also stability and bug fixes related to deck flashing.
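For illustration, here is a minimal sketch of the new flow with cflib. The blocking call is assumed to be write_data_sync() on the trajectory memory element; check the updated examples in crazyflie-lib-python for the exact API.

```python
import cflib.crtp
from cflib.crazyflie.mem import MemoryElement, Poly4D
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M/E7E7E7E7E7'

cflib.crtp.init_drivers()
with SyncCrazyflie(URI) as scf:
    cf = scf.cf

    # One 4 s polynomial piece holding position at (0, 0, 1) with yaw 0;
    # each axis takes 8 polynomial coefficients.
    piece = Poly4D(4.0,
                   Poly4D.Poly([0.0] * 8),
                   Poly4D.Poly([0.0] * 8),
                   Poly4D.Poly([1.0] + [0.0] * 7),
                   Poly4D.Poly([0.0] * 8))

    traj_mem = cf.mem.get_mems(MemoryElement.TYPE_TRAJ)[0]
    traj_mem.trajectory = [piece]
    traj_mem.write_data_sync()  # blocks until the upload completes (assumed name)

    cf.high_level_commander.define_trajectory(1, 0, 1)  # id, offset, n_pieces
    cf.high_level_commander.start_trajectory(1, time_scale=1.0)
```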

Please see the release notes for a list of all changes.

Crazyflie python Client (2022.12)

A button has been added in the console log tab to get statistics about persistent storage in the Crazyflie. The final traces of Windows and Mac builds have been cleaned out and some stability and bug fixes have been applied.

Please see the release notes for a list of all changes.

Announcement: We will have a townhall meeting this Wednesday (the 7th of December) about Crazyradio 2.0 and the ideas about the new com-stack, at 15:00 (3 pm) CET. Please follow the discussion here for more info.

As you are probably well aware if you read the blog occasionally, we went to Japan with the entire company to attend the International Conference on Intelligent Robots and Systems (IROS) in Kyoto. Besides eating great food, singing karaoke, and herding our fully onboard autonomous swarm at our stand, we also had some time to check out what kind of work was being done with the Crazyflie in the papers and talks of the proceedings!

First, some general statistics:

  • IROS had 1716 papers accepted
  • We found 14 Crazyflie papers/posters and 2 workshop papers
  • The three biggest topics we found the papers in were: SLAM, multi-robot systems, and navigation & motion planning

At ICRA this year, we noticed that the Crazyflie and Bolt were used to build unconventional platforms, like a mono-copter or a Crazyflie transformed into a pogo stick. It was interesting to see that now at IROS, the focus seemed to be more on navigation, localization and even SLAM… also with unconventional sensors!

Navigation and SLAM with the Crazyflie

This summer I (Kim) worked on a project using ROS2 to try SLAM with the standard packages, the Flow deck and the Multi-ranger. I also presented this work at ROSCon, to show that with the Crazyswarm2 project the Crazyflie can be used as an actual robotic platform too! I’m glad that some researchers have already figured this out, as there were quite a few papers on SLAM! [6] and [12] made use of the Flow deck and Multi-ranger, but with their own custom algorithms for SLAM and mapping, more tailored to the task than the standard SLAM packages out there meant for 360-degree lidars.

Very interestingly, there were several papers that use unconventional sensors for this as well. [5] used a gas sensor to do both gas source localization and gas distribution mapping, and [10] made their own echolocation deck with a buzzer and microphones. Let’s see what other sensors will be explored in the future!

Safe Robot Learning Competition

A special mention goes to the Safe Robot Learning competition, organized jointly by TU Munich and the University of Toronto’s Learning Systems & Robotics Lab (formerly known as the Dynamic Systems Lab). In this competition, teams first had to finish an obstacle course in an online simulation phase. Those that were successful moved on to the finals, flown with a real Crazyflie at a remote testbed at the University of Toronto, where the algorithms were put to the ultimate test! The simulation was done in the safe-control-gym framework [14], and the communication with the real Crazyflie was done with the ROS1-based Crazyswarm. We sponsored the first three places with a couple of Crazyflie bundles, so congrats to the winners!

List of IROS 2022 Papers featuring the Crazyflie

  1. Using Simulation Optimization to Improve Zero-shot Policy Transfer of Quadrotors Sven Gronauer, Matthias Kissel, Luca Sacchetto, Mathias Korte and Klaus Diepold
  2. Downwash-aware Control Allocation for Over-actuated UAV Platforms Yao Su, Chi Chu, Meng Wang, Jiarui Li, Liu Yang, Yixin Zhu, Hangxin Liu
    • Beijing Institute for General Artificial Intelligence (BIGAI)
    • ArXiv
    • IEEE Xplore
  3. Towards Specialized Hardware for Learning-based Visual Odometry on the Edge Siyuan Chen and Ken Mai
    • IEEE Xplore
  4. Polynomial Time Near-Time-Optimal Multi-Robot Path Planning in Three Dimensions with Applications to Large-Scale UAV Coordination Teng Guo, Siwei Feng and Jingjin Yu
  5. GaSLAM: An Algorithm for Simultaneous Gas Source Localization and Gas Distribution Mapping in 3D Chiara Ercolani, Lixuan Tang and Alcherio Martinoli
    • Ecole Polytechnique Federale de Lausanne (EPFL)
    • IEEE Xplore
  6. Efficient 2D Graph SLAM for Sparse Sensing Hanzhi Zhou, Zichao Hu, Sihang Liu and Samira Khan
  7. Avoiding Dynamic Obstacles with Real-time Motion Planning using Quadratic Programming for Varied Locomotion Modes Jason White, David Jay, Tianze Wang, and Christian Hubicki
  8. Dynamic Compressed Sensing of Unsteady Flows with a Mobile Robot Sachin Shriwastav, Gregory Snyder and Zhuoyuan Song
  9. A Framework for Optimized Topology Design and Leader Selection in Affine Formation Control Fan Xiao, Qingkai Yang, Xinyue Zhao and Hao Fang
  10. Blind as a bat: audible echolocation on small robots Frederike Dumbgen, Adrien Hoffet, Mihailo Kolundzija, Adam Scholefield, Martin Vetterli
    • Ecole Polytechnique Federale de Lausanne (EPFL)
    • IEEE Xplore
  11. Safe Reinforcement Learning for Robot Control using Control Lyapunov Barrier Functions Desong Du, Shaohang Han, Naiming Qi and Wei Pan
    • Harbin Institute of Technology + TU Delft + University of Manchester
    • Late breaking result poster
  12. Parsing Indoor Manhattan Scenes Using Four-Point LiDAR on a Micro UAV Eunju Jeong, Suyoung Kang, Daekyeong Lee, and Pyojin Kim
    • Sookmyung Women’s University
    • Late breaking result poster
  13. Interactive Multi-Robot Aerial Cinematography Through Hemispherical Manifold Coverage Xiaotian Xu, Guangyao Shi, Pratap Tokekar, and Yancy Diaz-Mercado
    • University of Maryland
    • Note: Only mention of Crazyflie experiments in presentation
  14. Safe-control-gym: a Unified Benchmark Suite for Safe Learning-based Control and Reinforcement Learning in Robotics Zhaocong Yuan, Adam W. Hall, Siqi Zhou, Lukas Brunke, Melissa Greeff, Jacopo Panerati, Angela P. Schoellig
  15. Distributed Geometric and Optimization-based Control of Multiple Quadrotors for Cable-Suspended Payload Transport Khaled Wahba and Wolfgang Hoenig
  16. Customizable-ModQuad: a Versatile Hardware-Software Platform to Develop Lightweight and Low-cost Aerial Vehicles Diego S. D’Antonio, Jiawei Xu, Gustavo A. Cardona, and David Saldaña

Let us know if we are missing any papers, or any information about the papers listed! Once the IEEE Xplore IROS 2022 proceedings have been published, we will update these too and put them on our research page.

This week’s guest blogpost is from Rik Bouwmeester from the Micro Air Vehicle lab, Faculty of Aerospace Engineering at the Delft University of Technology.

Tiny quadcopters like the Crazyflie can be operated in narrow, cluttered environments and in proximity to humans, making them the perfect candidate for search-and-rescue operations, monitoring crops in a greenhouse, or performing inspections where other flying robots cannot reach. All these applications benefit from autonomy, allowing deployment without proximity to a base station or human operator and permitting swarming behavior.

Achieving autonomous navigation on nano quadcopters is challenging given the highly constrained payload and computational power of the platform. Most attention has been given to monocular solutions; the camera is a lightweight and energy-efficient passive sensor that captures rich information of the environment. One of the most important monocular visual cues is optical flow, which has been exploited on MAVs with higher payload for obstacle avoidance [1], depth estimation [2] and several bio-inspired methods for autonomous navigation [3–7].

Optical flow describes the apparent visual variations caused by relative motion between an observer and their surroundings. This rich visual cue contains entangled information about velocity and depth. However, calculating optical flow is expensive. The field of optical flow estimation is, and has been for a couple of years, dominated by convolutional neural networks (CNNs). Despite efforts to find architectures of reduced size and latency [8-10], these methods are still computationally expensive, running at several to tens of FPS on modern desktop GPUs and requiring millions of parameters, rendering them incompatible with edge hardware.

To this end, we present “NanoFlowNet: Real-Time Dense Optical Flow on a Nano Quadcopter”, submitted to an international robotics conference, which introduces NanoFlowNet, a CNN architecture designed for real-time, fully on-board, dense optical flow estimation on the AI-deck.

CNN architecture

We adopt the semantic segmentation CNN STDC-Seg [11] and modify it for optical flow estimation. While the resulting CNN architecture may be considered “real-time” on desktop hardware, for deployment on edge devices such as a nano quadcopter the network must be shrunk significantly. We improve the latency of the architecture in three ways.

First, we redesign the key convolutional module of the architecture, the Short-Term Dense Concatenate (STDC) module. By reordering the operations within the strided variant of the module, we save, depending on the location of the module within the architecture, from over 10% to over 50% of the MAC operations per module, while increasing the number of output filters with a large receptive field size. A large receptive field size is desirable for optical flow estimation.

Second, inspired by MobileNets [12], we globally replace ‘regular’ convolutions with depthwise separable convolutions. Depthwise separable convolutions factorize a convolution into a depthwise and a pointwise convolution, effectively reducing the computational expense at a cost in representational capacity.
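To make the difference concrete, here is a generic depthwise separable block in PyTorch (an illustration, not the actual NanoFlowNet code). For a 3×3 kernel, the parameter count drops from 9·C_in·C_out to 9·C_in + C_in·C_out.

```python
import torch.nn as nn

def separable_conv3x3(in_ch: int, out_ch: int, stride: int = 1) -> nn.Sequential:
    """A depthwise separable replacement for a regular 3x3 convolution."""
    return nn.Sequential(
        # Depthwise: one 3x3 filter per input channel (groups=in_ch),
        # spatial filtering only, no channel mixing.
        nn.Conv2d(in_ch, in_ch, 3, stride=stride, padding=1,
                  groups=in_ch, bias=False),
        nn.BatchNorm2d(in_ch),
        nn.ReLU(inplace=True),
        # Pointwise: 1x1 convolution that mixes the channels.
        nn.Conv2d(in_ch, out_ch, 1, bias=False),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )
```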

Third, we reduce the input dimensionality. We train and run inference on grayscale input images, reducing the required on-board memory for storing images by two thirds. Any memory saved in the AI-deck’s L2 memory can be handed to AutoTiler for storing the CNN architecture, speeding up the on-board execution. Since we need more of a speed-up, we run the CNN on board at a reduced input resolution of 160×112 pixels. Besides the speed-up through saved L2 memory, reducing the input resolution makes all operations throughout the network cheaper. We downscale the training data to closely match the target resolution. Both of these changes come at a loss of input information: we will miss small objects and small displacements that are not captured at this resolution.

To give some intuition of the available memory: Estimating optical flow requires two input images. Storing two color input images at full resolution requires (2 x 324x324x3=) 630 kB. The AI-deck has 512 kB of L2 memory available.
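The same back-of-the-envelope arithmetic shows what the grayscale and resolution changes buy (a sketch, assuming 1 byte per pixel per channel):

```python
# Two full-resolution color frames, as in the text above:
full_color = 2 * 324 * 324 * 3   # 629,856 bytes, ~630 kB (more than the 512 kB of L2)
# Two grayscale frames at the reduced 160x112 inference resolution:
small_gray = 2 * 160 * 112 * 1   # 35,840 bytes, ~36 kB
# The difference is L2 memory that AutoTiler can use for the CNN itself.
print(full_color, small_gray)
```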

Motion boundary detail guidance

Inspired by STDC-Seg, we guide the training of optical flow with a train-time-only auxiliary task to promote the encoding of spatial information in the early layers. Specifically, we introduce a motion boundary prediction task to the net. The motion boundary ground truth can be found in the optical flow datasets. This improves performance by 0.5 EPE on the MPI Sintel clean (train) benchmark, at zero cost to inference latency.
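In training-code terms, the guidance is just an extra weighted loss term on an auxiliary head that is dropped at inference time. A hedged sketch in PyTorch (the loss types and the weight are illustrative assumptions, not the paper’s exact choices):

```python
import torch.nn.functional as F

def training_loss(flow_pred, flow_gt, boundary_logits, boundary_gt, w=1.0):
    # Main task: supervised optical flow regression.
    flow_loss = F.l1_loss(flow_pred, flow_gt)
    # Train-time-only auxiliary task: per-pixel motion boundary prediction,
    # with ground truth derived from the optical flow dataset.
    boundary_loss = F.binary_cross_entropy_with_logits(boundary_logits,
                                                       boundary_gt)
    return flow_loss + w * boundary_loss
```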

Performance on MPI Sintel

Given the scaling and conversion to grayscale of the input data, our network is not directly comparable with results reported by other works. For comparison, we retrain one of the fastest networks in the literature, FlowNet2-s [13], on the same data. Given the reduction in resolution, we drop the deepest two layers to maintain a reasonable feature size. We name the resulting model FlowNet2-xs.

We benchmark the performance of the architecture on the MPI Sintel optical flow dataset. NanoFlowNet performs better than FlowNet2-xs, despite using less than 10% of the parameters. NanoFlowNet achieves 5.57 FPS on the AI-deck. FlowNet2-xs does not fit on the AI-deck due to its network size. To put the achieved latency of NanoFlowNet in perspective, we execute FlowNet2-xs’ first two convolutions and the final prediction layer on the GAP8. This three-layer architecture achieves 4.96 FPS, which is slower than running the entire NanoFlowNet. On a laptop GPU, the two architectures achieve similar latency.

| Method | MPI Sintel clean (train) [EPE] | MPI Sintel final (train) [EPE] | Frame rate, GPU [FPS] | Frame rate, GAP8 [FPS] | Parameters |
|---|---|---|---|---|---|
| FlowNet2-xs | 9.054 | 9.458 | 150 | – | 1,978,250 |
| NanoFlowNet | 7.122 | 7.979 | 141 | 5.57 | 170,881 |

Performance on MPI Sintel (train subset). (Average) End Point Error (EPE) describes how far off the estimated flow vectors are on average; lower is better.

Obstacle avoidance implementation

We demonstrate the effectiveness of NanoFlowNet by implementing it in a simple, proof-of-concept obstacle avoidance application on an AI-deck-equipped Crazyflie. We let the quadcopter fly forward at constant velocity and implement the horizontal balance strategy [14], [15], where the quadcopter balances the optical flow in the left and right half-planes by yawing.

We equip a Crazyflie with the Flow deck for positioning only. The total flight platform weighs 34 grams.

We augment the balance strategy by implementing active oscillations (a cyclic up-down movement), resulting in additional optical flow generated across the field of view. This is particularly helpful for avoiding obstacles in the direction of horizontal travel, since no optical flow is generated at the focus of expansion.
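Put together, the avoidance logic is compact. A conceptual sketch (not the authors’ implementation; the gains and sign conventions are made up):

```python
import math
import numpy as np

def avoidance_commands(flow, t, k_yaw=1.0, osc_amp=0.1, osc_freq=1.0):
    """Map a dense flow field to yaw-rate and vertical-velocity setpoints.

    flow: (H, W, 2) array of per-pixel flow vectors from NanoFlowNet.
    Balance strategy [14], [15]: yaw toward the half-plane with less
    optical flow, i.e. away from nearby texture.
    Active oscillation: cyclic up-down motion so that flow keeps being
    generated near the focus of expansion.
    """
    mid = flow.shape[1] // 2
    left = np.abs(flow[:, :mid]).sum()
    right = np.abs(flow[:, mid:]).sum()
    yaw_rate = k_yaw * (left - right)                       # sign convention assumed
    v_z = osc_amp * math.sin(2.0 * math.pi * osc_freq * t)  # active oscillation
    return yaw_rate, v_z
```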

The obstacle avoidance implementation is demonstrated in an open and in a cluttered environment in ‘the Cyber Zoo’, an indoor flight arena at the Faculty of Aerospace Engineering at the Delft University of Technology. The control algorithm is most robust in the open environment, with the quadcopter managing to drain a full battery without crashing. In the cluttered environment, performance is more variable. Especially when obstacles are close to one another, the quadcopter tends to avoid the first obstacle successfully, only to turn straight into the second and crash into it. Adding head-on collision detection based on FOE detection and divergence estimation (e.g., [7]) should help avoid obstacles in these cases.

Successful run in a cluttered environment in the Cyber Zoo. The Crazyflie manages to avoid collision until the battery is drained.

All in all, we consider the result a successful demonstration of the optical flow CNN. In future work, we expect to see applications that take more advantage of the resolution of the flow information.

Citation

Bouwmeester, R. J., Paredes-Vallés, F., De Croon, G. C. H. E. (2022). NanoFlowNet: Real-time Dense Optical Flow on a Nano Quadcopter. arXiv. https://doi.org/10.48550/arXiv.2209.06918

References

[1] Gao, P., Zhang, D., Fang, Q., & Jin, S. (2017). Obstacle avoidance for micro quadrotor based on optical flow. Proceedings of the 29th Chinese Control and Decision Conference, CCDC 2017, 4033–4037. https://doi.org/10.1109/CCDC.2017.7979206

[2] Sanket, N. J., Singh, C. D., Ganguly, K., Fermuller, C., & Aloimonos, Y. (2018). GapFlyt: Active vision based minimalist structure-less gap detection for quadrotor flight. IEEE Robotics and Automation Letters, 3(4), 2799–2806. https://doi.org/10.1109/LRA.2018.2843445

[3] Conroy, J., Gremillion, G., Ranganathan, B., & Humbert, J. S. (2009). Implementation of wide-field integration of optic flow for autonomous quadrotor navigation. Autonomous Robots, 27(3), 189–198. https://doi.org/10.1007/s10514-009-9140-0

[4] Zingg, S., Scaramuzza, D., Weiss, S., & Siegwart, R. (2010). MAV navigation through indoor corridors using optical flow. Proceedings – IEEE International Conference on Robotics and Automation, 3361–3368. https://doi.org/10.1109/ROBOT.2010.5509777

[5] De Croon, G. C. H. E. (2016). Monocular distance estimation with optical flow maneuvers and efference copies: A stability-based strategy. Bioinspiration and Biomimetics, 11(1). https://doi.org/10.1088/1748-3190/11/1/016004

[6] Serres, J. R., & Ruffier, F. (2017). Optic flow-based collision-free strategies: From insects to robots. Arthropod Structure and Development, 46(5), 703–717. https://doi.org/10.1016/j.asd.2017.06.003

[7] De Croon, G. C. H. E., De Wagter, C., & Seidl, T. (2021). Enhancing optical-flow-based control by learning visual appearance cues for flying robots. Nature Machine Intelligence, 3(1), 33–41. https://doi.org/10.1038/s42256-020-00279-7

[8] Ranjan, A., & Black, M. J. (2017). Optical flow estimation using a spatial pyramid network. Proceedings – 30th IEEE Conference on Computer Vision and Pattern Recognition, 2720–2729. https://doi.org/10.1109/CVPR.2017.291

[9] Hui, T. W., Tang, X., & Loy, C. C. (2018). LiteFlowNet: A Lightweight Convolutional Neural Network for Optical Flow Estimation. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 8981–8989. https://doi.org/10.1109/CVPR.2018.00936

[10] Sun, D., Yang, X., Liu, M. Y., & Kautz, J. (2017). PWC-Net: CNNs for Optical Flow Using Pyramid, Warping, and Cost Volume. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 8934–8943. https://doi.org/10.1109/CVPR.2018.00931

[11] Fan, M., Lai, S., Huang, J., Wei, X., Chai, Z., Luo, J., & Wei, X. (2021). Rethinking BiSeNet For Real-time Semantic Segmentation. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 9711–9720. https://doi.org/10.1109/CVPR46437.2021.00959

[12] Howard, A. G., Zhu, M., Chen, B., Kalenichenko, D., Wang, W., Weyand, T., Andreetto, M., & Adam, H. (2017). MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications. In arXiv. arXiv. http://arxiv.org/abs/1704.04861

[13] Ilg, E., Mayer, N., Saikia, T., Keuper, M., Dosovitskiy, A., & Brox, T. (2017). FlowNet 2.0: Evolution of optical flow estimation with deep networks. Proceedings – 30th IEEE Conference on Computer Vision and Pattern Recognition, 1647–1655. https://doi.org/10.1109/CVPR.2017.179

[14] Souhila, K., & Karim, A. (2007). Optical flow based robot obstacle avoidance. International Journal of Advanced Robotic Systems, 4(1), 2. https://doi.org/10.5772/5715

[15] Cho, G., Kim, J., & Oh, H. (2019). Vision-based obstacle avoidance strategies for MAVs using optical flows in 3-D textured environments. Sensors, 19(11), 2523. https://doi.org/10.3390/s19112523

I already talked about it here and there, but this day finally came: the whole company is in Japan!
Kimberly travelled first, to account for jetlag, meet with some people, and attend ROScon.

It was last week, and she got the opportunity to learn a lot, meet people from the ROS community, and give an exciting talk.

Kimberly’s talk at ROSCon (made by Ramón Roche)
Happy to be in Japan (made by Ramón Roche)

The rest of the company travelled last week with all the equipment needed divided into our suitcases.

Our suitcases at the office, to gather the materials before going

We chose to rent a traditional machiya while there, where we can all stay together and enjoy life in the center of Kyoto.

Us chilling out in the Bitcraze mansion

Our first day here was set aside to account for jetlag, but we managed to see some of the amazing sights of Kyoto – and enjoy the much-praised Japanese food, much appreciated after a long walk among the torii gates of the Fushimi Inari shrine.

Us after climbing to the top of Mt Inari – with the beautiful path of torii gates

But it was soon time to start working, and yesterday we worked really hard on setting up everything to have a nice demo at IROS.

After some head-scratching, emergency taping and hacking, we managed to get the autonomous demo that Marios implemented last summer flying – just before the event hall closed. We also got time to explore the Kyoto International Conference Center, a beautiful venue with a Japanese garden and a futuristic look – as imagined in the ’70s.

Some views from the Kyoto Conference Center

We have invited those of you attending IROS to come and see us for a tech meet-up. It’s today, and it would be a really nice opportunity for us to finally chat in person with our users! Since there are a lot of aerial systems talks, we realize it may be difficult to come during the sessions, so the tech meet-up will begin during the break, at 15:40.

Next up this week is the safe nanocopter competition. Kimberly will actually deliver the prize for it; we can’t wait to see what this competition will show – and how much fun it is to remote-control the Crazyflies that are at the University of Toronto Institute for Aerospace Studies!

Of course, we will share some news on social media – and we will have a blogpost in a few weeks to debrief on the whole trip.

As you’ll understand, maintaining the day-to-day of the company is a little trickier this week, but we still monitor email and GitHub discussions, and we are shipping orders. You should just expect a longer time to process those, as we’re quite busy – either at the booth or… at karaoke! (No, there will be no videos of us singing.)

This fall is full of exciting events for us, and none is more eagerly anticipated than our visit to Japan. Yes, the whole company (6 people) is travelling to Kyoto for at least a week – but not for sightseeing (well, not only). Here is what we have planned:

ROSCon

As per tradition, ROSCon is held shortly before IROS. So, from the 19th to the 21st of October, Kimberly will be there to represent us among the ROS community. She will even give a presentation about the latest ROS2 integrations, in collaboration with the maintainers of Crazyswarm2. It’s on October 21st at 16.50 local time, so if you’re there make sure to hear her talk!

IROS

From the 21st to the 27th of October, IROS will be held at the Kyoto International Conference Center. It’s one of the largest robotics conferences worldwide, with almost 1750 papers presented. As this is the first in-person edition of the conference since the beginning of the Covid pandemic, we had to be there. We will man the booth during the whole conference, with the demo our intern Marios has worked on a lot. And since it’s been a long time since we’ve been able to gather and talk together, we thought it would be great to have an official meetup at IROS for those interested.

So, please note this official invitation to Bitcraze’s tech meetup at IROS! If you’re at IROS and want to meet us together with other Crazyflie users, let’s get together on Monday the 24th of October at 16.00 at our booth, number 59. It’s the perfect occasion to (re)connect, get the latest news about Bitcraze, talk about development, share what you’ve been doing and even possibly hack together! Be sure to say hi if you’re there. We will try to make it something similar to a Swedish fika, with some sweets and coffee, but we can’t promise that there will be kanelbullar.

IROS Safe Robot Learning Competition

And this year, we’re happy to announce that there will be a Crazyflie competition during IROS. The goal is to develop safe learning-based algorithms that can cope with uncertainties not known at design time. Our friends at the Dynamic Systems Lab are organizing this competition, with two simulation phases and one experimental phase at IROS… The experimental phase gives remote access to the Flight Arena at the University of Toronto Institute for Aerospace Studies in Toronto, Canada, via a high-speed internet connection. You don’t need to be present at IROS to participate, but if you wish to do so, beware: registration for the competition ends on October 12th. We’re really curious and excited to see what this competition is going to show!

What about Bitcraze during that week?

But if everybody is in Japan, what about Bitcraze’s regular activities, you may be wondering? Well, no worries. Even though we’re going to be half a world away, the business is going to follow us. Of course, some of us are going to take the opportunity to take some vacation and visit this beautiful country, so during IROS week and the week after, the company will run a little more slowly than usual. We won’t be as responsive as usual on emails and discussions, but we will still monitor our inbox and ship orders.

Are you planning to visit IROS or ROSCon? Is there anything in particular in the schedule that you don’t want to miss? Don’t hesitate to tell us if you want to join the meetup!

After a period of Bitcrazer vacations, we are now all back at work. The summer here in Sweden has generally been great. Some of us stayed here to keep the company afloat, and some just stayed afloat on lakes or the sea. The majority vacationed inside Sweden, but some (could you guess who?) visited France, Italy, or Greece. We’ve been lucky with mostly warm and sunny weather, perfect for bathing and grilling. And even though it’s nice to enjoy a real summer, it’s still a worrying sign, as Europe is experiencing what could be the worst drought in 500 years.

Crazyflie 2.1 back in stock

What is also back is the Crazyflie 2.1 – back in stock, yay! After almost two weeks without any drones available for sale, we received a new batch of our quadcopters today. They should now be available in the shop, just in time for the start of school!

We have some indications that the component shortage is slowly moving in the right direction, so hopefully it will get easier to keep things in stock in the future. We are keeping our fingers crossed.

Bolt 1.1 ESC cable red/black switched

Unfortunately, we recently found out that there has been a manufacturing error with the ESC cables that come with the Bolt 1.1: the black and red cables have been switched. Please see the image below.

With the black and red cables switched, you would be powering your ESCs with reversed polarity. This will most likely burn the MOSFET on the Bolt that controls the power to the ESC, which is the weakest link: the MOSFET body diodes on the ESC will conduct and turn the whole ESC into a short circuit. In many setups, e.g. those using a 4-in-1 ESC, these cables are not used and will not cause a problem.

Switching the cables back is quite easy to do. Use a needle, tweezers or a small screwdriver to open the plastic lock so the cable can be pulled out. Swap the black and red cables and you are done. You can double-check that the colors are correct by comparing them with the Bolt 1.1 board; the plus and minus should match the red and black as per the image below:

We are currently working with the manufacturer to get correct cables. If you bought a Bolt 1.1 (anytime between June and August 2022), we can of course ship you correct cables once they are ready, or give you support if you have problems with the control board. If so, please send us an email at support@bitcraze.io. Sorry for the inconvenience!

Now it is time to give a little update about the ongoing ROS2-related projects. About a month ago we gave you a heads-up about the summer ROS2 project I was working on, and even though the end goal hasn’t been reached yet, enough has happened in the meantime to write a blogpost about it!

Crazyflie Navigation

Last time we mostly showed mapping of a single room, so currently I’m trying to map a bigger portion of the office. This turned out to be more difficult than initially anticipated: it worked quite well in simulation, but in real life the Multi-ranger deck saw obstacles that weren’t there. We later found out that this was due to a year-old issue with the Multi-ranger driver’s inability to handle out-of-range measurements properly (see this ongoing PR). With that, larger-scale mapping starts to become possible, which you can see here with the simple mapper node:

If you watch the video until the end, you can notice that the map starts to diverge a bit, since the position and orientation are solely based on the Flow deck and the gyroscopes, which is a big reason to get the SLAM toolbox to work with the Multi-ranger. However, it is difficult to combine it with such a sparse ‘lidar’, so while that still requires some tuning, I’ve taken this opportunity to see how far I can get with the non-SLAM mapping and the NAV2 package!
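The driver fix mentioned above is conceptually simple: an out-of-range reading means ‘nothing seen’, not ‘obstacle at the sensor limit’. A hedged sketch of the conversion (REP 117 uses +inf for ‘no return within range’; the range limit here is an assumed value):

```python
import math

RANGE_MAX = 4.0  # meters; assumed usable Multi-ranger limit for this sketch

def to_scan_range(reading_m: float) -> float:
    """Convert one Multi-ranger reading to a LaserScan range value.

    Readings at or beyond the sensor limit must not show up as a fake
    obstacle ring around the Crazyflie; per REP 117, +inf means
    'no obstacle detected within range'.
    """
    if reading_m >= RANGE_MAX:
        return math.inf
    return reading_m
```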

As you can see in the video, the Crazyflie made it to the second hallway. It was then commanded to fly back based on a NAV2 waypoint in RViz2. In the beginning it seemed to do quite well, but around the door of the last room the Crazyflie got into a bit of trouble. The doorway entrance is already as small as it is, and that is also when the mapping started to diverge: the new map covered the old map, blocking the original pathway back into the room. But still, it came pretty close!

The diverging map is currently the blocker for larger office navigation, so it would be nice to get better localization to work so that the map is not constantly changed by diverging position estimates, but I’m pretty hopeful I’ll be able to figure that out in the next few weeks.

Crazyflie ROS2 node with CrazySwarm2

Based on the poll we put out in the last blogpost, it seemed that many of you were positive about work towards a ROS2 node for the Crazyflie! As some of you know, the Crazyswarm project, which many of you already use for your research, is currently being ported to ROS2 through the efforts of Wolfgang Hönig’s IMRCLab with the Crazyswarm2 project. Instead of creating separate ROS2 nodes in parallel and just adding to the confusion in the community, we have decided together with Wolfgang to place all ROS2-related development in Crazyswarm2. The name of the project stays the same for historical reasons, but since this is meant to be the standard Crazyflie ROS2 package, the names of the individual nodes will be more generic upon official release in the future.

To this end, we’ve pushed a cflib-based Python version of the Crazyflie ROS2 node, called crazyflie_server_py, partly based on my hackish efforts in the crazyflie_ros2_experimental version, so that users will have a choice of which communication backend to use for the Crazyflie. For now, the node simply creates services for each individual Crazyflie and for the entire swarm for the take_off, land and go_to commands. Next up are logging and parameter handling, positioning support and a broadcasting implementation for the cflib, so please keep an eye on this ticket to follow the progress.
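As a taste of what using the node could look like, here is a hedged rclpy sketch of calling the take_off service. The service name, type and request fields follow the Crazyswarm conventions, but may well differ in the released Crazyswarm2 node:

```python
import rclpy
from rclpy.node import Node
from builtin_interfaces.msg import Duration
# Assumed interface package and type; check the Crazyswarm2 repo for the real ones.
from crazyflie_interfaces.srv import Takeoff

rclpy.init()
node = Node('takeoff_client')
client = node.create_client(Takeoff, '/cf1/takeoff')  # per-Crazyflie service (assumed name)
client.wait_for_service()

request = Takeoff.Request()
request.height = 0.5                # target height in meters (assumed field)
request.duration = Duration(sec=2)  # time to reach it (assumed field)

future = client.call_async(request)
rclpy.spin_until_future_complete(node, future)
node.destroy_node()
rclpy.shutdown()
```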

So hopefully, once the summer project has been completed, I can start porting the navigation capabilities into the Crazyswarm2 repository, with a nice tutorial :)

ROSCon talk

As mentioned in a previous blogpost, we’ll be talking about the Crazyflie ROS2 efforts at ROSCon 2022 in Kyoto, in collaboration with Wolfgang. You can find the talk here in the ROSCon program, so hopefully I’ll see you at the talk, or the week after at IROS!

This week we have a guest blog post from Jiawei Xu and David Saldaña from the Swarmslab at Lehigh University. Enjoy!

Limits of flying vehicles

Advancements in technology have made quadrotor drones more accessible and easier to integrate into a wide variety of applications. Compared to traditional fixed-wing aircraft, quadrotors are more flexible to design and more suitable for certain motions, such as static hovering. Some examples of quadrotor applications include photographers mounting cameras on them to take bird’s-eye-view images, and delivery companies using them to deliver packages. However, while being more versatile than other aerial platforms, quadrotors are still limited in their capabilities due to many factors.

First, quadrotors are limited by their lift capacity, i.e., strength. For example, a Crazyflie 2.1 is able to fly and carry a light payload such as an AI deck, but it is unable to carry a GoPro camera. A lifter quadrotor equipped with more powerful components can transport heavier payloads, but it also consumes more energy and requires additional free space to operate. The difference in the strength of individual quadrotors creates a dilemma in choosing which drone components are better suited for a task.

Second, a traditional quadrotor’s translational motion is coupled with its roll and pitch. Let’s take a closer look at the Crazyflie 2.1, which uses a traditional quadrotor design. Its four motors are oriented in the same direction – along the positive z-axis of the drone frame – which makes it impossible to move horizontally without tilting. Control policies that convert the desired motion direction into tilting angles are well studied, proven to work, and implemented on a variety of platforms [1][2]. Still, if we were to stack a glass filled with milk on top of a quadrotor and send it from the kitchen to the bedroom, we should expect milk stains on the floor. This lack of independent control of rotation and translation is another primary reason why multi-rotor drones lack versatility.

Fig 1. A Crazyflie has four propellers generating thrust forces in parallel. Credit: https://robots.ros.org/crazyflie/

Improving strength

These versatility problems are caused by the hardware of a multi-rotor drone being designed specifically for a certain set of tasks. If we push the boundary of these preset tasks, the requirements on the strength and controllability of the multi-rotor drone eventually become impossible to satisfy. However, there is one inspiration we take from nature to improve the versatility in strength of multi-rotor drones – modularity! Individually, ants are weak insects that are not versatile enough to deal with complex tasks. However, when a group of ants needs to cross a natural boundary, they swarm together to build capable structures like bridges and boats. In our previous work, ModQuad [3], we created modules that can fly by themselves and lift light payloads. As more ModQuad modules assemble into larger structures, they can provide an increasing amount of lift force. The system shows that we can combine weak modules to scale up the structure’s carrying capacity. To carry a small payload like a pinhole camera, a single module is able to accomplish the task. If we want to lift a heavier object, we only need to assemble as many modules as the required lift calls for.

Improving controllability

On a traditional quadrotor, each propeller is oriented vertically. This means the device is unable to generate force in the horizontal direction. Attaching modules side by side in a ModQuad structure only aligns more rotors in parallel, which still does not contribute to the horizontal force the structure can generate. That is how we came up with the idea of H-ModQuad – we would like a versatile multi-rotor drone that is able to move in an arbitrary direction at an arbitrary attitude. By tilting the rotors of quadrotor modules and docking different types of modules together, we obtain a structure whose rotors do not all point in the same direction, some of which are able to generate a force along the horizontal direction.

H-ModQuad Design

H-ModQuad has two major characteristics, modularity and heterogeneity, indicated by the “Mod” and “H-” in the name. Modularity means that the vehicle (which we call a structure) is composed of multiple smaller modules that are able to fly by themselves. Heterogeneity means that we can have modules of different types in a structure.

As mentioned before, insects like ants utilize modularity to enhance the group’s versatility. Aside from the large number of individuals in a swarm, which lets the group adapt to tasks of different scales, the individuals in a colony specialize in different tasks and come in different types, such as the queen, the female workers, and the males. This differentiation of types in a hive helps the group adapt to tasks with different physical properties. We take this inspiration to develop two types of modules.

In our related papers [4][5], we introduced two types of modules: R-modules and T-modules.

Fig 2. Major components of the H-ModQuad “T-module” we are using in our project. We use the Bitcraze Crazyflie Bolt as the central control board.

An example T-module is shown in the figure above. As shown in the image, the rotors in a T-module are tilted around the arms connecting them to the central board. Each pair of diagonal rotors is tilted in opposite directions, and each pair of adjacent rotors tilts either in the same or in opposite directions. We arrange the tilting of the rotors so that all the propellers generate the same thrust force, making the structure torque-balanced. The advantage of the T-module is that it allows the generation of more torque around the vertical axis. A single module can also generate forces in all horizontal directions.

An R-module has all its propellers oriented in the same direction, which is not along the z-axis of the module. In this configuration, when assembling multiple modules together, rotors from different modules point in different directions in the overall structure. The picture below shows a fully-actuated structure composed of R-modules. The advantage of R-modules is that the rotor thrusts within a module all point in the same direction, which is more efficient when hovering.

Structure 1: Composed of four types of R-modules.

Depending on which types of modules we choose and how we arrange them, the assembled structure can obtain different actuation capabilities. Structure 1 is composed of four R-modules and is able to translate in horizontal directions efficiently without tilting. The picture in the intro shows a structure composed of four T-modules of two types; it can hover while maintaining a tilt angle of up to 40 degrees.

Control and implementation

We implemented our new geometric controller for H-ModQuad structures based on the Crazyflie firmware, running on Crazyflie Bolt control boards. Specifically, aside from tuning the PID parameters, we had to change power_distribution.c and controller_mellinger.c so that the code conforms to the structure model. In addition, we created a new module that embeds the desired states along predefined trajectories in the firmware. When we send a timestamp for a selected trajectory, the module retrieves the full desired state and sends it to the Mellinger controller to process. All the modifications we made to the firmware so that the drone works the way we want can be found in our GitHub repository. We also recommend using the modified crazyflie_ros to establish communication between the base station and the drone.
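Conceptually, the change to the power distribution replaces the standard quadrotor mixing with a structure-specific allocation. A generic Python sketch of the idea (illustrative only; the actual implementation lives in the C firmware linked above):

```python
import numpy as np

def allocate_thrusts(A: np.ndarray, wrench: np.ndarray) -> np.ndarray:
    """Solve A @ f = wrench for the rotor thrusts f, in a least-squares sense.

    A: 6 x n allocation matrix; column i holds the force (3) and torque (3)
       that unit thrust on rotor i produces in the structure frame. For a
       traditional quadrotor all force components point along +z; the tilted
       rotors of T- and R-modules add horizontal force components.
    wrench: desired [fx, fy, fz, tx, ty, tz] from the geometric controller.
    """
    f = np.linalg.pinv(A) @ wrench
    return np.clip(f, 0.0, None)  # rotors cannot pull
```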

Videos

Challenges and Conclusion

Unlike the original Crazyflie 2.x, the Bolt allows the use of brushless motors, which are much more powerful. We had to design a frame using carbon fiber rods and 3D-printed connecting parts so that the chassis is sturdy enough to hold the control board, the ESCs, and the motors. It took some time to find the sweet spot in the combination of motor model, propeller size, batteries, and so on. Communicating with four modules at the same time also caused some problems for us. The now-archived ROS library, crazyflie_ros, sometimes loses random packets when working with multiple Crazyflie drones, leading to stuttering behavior of the structure in flight. That is one of the reasons why we decided to migrate our code base to the new Crazyswarm library instead. The success of our design, implementation, and experiments with the H-ModQuads is proof that we are indeed able to use modularity to improve the versatility of multi-rotor flying vehicles. As a next step, we are planning to integrate tool modules into the H-ModQuads to show how we can further increase the versatility of the drones so that they can deal with real-world tasks.

References

[1] D. Mellinger and V. Kumar, “Minimum snap trajectory generation and control for quadrotors,” in 2011 IEEE International Conference on Robotics and Automation, 2011, pp. 2520–2525.

[2] T. Lee, M. Leok, and N. H. McClamroch, “Geometric tracking control of a quadrotor uav on se(3),” in 49th IEEE Conference on Decision and Control (CDC), 2010, pp. 5420–5425.

[3] D. Saldaña, B. Gabrich, G. Li, M. Yim and V. Kumar, “ModQuad: The Flying Modular Structure that Self-Assembles in Midair,” 2018 IEEE International Conference on Robotics and Automation (ICRA), 2018, pp. 691-698, doi: 10.1109/ICRA.2018.8461014.

[4] J. Xu, D. S. D’Antonio, and D. Saldaña, “Modular multi-rotors: From quadrotors to fully-actuated aerial vehicles,” arXiv preprint arXiv:2202.00788, 2022.

[5] J. Xu, D. S. D’Antonio and D. Saldaña, “H-ModQuad: Modular Multi-Rotors with 4, 5, and 6 Controllable DOF,” 2021 IEEE International Conference on Robotics and Automation (ICRA), 2021, pp. 190-196, doi: 10.1109/ICRA48506.2021.9561016.

We’re now in the middle of summer, and even though we’re not much affected by the heat here in Sweden, we’re still at a slower pace than usual, since a lot of us are not at the office. Sales, packing, support and general maintenance take up a lot of the time of those left at the office. We also usually use the summer to clear out lingering issues and focus on some projects that we can tackle alone.

This summer, though, will mostly be used to prepare for a very busy autumn. As the Covid situation seems to be normalizing around the world, on-site conferences are restarting, and we plan to take advantage of this! Here is what is planned:

IMAV – Delft, 12 to 16 September.

The 13th edition of the International Micro Air Vehicle Conference will be held in Delft, in the Netherlands. We’ve been collaborating for a long time with the MAVLab in Delft, so we’re really happy to be one of the sponsors of this conference. For the occasion, there is a nano AI competition that we’re really excited to see: with the AI bundle, the goal is to fly as fast as possible through an obstacle course.

We’ve been working a lot with the AI deck this past year, so this competition is the perfect occasion for us to see it in action. Kimberly has also developed a simulator that will be used for this competition.

ROSCon – Kyoto, 19 to 21 October

ROSCon is a conference dedicated to the entire ROS community, traditionally held right before IROS. Kimberly will be our proud representative there, as she will give a talk about ROS2 and the Crazyflie. For the occasion, she will showcase the latest ROS2 integrations, in collaboration with the maintainers of Crazyswarm2.

The last time a Crazyflie was present at ROSCon was in 2015, when Wolfgang Hönig gave a lightning talk. A lot has changed since then, and we’re hoping to increase the presence of (tiny) aerial vehicles within the ROS community, especially nanocopters like the Crazyflie.

IROS – Kyoto, 23 to 27 October

IROS is one of the largest robotics conferences worldwide, and after an online edition last year, this 35th edition promises to be full of exciting things!

As it’s quite huge, and as a (quite delayed) celebration of Bitcraze’s 10th anniversary, the whole company plans to attend this conference. Not only for the chance to discover Japan, which most of us haven’t visited, but also because it feels important to have a significant presence at this conference, which promises a lot of opportunities. That means a week without anyone at the Swedish office, but you know where to find us if you would like to talk to us ;).

For the occasion, our intern Marios is working on revamping the autonomous swarm demo. Because of the pandemic, it’s been a while since we actually used it for a whole day of flying, and he’s actively working on making it completely autonomous by implementing the peer-to-peer protocol.

Logistics

As you can see, these three conferences almost back-to-back promise a busy autumn here at Bitcraze. There’s a lot to prepare ahead of time: marketing materials, demo setups, visa paperwork and hotel bookings. And there will be a lot to talk about, during and after. The pandemic has delayed a lot of our in-person meetings, and it will feel really good to finally meet up in the real world with users – old and new. If you have the opportunity, don’t hesitate to come by our booth at these conferences and say hello in person!

As the Crazyflie ecosystem expands, more and more novel aerial (but also ground or hybrid) robots are being built with one of the Crazyflie controllers onboard. For recent examples, you can check e.g. the recent blogpost about ICRA 2022.

In this post, I will introduce yet another Crazyflie-Bolt-powered aerial robot, the Flapper Nimble+ from our company Flapper Drones, which unlike other flying robots doesn’t have any propellers but uses flapping wings instead.

The best aerial robot design is…

Small drones, or micro air vehicles, have seen a lot of progress and new developments in the last 20 years. The most widespread design nowadays is a quadcopter, such as the Crazyflie 2.1. But is a quadcopter the ultimate (micro) drone solution? At Flapper Drones, we believe nature might provide even better designs… For some applications at least! 😊

Flying like a bird…

Flapper Drones is a spinoff of the MAVLab of the Delft University of Technology. At the MAVLab, we have been researching bio-inspired flight as part of the DelFly project since 2005. From the beginning, the goal has been to develop a lightweight, mission-capable micro air vehicle whose design draws inspiration from nature. Over the years, many such MAV concepts have been designed, built and tested, including the DelFly Micro, the world’s smallest camera-equipped MAV, and the DelFly Explorer, the first autonomous flapping-wing MAV equipped with a stereo vision system. All these designs were propelled by a pair of flapping wings, while being controlled (and passively stabilized) by a tail, like birds or man-made airplanes.

… or an insect

The latest design, the DelFly Nimble, is insect-inspired instead. What does that mean? The Nimble has no tail, which would otherwise provide the damping needed for stable flight. Instead, it is stabilized actively, by adjustments of the motion of its flapping wings. This is what all flying insects, and also hummingbirds, do. Flies, for example, sense their body motion with their halteres, drumstick-like biological gyroscopes, and adapt their wing motion accordingly to stay balanced… or to be agile, when someone is trying to swat them!

And while the Nimble was originally built just to demonstrate that an insect-inspired flying robot could be made to work, we could eventually also use it to learn more about the flight of insects:

Flapper Drones – how do they work?

The Flapper Nimble+ is the commercial (and enlarged) version of the DelFly Nimble, developed and produced by Flapper Drones. To our knowledge, it is the first, and so far the only, hover-capable tailless flapping-wing drone available!

The thrust keeping the Nimble+ airborne is created by its four flapping wings, which flap back and forth horizontally, about 10 to 12 times per second.

The wing actuation mechanism allows the flapping frequency of the left and right wing pairs to be adjusted independently, which enables control of the roll rotation. Pitch rotation is controlled by adjusting the mean wing position within the stroke plane, which shifts the mean thrust force forward or backward with respect to the center of mass, and also introduces a stabilizing dihedral angle in forward flight. Finally, yawing motion is achieved by tilting the wing roots of the left and right wing pairs asymmetrically:
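In pseudo-Python, that control mapping can be sketched as follows (purely illustrative; the real allocation lives in Flapper Drones’ fork of the Crazyflie firmware):

```python
def flapper_mix(thrust, roll, pitch, yaw):
    """Map controller outputs to the Nimble+'s actuation channels.

    thrust -> overall flapping frequency (on the order of 10-12 Hz)
    roll   -> differential flapping frequency, left vs right wing pair
    pitch  -> mean wing position within the stroke plane, shifting the
              mean thrust fore/aft of the center of mass
    yaw    -> asymmetric tilt of the left and right wing roots
    """
    freq_left = thrust + roll   # faster left pair rolls the body (sign convention assumed)
    freq_right = thrust - roll
    stroke_offset = pitch
    root_tilt = yaw
    return freq_left, freq_right, stroke_offset, root_tilt
```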

Advantages of flapping wings

The use of flapping-wing drones such as the Flapper Nimble+ brings several advantages. Next to their attractive biological appearance, the soft flapping wings produce less intrusive, low-frequency sound and are safer than propellers. As the wings move back and forth, minor mid-air collisions are not a problem: the wings bounce off objects leaving no damage, and the drone keeps flying, as this only represents a minor disturbance:

The aerial drag characteristics are also different and help with precise indoor flight. As soon as zero attitude is commanded, the Nimble+ comes to a halt within several wingbeats, making it an ideal choice for novice drone pilots as well as for constrained or cluttered indoor spaces. Finally, the flapping wings can provide additional lift as they also glide in forward flight. This can improve power efficiency by over 20% compared to hovering.

Otherwise, Flapper Drones can be operated as any other drone, with vertical take offs and landings, quick maneuvers and flight in any direction:

Crazyflie Bolt & compatibility

The Flapper Nimble+ is powered by the Crazyflie Bolt 1.1, where the Bolt’s BMI088 IMU and STM32F4 MCU are suitable substitutes for the halteres and brains of a real fly. We made this choice because it enables compatibility with most of the Crazyflie ecosystem, but also because we felt the only way a Crazyflie would do justice to its name is if it had flapping wings 😊

Currently, the Nimble+ uses a fork of the Crazyflie firmware, which is of course open source. Moreover, with the recently introduced platform functionality, we will be able to include the Flapper platform in the official Crazyflie firmware very soon (expected still in July 2022). This means that the Flapper remains compatible with the official Python libraries, the PC client and the smartphone app. Third-party projects like Crazyswarm or Skybrush should also require only minor adjustments, if any, to operate a swarm of Flappers. Thus, for existing Crazyflie users, switching from a Crazyflie to the Flapper should be a breeze!

The Flapper Nimble+ is hardware-compatible with most of the Crazyflie expansion decks. While software support remains experimental (the Flapper Nimble+ is not a native Crazyflie product, after all), many of the decks work out of the box and others might need just minor firmware modifications. Would you like to fly the Nimble+ autonomously? Add an LPS or Lighthouse deck and you’re good to go!

For more details regarding deck compatibility, you can check this overview.

Applications

While the Nimble+ was originally designed for drone shows and similar entertainment applications, the open-source firmware and the expansion decks enabling autonomous flight make it ideal also for academic research and, in general, as a development platform. Are you researching swarming, and would you like to make your swarm even more bio-inspired? Are you developing new sensors, or new (possibly bio-inspired) controllers, which you would like to test on a new type of flying platform? Are you interested in the aerodynamics of flapping wings, or the flight dynamics of insect-like flight? Or are you just curious and would like to learn more about bio-inspired flight? In all these cases, a Flapper might be what you are looking for!

The 114-gram, 49-cm-wide Flapper Nimble+ has been designed as a modular system where any part can easily be replaced. Flapper Drones provides all the spares, which are available upon request. If you are interested in using the Nimble+ for entertainment rather than research, you can modify its appearance by creating your own body shells, which can also be illuminated by RGB LEDs (a suitable interface and power supply is already integrated), or even by altering the design of the wings. Finally, you can easily extend the Flapper with your own sensors or other devices. Would you like to add a tail? A gripper? A perching device? This is all possible, as long as these additions fit within the payload limit of about 25 grams.

Available soon in the webstore!

Did you get (bio)inspired, and would you like to try an insect-like flying robot yourself? Then we have some good news! The Flapper Nimble+ will soon be available for sale in an exclusive partnership with Bitcraze, through their webstore. Check out the product description and leave your email address behind, so that you get a notification when the Flappers are in stock and ready to ship. The first batch of 10 units is expected to be available at the end of summer, so do not wait too long 😉

Want to learn more?

To learn more about Flapper Drones, you can check our website, or watch the talk I gave at the last miniBAM: