Blog

Guest post by Dominik Grzelak, Dresden University of Technology, Germany

In my initial post about XR-PALS, I introduced a tool for reducing setup time for the Loco Positioning System (LPS). One very practical aspect remained unaddressed, however: handling and mounting the LPS nodes themselves.

In this post, I would like to share a 3D-printable enclosure and mounting system for LPS nodes that emerged from daily use and was developed hand-in-hand with a technical designer friend. The goal was to make LPS nodes faster to deploy and more robust in general; the cases have already withstood drops from heights of up to 2 m.

The 3D printable files are available here: https://github.com/bitcraze/bitcraze-mechanics/tree/master/LPS-node-case

Overview

The result is a two-part enclosure with dedicated mounting adapters, designed specifically around the LPS node form factor:

  • Front shell: Protective enclosure with power button
  • Back shell: Securely holds the LPS node PCB
  • Mounting adapters: Enable quick mounting on poles or tripods in vertical or horizontal orientation

Assembly Instructions

Both the back and front parts can be printed without supports: place the back part as shown in the figure, and rotate the front part by 180° relative to the figure. The adapter should be printed standing upright.

Below are a few notes to get you started building the case:

  • Print all three parts and buy the additional off-the-shelf hardware components
  • Place the LPS node into the back part of the case.
  • Install the toggle switch and ensure correct orientation.
  • Connect and solder the USB power cable wires.
  • Lay the USB cable in the designated notch in the back part (tie a small knot for strain relief).
  • Attach the front part and secure it with screws.
  • Use the adapter (“slider”) to mount the case on a tripod or pole (with cable ties, for example).
  • Make sure the red and black cables are wired with the correct polarity.

The USB power bank itself serves as the power indicator for the LPS node.

Evolution of the Case

The design went through several iterations.

In parallel, multiple design variations were explored to evaluate different approaches to mounting, cable routing, and overall form factor. Attention was paid to ensuring the parts print cleanly on common 3D printers without supports.

This process helped smooth out small usability issues and resulted in a design that is both easy to print and straightforward to use in practice.

Vertical and Horizontal Mount Adapter

While the standard configuration mounts the enclosure vertically, a horizontal holder adapter has now been introduced. This provides additional flexibility depending on the experimental setup.

A short demonstration of the mounting system can be viewed here.

Conclusion

This enclosure and mounting system grew out of repeated practical use in changing indoor environments, and we hope it will be useful to others as well.

Feedback and ideas are always welcome. And by the way, if you print your own version, feel free to share photos of your setup!

Booth #90 is running. Here’s what we’re showing and why we think it’s relevant to the conversations happening at the European Robotics Forum this week.

A Decentralized Brushless Swarm

The centerpiece is an evolution of the Decentralized Brushless Swarm demo we published last year. Multiple Crazyflie 2.1 Brushless drones share a volume with no central trajectory planner. Each agent handles its own state estimation, neighbor awareness, and collision avoidance independently. The swarm is fault-tolerant by design: individual failures don’t cascade.

What makes this relevant as a testbed is not the flight itself, but what it lets you study. Decentralized coordination, emergent behavior, and the gap between simulated and physical multi-agent dynamics are all things you can actually probe here, at a scale and cost that makes iteration realistic.

The Swarming Interface

We’re showing a new interface for the first time at ERF. It surfaces per-agent state in real time (position, velocity, battery, role), giving you visibility into what the swarm is doing and why, not just the flight envelope. We’ll write up the technical details separately, but if you want to see it running, the booth is the right place.

A Touch of Magic!

We have built a magic wand. It is a Lighthouse-based device that lets you grab a drone, or a group of them, and steer with your hand. It started as a side project and ended up being a surprisingly good way to demonstrate how the positioning system responds to real-time input. Worth a look if you’re nearby.

Come Find Us

We’re at booth #90 through Thursday March 27. The conversations we’re most interested in are about research infrastructure: how teams design testbeds, what the handoff from simulation to hardware looks like in practice, and where small-scale indoor platforms fit into larger development pipelines.

If you’d like to set aside time for a more focused discussion, reach out at contact@bitcraze.io or book a slot via the b2match app: https://www.b2match.com/e/erf2026/meetings.

During my first Fun Friday as a Bitcraze intern in 2021, I discovered the musical note definitions in the Crazyflie firmware and thought about creating a musical performance using the Crazyflie’s motors, but never followed through.

A few weeks ago I decided to finally take it on as a Fun Friday Project with the slightly more ambitious goal of playing music across several Crazyflies at once.

crazyflie-jukebox takes a MIDI file, preprocesses it into motor frequency events, then uploads and plays the song by spinning the motors accordingly. Each Crazyflie contributes 4 voices (one per motor), so polyphony scales directly with your drone count.

I implemented this as a firmware app and a Python script using the work-in-progress cflib2 (built on the Rust library back-end). You can find the repository here; try it for yourself! Be aware that certain note combinations can cause the Crazyflie to move, flip, or take off unexpectedly.

Fitting music into 4 motors

The pipeline starts by parsing the MIDI file with mido. From there, an interactive track selection step shows you the instrument names, note counts, and ranges for each track so you can pick exactly which ones to include. The selected notes are then converted from MIDI note numbers into Hz frequencies that the motors can work with.
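The MIDI-note-to-frequency step is the standard equal-temperament conversion (A4 = MIDI note 69 = 440 Hz); a minimal sketch of it, with a made-up function name:

```python
def midi_note_to_hz(note: int) -> float:
    """Convert a MIDI note number to its equal-temperament frequency in Hz.

    A4 is MIDI note 69 at 440 Hz; each semitone is a factor of 2**(1/12).
    """
    return 440.0 * 2.0 ** ((note - 69) / 12.0)

# midi_note_to_hz(69) -> 440.0 Hz (A4)
# midi_note_to_hz(60) -> ~261.63 Hz (middle C)
```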

Each Crazyflie can only play 4 simultaneous voices (one per motor), so there’s some work involved in squeezing music into that constraint. I implemented a couple of different voice allocation strategies: melodic priority, which keeps the bass and melody prioritized; voice stealing, which works like an LRU synth; and simple round robin, which just assigns each new note to the next motor in turn, cutting off whatever was playing there. There’s also a frequency range problem to deal with: the motors only reliably produce pitches in roughly the C4–B7 range, so notes outside that window get octave-shifted to fit.
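A minimal sketch of the two simplest pieces described above, the octave folding and the round-robin strategy; names and structure are illustrative, not the repository’s actual code:

```python
LOW, HIGH = 60, 95  # MIDI notes C4..B7, roughly the motors' reliable range


def fold_into_range(note: int) -> int:
    """Octave-shift a MIDI note up or down until it fits the playable window."""
    while note < LOW:
        note += 12
    while note > HIGH:
        note -= 12
    return note


class RoundRobin:
    """Simplest voice allocation: each new note goes to the next motor in
    turn, cutting off whatever that motor was already playing."""

    def __init__(self, n_motors: int = 4):
        self.n = n_motors
        self.next_motor = 0

    def assign(self, note: int) -> int:
        # The note itself is ignored here; smarter strategies (melodic
        # priority, voice stealing) would use it to pick a victim voice.
        motor = self.next_motor
        self.next_motor = (self.next_motor + 1) % self.n
        return motor
```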

Upload protocol

Events are packed into compact 6-byte structs containing a delta timestamp, motor index, an on/off flag, and the target frequency. These get streamed to the firmware app using the app channel in a simple START; events; END sequence. The Crazyflie app has a buffer limit of 5000 events, which effectively caps the length and complexity of what you can play. The 5000-event buffer was an arbitrary choice and you could probably get away with more, but it was enough for most songs I threw at it.
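As a sketch of what such a 6-byte event might look like, here is one possible packing; the exact field widths and byte order are assumptions, not the firmware’s actual layout:

```python
import struct

# Hypothetical 6-byte layout: uint16 delta time (ms), uint8 motor index,
# uint8 on/off flag, uint16 frequency (Hz). Little-endian, as is common
# on the STM32; the real struct in the firmware app may differ.
EVENT = struct.Struct("<HBBH")


def pack_event(delta_ms: int, motor: int, on: bool, freq_hz: int) -> bytes:
    """Pack one motor event into the compact 6-byte wire format."""
    return EVENT.pack(delta_ms, motor, 1 if on else 0, freq_hz)


evt = pack_event(125, 2, True, 440)  # len(evt) == 6
```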

Synchronization

One of the trickier elements of this project was keeping Crazyflies synchronized. For starting in sync, I didn’t do anything special: no broadcast, no t-minus countdown, just sending start commands to each drone in sequence and relying on cflib2 to do it fast enough that the delay is negligible. That said, I’ve only tested with a small number of Crazyflies. With a larger fleet you’d probably need to implement something for the initial sync.

The real challenge is drift over time. The STM32’s crystal is rated at around 0.1% tolerance. This sounds tiny, but it amounts to up to 60 ms of error per minute per drone, so over a 1-minute song two drones drifting in opposite directions can already be ~120 ms apart. In a musical context, humans start noticing timing offsets around 20-30 ms; even less for percussive sounds and for trained musicians. So left uncorrected, drift would become very audible well before the song ends.

To fix this, all clocks are reset to zero at song start. The host then periodically sends resync packets containing its own timestamp in microseconds, and each Crazyflie applies an offset correction to stay aligned, which as a bonus also irons out any initial start latency.
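The correction itself can be as simple as keeping an offset that is overwritten whenever a resync packet arrives; a minimal sketch of the idea (not the actual firmware code):

```python
class SongClock:
    """On-drone song clock (sketch): local microsecond ticks plus an
    offset that is refreshed on every host resync packet."""

    def __init__(self):
        self.offset_us = 0

    def on_resync(self, host_us: int, local_us: int) -> None:
        # Snap the song clock to the host's clock. This also absorbs any
        # latency from the initial start command, since the first resync
        # cancels it out.
        self.offset_us = host_us - local_us

    def now(self, local_us: int) -> int:
        """Current song time in microseconds, in the host's time base."""
        return local_us + self.offset_us
```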

Rough edges

The biggest design constraint is that a single track can’t be split across Crazyflies, so if a track has more than 4 simultaneous voices, some get dropped. I thought of each Crazyflie as its own instrument, which made sense at the time, but it means a dense MIDI track can’t be spread across multiple drones, which feels limiting in hindsight.

The usable pitch range is about 4 octaves (C4–B7), and propellers need to be attached for accurate pitch since the motors need load to produce the right frequencies, which makes the whole thing a bit unsafe. Certain note combinations can cause a drone to move, flip, or behave unpredictably. Only brushed motors are supported, and there’s a hard 71-minute per-song limit on clock sync. But honestly, if you’re sitting there listening to a 71-minute song on your Crazyflie, the clock drift is the least of your problems.

Check out crazyflie-jukebox on GitHub

This week we wanted to reflect on the progress that has been made lately in the Crazyflie ecosystem, progress that will lead to bigger and better Crazyflie swarms.

Radio communication

As pointed out in the last blog post about Building a Crazyflie Flower Swarm with Rust, the new Rust Crazyflie library together with the new Crazyradio 2.0 has improved connection time and link efficiency by quite a bit.

It is now possible to connect swarms of multiple dozens of Crazyflies in seconds using a single radio and then make them fly while still getting position telemetry. That many Crazyflies on one radio does limit the maximum bandwidth per Crazyflie, but it now works in a stable way!

Color LED deck

The recently released Color LED deck is a great addition to the ecosystem for swarms. Its predecessor, the LED-ring deck, has been used a lot by researchers to indicate the state of individual Crazyflies in a swarm. The Color LED deck improves on that with a diffuser that makes the color visible from the side, which makes it much easier to mark the states of large groups of Crazyflies.

As a bonus, the Color LED deck is very usable in other fields like art and shows, since it is much more visible and can be used to fly Crazyflies as “Flying Pixels”.

Autonomous landing and charging

Last year we released a Crazyflie 2.1 Brushless charging dock. This is a production version of an idea we had been using for years at fairs and conferences with the Crazyflie 2.1 and the Qi deck. It allows Crazyflies to autonomously land and charge. It is not only great for autonomous drone demos and shows, it also makes a great waiting spot for swarms when doing research: the charging dock keeps the swarm charged so that when it is time to take off, all the individuals start with the same battery level.

Future endeavors

On the radio side there are still areas where communication stability can be greatly improved. We are for example working on a channel-hopping communication protocol that should make the connection mostly immune to regular interference on 2.4 GHz.

We are also working on improving other parts of swarm management. This includes, for example, solving the problem of flashing a full swarm of Crazyflies with the same firmware: we may be able to make more use of broadcast messages to drastically speed up the process, instead of flashing the Crazyflies one by one.

Overall, working on bigger swarms allows us to work on the full stack and to make the Crazyflie a better drone for everybody.

With spring just around the corner, we thought it was the perfect excuse to make our Crazyflies bloom. The result is a small swarm demo where each drone flies a 3D flower trajectory, all coordinated from a single Crazyradio 2.0. This blog post walks through how it works and highlights two things that made it possible: the new Color LED deck and the Crazyflie Rust library.

The Color LED deck

There are two Color LED decks for the Crazyflie – one mounted on top and one on the bottom – each with its own individually controllable LED via the standard parameter interface. In this demo we use the one mounted on the bottom to give color to the flowers, along with the Lighthouse deck for accurate positioning in space.

The deck opens up a lot of creative possibilities for swarm demos as well as clear visual feedback about what each drone is doing.

Fast swarm connections with the Crazyflie Rust library

Getting five drones connected quickly on a single Crazyradio used to be a real bottleneck. The Crazyflie Rust library introduces a lazy-loading parameter system. Parameter values are not downloaded at connect time; instead, they are fetched only if the API user explicitly accesses them.

Additionally, caching the TOC (Table of Contents) makes it trivial to persist it locally and reuse it on every subsequent connection. In practice this means that after connecting to each drone once, all future connections are nearly instantaneous. The cache is keyed by the TOC’s CRC32 checksum, so it automatically stays valid as long as the firmware doesn’t change, and it’s identical between drones with the same checksum.

The library also uses Tokio’s async runtime, which means all Crazyflie connections start at the same time without waiting for each other. Combined with the generally higher communication performance of the Rust implementation, these features significantly reduce startup overhead and make the swarm feel reliable and responsive, something that would require much more effort with the current Python library.

Generating the trajectories

The flower shapes are generated in Python using this script. It produces two .json files per drone (one stem{n} and one petals{n}) containing all the waypoints to fly through. The trajectories are then uploaded to the drone as compressed poly4d segments, a compact format that the Crazyflie’s onboard high-level commander can execute autonomously. Both trajectories are expressed relative to each drone’s initial position, so the formation geometry is entirely determined by where you place the drones on the ground before takeoff.
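For a flavor of what such a trajectory looks like, here is a hypothetical rose-curve petal generator producing waypoints relative to the takeoff position; the actual script’s shapes and parameters may well differ:

```python
import math


def petal_waypoints(n_points: int = 120, petals: int = 5,
                    radius: float = 0.3, height: float = 1.0):
    """Sample a rose curve r = radius * cos(petals * theta) into (x, y, z)
    waypoints in the horizontal plane at a fixed hover height, relative
    to the drone's initial position."""
    pts = []
    for i in range(n_points):
        theta = 2 * math.pi * i / n_points
        r = radius * math.cos(petals * theta)
        pts.append((r * math.cos(theta), r * math.sin(theta), height))
    return pts
```

A waypoint list like this would then be fitted into compressed poly4d segments before upload, so the onboard high-level commander can fly it autonomously.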

Putting them all together

The flight sequence is pretty straightforward:

1. Build the trajectories as waypoints on the host.

2. Connect to all drones simultaneously.

3. Upload each drone’s compressed trajectories in parallel.

4. Fly the trajectories while switching the LED colors.

Everything after the connection is driven by Tokio’s join_all, so the swarm stays in sync without any explicit synchronization logic – the drones are just given the same commands at the same time.
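The same fan-out pattern can be illustrated with Python’s asyncio for readers less familiar with Tokio; the real demo is Rust, and all names here are invented:

```python
import asyncio


async def upload_and_fly(drone_id: int) -> str:
    # Stand-in for the per-drone upload + start sequence; in the real
    # demo this is the Rust library talking over the Crazyradio.
    await asyncio.sleep(0.01)
    return f"drone {drone_id} started"


async def main():
    # Equivalent of Tokio's join_all: every drone's task runs
    # concurrently, and we wait for all of them to finish together.
    return await asyncio.gather(*(upload_and_fly(i) for i in range(5)))


results = asyncio.run(main())
```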

The full source code is available at this repository (Python for trajectory generation, Rust for flying).

We’re excited about where the Rust library is heading. It improves communication with the Crazyflie and lets us dramatically increase the number of Crazyflies per Crazyradio, leading to bigger and more reliable swarms. If you build something cool with it, let us know!

Bitcraze will exhibit at the European Robotics Forum 2026, March 23-27, in booth #90, where we will demonstrate a live, autonomous indoor flight setup based on the Crazyflie™ platform. The demonstration features multiple nano-drones flying autonomously in a controlled environment and reflects how the platform is used in research and applied robotics development.

Why Indoor Aerial Testbeds Matter

The purpose of the demonstration is not the flight itself, but the role such setups play in validating aerial robotics concepts. Indoor, small-scale aerial systems allow researchers and R&D teams to study autonomy, perception, control, and multi-robot coordination under safe and repeatable conditions. This makes it possible to explore system behavior, test assumptions, and iterate rapidly before moving to larger platforms or less controlled environments.

Applicable in Both Academia and Industrial R&D

Bitcraze is used both in academic research and in industrial R&D contexts. In academia, the platform supports experimental work in areas such as swarm robotics, learning-based control, and human–robot interaction, and has been referenced in hundreds of peer-reviewed research papers worldwide. In industry, similar setups are increasingly used as testbeds to de-risk development by validating ideas indoors before scaling to outdoor testing, larger drones, or other robotic systems that require higher investment and operational complexity.

Hands-on Discussions at the Booth

At the booth, the live flight cage will be complemented by hands-on access to additional drones, expansion decks, and software tools. This allows for technical discussions around hardware architecture, sensing and positioning options, software stacks, and how different configurations support different research or development goals.

The Conversations We Want to Have at ERF

Bitcraze is at ERF to engage in conversations about platforms, testbeds, and how ambitious aerial robotics ideas can be validated in a financially responsible, safe, and controlled manner. This includes discussions with academic groups, industrial R&D teams, and project partners working across the research-to-application spectrum.

Looking forward to the discussions in Stavanger in booth #90!

Send us a message to contact@bitcraze.io to book a meeting at the show!

The Crazyflie™ Color LED deck is a high-powered, fully programmable RGB(W) lighting expansion for the Crazyflie 2.x platform, and it’s now available in the shop.

It delivers bright, diffused, and uniform light suitable for research, teaching, vision experiments, and indoor drone choreography. The deck mounts on top or bottom of the Crazyflie and integrates seamlessly through our open-source firmware and I²C-based deck interface.

Two Versions, Same Electronics

The Color LED deck is available in two distinct versions, each sold as a separate product.

Top-Mounted Color LED Deck

Designed to be placed on top of the Crazyflie. Ideal for scenarios where the drone is viewed from above or when customers need to use positioning or sensor decks underneath the Crazyflie—such as the Flow Deck or other bottom-mounted modules.

  • Works well with motion capture (MoCap) systems using ceiling-mounted cameras.
  • Not recommended with the Lighthouse positioning deck, due to optical occlusion and interference with the lighthouse sensors.

You can find the top-mounted version in the store here

Bottom-Mounted Color LED Deck

This version is suitable for almost all use cases, offering maximum visibility from below and minimal interference with other decks.

  • Ideal for Lighthouse positioning (no optical obstruction).
  • Ideal for MoCap positioning, especially when cameras view from multiple angles

You can find the bottom-mounted version in the store here.

Dual Mounting for Maximum Visibility

Additionally, two Color LED decks can be mounted both above and underneath the Crazyflie simultaneously, creating a strong, uniform light signature visible from all directions. This is best for MoCap environments where multi-angle visibility improves marker/camera performance.

You can see all three variants of the Color LED deck in action in our latest Christmas video, created in collaboration with the Learning Systems and Robotics Lab:

Each color LED deck variant operates independently, allowing the top and bottom decks to be configured with different colors if desired. While all variants share the same electronics, diffuser, and firmware behavior, their physical mounting positions let you choose the setup that best fits your lab, show environment, or positioning needs.

Diffuser now available

While designing the Color LED deck, we also created a light diffuser—now available in the shop to buy as a standalone product. It is designed to be compatible with the LED ring to spread and soften the light, extending visibility and improving appearance.

You can find the light diffuser in the store here.

The Color LED deck is available now in both versions. Head to the store to order, or contact us for a quote!

Today we catch up with Maurice Zemp, who presented his work in an earlier blog post.

Road to the Finals

I had officially completed my Matura thesis in October 2024 and submitted it to the Schweizer Jugend forscht competition. When I was selected for the semifinals, I was given the chance to present my work in front of a jury. Their feedback was highly constructive and came with clear requirements: for the finals, I would need to provide more in-depth analyses of the individual system components of my project. At first, this felt like a challenge, but in the process I realized how much these refinements elevated my research. By the time the finals approached, I felt both nervous and proud, knowing that the work I would present had grown far beyond the version I had initially submitted. On April 24, 2025, the big moment finally arrived – the start of the national finals.

Fig. 8: The Location of the semifinals

Day 1

The day began with my journey to ETH Zurich. Traveling by public transport, I carried my Crazyflie drone and the racing gate with me – equipment that had accompanied me throughout countless hours of development and testing and with which I wanted to make the comprehension of my project a bit more feasible. Arriving at ETH, I was greeted warmly at the reception, where I first felt a sense of belonging among dozens of passionate and curious young scientists.

Fig. 9: My booth at ETH

The morning was dedicated to setting up our booths. Piece by piece, the exhibition hall transformed into a vibrant space filled with prototypes, posters, and creative ideas. Once my own stand was ready, I finally had a moment to take in the atmosphere and to start the first conversations. In the afternoon, we were treated to a guided city tour through Zurich. Walking through the old streets, hearing stories about the city, and enjoying the fresh air was the perfect opportunity to get to know the other participants better.

Fig. 10: The Limmat (River in Zurich)
Fig. 11: The Grossmünster church in Zurich

Later that day, alumni of Schweizer Jugend forscht visited the exhibition. For the first time, I had the chance to present my project outside of the jury context, and I was surprised by the interest and thoughtful questions I received.

By the time we arrived at our youth hostel late in the evening, the excitement of the day had fully caught up with me. Exhausted but exhilarated, I fell into bed.

Day 2

The second day began with breakfast at the youth hostel, followed by a short tram ride back to ETH Zurich. The morning program was dedicated to the jury sessions, which represented one of the most important parts of the entire competition. Unlike in the semifinals, where I just explained my project and was asked some general questions, this time I was able to discuss my project in detail with several experts – including those specializing in fields beyond my own topic.

Fig. 12: The ETH Main Hall filled with interesting projects

These conversations quickly turned into fascinating discussions. The jurors asked insightful questions, challenged certain assumptions, and encouraged me to think more deeply about the potential applications of my work. At the same time, I received a great deal of praise, which was both reassuring and motivating. It was incredibly rewarding to see that months of effort, refinement, and problem-solving were being recognized by experienced professionals.

In the afternoon, the doors of ETH opened to the public for the exhibition. Friends, family members, and curious visitors from outside the competition came to explore the stands. Presenting my project in this setting felt very different from the formal jury discussions of the morning – it was more relaxed, conversational, and filled with spontaneous questions. I especially enjoyed seeing how people unfamiliar with drone technology reacted to the project, and it gave me the chance to practice explaining complex ideas in a way that was accessible to everyone. After such a full day of interactions, we returned to the youth hostel in the evening. The atmosphere there was much calmer, as everyone tried to recharge a little energy in preparation for the final day.

Fig. 13: Me explaining my project to a friend of mine who came to visit me

Day 3

The final day once again began with our journey to ETH Zurich. In the morning, the exhibition hall opened its doors for a second round of public visits. This time, the experience was especially meaningful for me, as my family came to see my project in person.

After lunch, it was finally time for the highlight of the competition: the award ceremony. A live band set the stage, and soon the opening speeches began. The tension in the room was almost tangible – every participant knew that months, if not years, of work were culminating in this single event. I felt both nervous and excited, my heart beating faster with each passing moment.

Fig. 14: The start of the award ceremony
Fig. 15: My nomination to ISEF 2026!

Then came an unexpected twist: even before the regular prizes and certificates were announced, the jury revealed the winners of the most prestigious special awards. To my immense joy, my name was called. I had been selected to represent Switzerland at the International Science and Engineering Fair (ISEF) 2026 in Phoenix, Arizona. The sense of relief, excitement, and pride I felt in that moment is difficult to describe – it was a dream come true.

The ceremony continued with an inspiring keynote by former NASA Director Thomas Zurbuchen, who shared his journey in science and reminded us of the importance of perseverance and never giving up.

Fig. 16: An inspiring talk by Thomas Zurbuchen
Fig. 17: Me being awarded my distinction

Finally, the time came for the official certificates. One by one, every participant was called to receive their recognition. When my turn came, I was awarded the highest possible distinction, hervorragend (outstanding), honored with CHF 1500. The applause and congratulations that followed made the moment even more unforgettable.

The evening concluded with an apéro, where I had the chance to exchange thoughts with professors, fellow participants, and many guests. I was overwhelmed by the warm words of encouragement and congratulations I received for both my project and the recognition it had achieved.

After three exciting, inspiring, and at times exhausting days, it was finally time to return home – this time together with my parents, carrying not only my luggage but also an experience that I will cherish for a lifetime.

Hi everyone!

My name is Alex, and I am thrilled to be joining the Bitcraze team as an intern!

Currently, I am a Doctoral Researcher at the Institute for Data Science in Mechanical Engineering at RWTH Aachen University, following my studies in mechatronics at TU Darmstadt. My research interests lie in swarm robotics, distributed embedded AI, and controlling technical systems using machine learning.

What drew me to Bitcraze is the unique potential of the platforms developed here; in fact, I have used them in my research for over four years now! I am eager to apply my research background to real-world hardware and finally see “behind the curtains” of the company that produces the research platform I work with daily.

I am looking forward to digging into the code, collaborating with the team, and engaging with the community. If you have any thoughts on swarm behaviors or machine learning you’d like to discuss, please reach out!

Last week, Bitcraze attended the BETT Show in London to get a better sense of how the education landscape is evolving.

BETT (British Educational Technology Show) brings together educators, edtech companies, curriculum developers, policymakers, and technology providers across the full spectrum of learning: from primary school to higher education and professional training.

For us, it was a valuable opportunity to listen and get an understanding of where the general EDU landscape is and where it is heading.

Meeting Familiar Faces, and New Ones

One of the most rewarding parts of the visit was reconnecting with existing partners already using the Crazyflie™ in educational settings, and meeting new potential collaborators: teachers building robotics programs, universities modernising their lab infrastructure, and organisations developing national STEM (Science, Technology, Engineering, and Mathematics) and STEAM (Science, Technology, Engineering, Art, and Mathematics) initiatives.

A recurring theme in many conversations was the need for platforms that:

  • are robust and safe to use in classrooms
  • scale from simple programming exercises to advanced autonomy and AI
  • support both structured teaching and open-ended experimentation
  • are well documented (both for the teacher and for the student)

These are exactly the problems we have spent more than a decade working on.

What the Education Robotics Market Looks Like Today

Speaking with a wide range of robotics vendors, software providers, and solution integrators gave us a clearer picture of the realities of the K-12 and STEM market:

  • Procurement is often tender-based and highly structured
  • Budgets are tight and price sensitivity is real
  • There are many vendors offering similar-looking robotics kits
  • Hardware must be physically robust and classroom-proof, and safety is critical
  • Programming is dominated by Python, Scratch, Blockly, or proprietary visual tools
  • “AI-enabled” frequently means GPT-style programming blocks layered on top
  • LEGO compatibility is everywhere
  • micro:bit has effectively become a compelling entry-level control board
  • Buyers apply hard scrutiny to educational value and learning outcomes
  • Real adoption requires curricula, lesson plans, and teacher training programs
  • And in practice, U.S.-developed curricula often transfer reasonably well globally

Why the Crazyflie is a Great Fit for Education

Although the Crazyflie originated as a research platform, its characteristics map naturally to education:

STEM / STEAM (Upper Secondary & High School)

Students can work hands-on with control systems, sensors, wireless communication, programming, and basic AI in a physical system they can see, debug, and iterate on. It makes abstract concepts tangible.

Undergraduate Education

Crazyflie is increasingly used in robotics, embedded systems, and mechatronics courses to teach estimation, control, perception, and multi-agent systems without the overhead of large and expensive hardware.

Post-graduate Research

This remains our strongest domain: swarm robotics, learning-based control, human–robot interaction, indoor navigation, and distributed systems.

The continuity matters. Students don’t outgrow the platform; they grow with it. And, more importantly, the same openness that researchers value is increasingly relevant in education as well, particularly in light of recent geopolitical shifts. Institutions want transparency, long-term maintainability, and the freedom to adapt tools to their pedagogy, not just to consume closed kits.

Education is a Strategic Part of the Robotics Ecosystem

BETT confirmed that education is a strategic and structured part of the robotics ecosystem. Not just as “learning about robots”, but as a way to train future engineers, researchers, and system designers using realistic platforms from an early age.

Succeeding in this segment requires more than good hardware. It requires thoughtful packaging, clear educational positioning, proper teaching material, partner ecosystems, and long-term commitment.

To those we met at BETT, thank you for the conversations. And if you are working with STEM, STEAM, or robotics education and are curious about the Crazyflie, we are always happy to talk.