My name is Joseph La Delfa. I started in December last year as an industrial postdoc at Bitcraze, which means I work across Bitcraze HQ in Malmö and the division of Robotics, Perception and Learning at KTH in Stockholm. I have been designing and researching interactions that involve bodily control of drones for a few years now.
Here at Bitcraze I will be using Lighthouse decks worn on the body to control Crazyflies, with the aim of producing a wearable sensor that integrates into the Bitcraze ecosystem. The Lighthouse positioning system is showing great potential for this application, as it gives relatively clean and high-fidelity positional data. Plus, now that more than two base stations can be used, we can potentially track the deck both from above and from below.
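To give a feel for the data involved: a Lighthouse deck streams its estimated position just as it would on a flying Crazyflie, so it can be read out with the standard cflib logging framework. Below is a minimal sketch, assuming a calibrated Lighthouse system and the usual Python client tools; the radio URI is a placeholder.

```python
import cflib.crtp
from cflib.crazyflie.log import LogConfig
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.crazyflie.syncLogger import SyncLogger

URI = 'radio://0/80/2M/E7E7E7E7E7'  # placeholder address

cflib.crtp.init_drivers()

# Log the onboard state estimate, which fuses the Lighthouse measurements
log_conf = LogConfig(name='Position', period_in_ms=50)
log_conf.add_variable('stateEstimate.x', 'float')
log_conf.add_variable('stateEstimate.y', 'float')
log_conf.add_variable('stateEstimate.z', 'float')

with SyncCrazyflie(URI) as scf:
    with SyncLogger(scf, log_conf) as logger:
        for timestamp, data, _ in logger:
            print(timestamp,
                  data['stateEstimate.x'],
                  data['stateEstimate.y'],
                  data['stateEstimate.z'])
```

The same stream works whether the deck is mounted on a drone or worn on the body, which is what makes it attractive as a wearable sensor.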
This research is a continuation of my PhD thesis, where I designed drone interactions that respond differently to different people, and advocated for a human-drone relationship that evolves over time. Ultimately I hope to demonstrate new use cases for the Crazyflie and expand the already impressive community of researchers who use Bitcraze products!
This industrial postdoc is funded by the Wallenberg AI, Autonomous Systems and Software Program (WASP), and you can find out more about my work at www.cafeciaojoe.com :D
This week we have a guest blog post from Joseph La Delfa.
DroneChi is a human-drone interaction experience that uses the Qualisys motion capture system to let the Crazyflie react to the movements of your body. At the Exertion Games Lab in Melbourne, Australia, we like to design new experiences with technology where the whole body can be the controller and is involved in the experience.
When we first put these two technologies together, we realised two things:
It was super easy to keep your attention on the drone as it flew around the room reacting to your movements.
As a result, it was also really easy to reflect on and refine one's own movements.
We thought this was like meditation mediated by a drone, and wanted to investigate how to further enhance this experience through design. We thought the smooth movements were especially mesmerising, and so I decided to take beginner Tai Chi lessons to get an appreciation of what it felt like to move like a Tai Chi student.
We undertook an 8-month design program where we simultaneously designed the form and the interaction of the Crazyflie. The initial design brief was pretty simple: make it look and feel light, graceful and inspired by nature. In Tai Chi you are asked all the time to imagine a flower, the sea or a bird as you embody its movements; we wanted to emulate these experiences, but without verbal instruction. Could a drone facilitate these sorts of experiences through its design?
We will present a summarised version of how the form and the interaction came about. Starting with a mood board, we collated radially symmetrical forms from nature to match a drone’s natural weight distribution.
We initially went with a jellyfish, hoping to emulate its “push gliiide” movement by articulating laser-cut silhouettes (see fig c). This proved incredibly difficult: after searching high and low for a foam that was light enough for the Crazyflie to lift, we just could not get it to fly stably.
However, we serendipitously fell into the flower shape by trying to improve how we joined the carbon rods together in a loop (fig b below). By joining them to the main hull, we realised it looked like a petal! This set us down the path of the flower; we even flipped the chassis so that the LED ring faced upwards (cheers to Tobias for that firmware hack).
Whilst this was going on, we were experimenting with how to actually interact with the drone. Since the experience was to be demonstrated at a major conference, we decided to track only the hands, which allowed quick changeovers. We started with cardboard pads and experimented with gloves, but settled on some floral-inspired 3D-printed pads. We were sorely tempted to include articulation of the fingers, but decided against it to avoid scope creep! Further to this, we curved the final hand pads (fig d) to promote the idea of holding the drone, inspired by a move in Tai Chi called “holding the ball”.
As a beginner practicing Tai Chi, I was sometimes overwhelmed by the number of aspects of my movement that constantly needed monitoring: palms out, heel out, elbow slightly bent, step forward, etc. However, in brief moments it all came together, and I was able to appreciate the feeling of these movements as opposed to consciously monitoring them. We wanted this kind of experience when learning DroneChi, so we devised a way of mapping the drone to the body to emulate it. After a few iterations we settled on the “midpoint” method, as seen below.
The drone only followed the midpoint (blue dot above) if it was within 0.2 m of it. If it was outside of this range, it would float away slowly from the participant. This may seem like a lot, but with little in the way of visual guidance (e.g. a laser pointer or an augmented display), a person can only rely on the proprioceptive feedback from their own body. We used the onboard LED ring to let the person know when they were close, but that was all the help they got. As a result, this takes a lot of concentration to get right!
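To make the mapping concrete, here is a minimal sketch of a midpoint-following rule along the lines described above. It is an illustration under assumptions, not the actual DroneChi code: the helper name update_target and the 0.05 m/s drift speed are hypothetical, and the hand positions are assumed to arrive from the motion capture system.

```python
import numpy as np

FOLLOW_RADIUS = 0.2  # metres; the drone only tracks the midpoint inside this range

def update_target(left_hand, right_hand, drone_pos, dt, drift_speed=0.05):
    """Return the drone's next position setpoint for one control tick.

    left_hand, right_hand, drone_pos: (x, y, z) in metres, e.g. as
    streamed from the motion capture system.
    """
    midpoint = (np.asarray(left_hand) + np.asarray(right_hand)) / 2.0
    offset = midpoint - np.asarray(drone_pos)
    distance = np.linalg.norm(offset)
    if distance <= FOLLOW_RADIUS:
        # Close enough: follow the midpoint between the hands
        return midpoint
    # Too far: drift slowly away from the participant
    away = -offset / distance
    return np.asarray(drone_pos) + away * drift_speed * dt
```

Each tick, the returned setpoint would then be sent to the Crazyflie, for example through a position commander.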
In the end we were super happy with the final experience. In the study, participants reported tuning into their bodies when using the drone, as well as experiencing a unique sort of relationship to it: not quite like a pet, yet also like an extension of the body. We will be investigating both findings from the study through the design and testing of a new system on the Crazyflie. We see this work contributing to more intimate designs for human-drone interactions, as well as being applicable to health contexts such as rehabilitation.