Drones can be invaluable tools during disaster response operations, as they can both expand the situational awareness of the response team by providing a mobile and flexible airborne camera, and increase their safety by going into dangerous locations in their stead. Controlled by a skilled operator, a drone can help detect both victims and hazards faster, and provide vantage points ranging from a broad, bird’s eye view of the area to a close-up of specific and possibly inaccessible locations.
Conventional drone piloting has two main drawbacks: a) it requires a somewhat cumbersome remote controller, which takes both of the pilot’s hands to operate; and b) it requires a lot of training to fly proficiently. Conventional remote controllers are not particularly intuitive in their operation, as one hand controls pitch and roll (horizontal movement) and the other yaw and throttle (rotation and vertical movement, respectively).
Disaster response situations are naturally stressful and often dangerous. Unintuitive controls and having both hands occupied with a single, continuous task are, at best, not ideal and could contribute to increased stress and risk. With these considerations in mind, FASTER has designed a single-handed, gesture-based drone-control system that promises to be more intuitive and easier to learn, while leaving one hand free for other tasks.
FASTER’s gesture control includes two modes:
- With palm-based navigation, pilots use their open hand, and the drone mimics the palm’s orientation and lateral movement. A horizontal palm corresponds to a stationary drone. Pitch your hand forward and the drone moves forward; rotate it to the left and the drone will keep yawing that way for as long as your hand holds that position. Hold your hand a little higher and the drone will ascend, too. This is a very intuitive mode of control, as pilots can look at their hand and imagine it to be the drone.
- For more precise motions, pilots can use finger-based navigation. This is a set of three simple gestures, corresponding to “up/down”, “rotate left/right”, and “forward”. Each gesture involves two extended fingers, different for each, while the remaining three fingers are curled. Speed and direction are controlled by a finger’s angle. For example, thumb and index finger extended (a pistol-like shape) correspond to “up/down”, with the index finger controlling speed and direction. Point the pistol shape up to ascend and down to descend. The higher you point, the faster the drone will rise. Point straight at the sky for a fast ascent, or a little over the horizon for a gradual one.
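To make the palm-based mapping concrete, here is a minimal sketch of how a palm pose could be translated into normalised drone velocity commands. This is an illustration only, not FASTER's actual implementation: the `PalmPose` fields, the deadzone, and the scaling constants are all assumptions standing in for whatever the hand tracker actually reports.

```python
from dataclasses import dataclass

@dataclass
class PalmPose:
    """Hand pose as a tracker might report it (hypothetical fields)."""
    pitch_deg: float   # forward/backward tilt of the palm, in degrees
    roll_deg: float    # sideways tilt
    yaw_deg: float     # rotation of the hand about the vertical axis
    height_m: float    # palm height above the sensor, in metres

NEUTRAL_HEIGHT_M = 0.25  # assumed "hover" height of the hand
DEADZONE_DEG = 5.0       # ignore small tilts so a roughly level palm means "hold still"

def _scaled(angle_deg: float, max_deg: float = 45.0) -> float:
    """Map an angle to a command in [-1, 1], with a deadzone around level."""
    if abs(angle_deg) < DEADZONE_DEG:
        return 0.0
    return max(-1.0, min(1.0, angle_deg / max_deg))

def palm_to_velocity(pose: PalmPose) -> dict:
    """Translate a palm pose into normalised velocity commands for the drone."""
    return {
        "forward": _scaled(pose.pitch_deg),    # pitch the hand forward -> fly forward
        "lateral": _scaled(pose.roll_deg),     # tilt sideways -> move sideways
        "yaw_rate": _scaled(pose.yaw_deg),     # rotate the hand -> drone yaws
        # raise or lower the hand relative to the neutral height -> ascend/descend
        "vertical": _scaled((pose.height_m - NEUTRAL_HEIGHT_M) * 200.0),
    }
```

With this mapping, a level palm at the neutral height yields all-zero commands (a hovering drone), matching the behaviour described above.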
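The finger-based "up/down" gesture can be sketched the same way: the index finger's elevation angle sets both direction and speed. The angle convention and maximum rate below are assumptions for illustration, not values from the FASTER system.

```python
def pistol_to_vertical_speed(index_angle_deg: float,
                             max_speed_mps: float = 2.0) -> float:
    """Map the index finger's elevation angle to a climb/descent rate.

    Assumed convention: 0 deg = finger level with the horizon (hover),
    +90 deg = pointing straight up (fastest ascent),
    -90 deg = pointing straight down (fastest descent).
    """
    clamped = max(-90.0, min(90.0, index_angle_deg))
    return max_speed_mps * clamped / 90.0
```

Pointing straight at the sky gives the full climb rate, while an angle just above the horizon gives a gradual ascent, as described above.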
Gesture control has been implemented for three different hardware sets: the LeapMotion infrared peripheral, a standard webcam, and the Microsoft HoloLens. It has been connected to a drone simulator for training, as well as to real drones for testing in actual flight.
Early versions of the tool were tested with local first responders in Thessaloniki, first in simulation and later with a real drone, and in the first Greek pilot in Athens, on October 2, 2020.