Publication | Closed Access
Hand Gesture-based Wearable Human-Drone Interface for Intuitive Movement Control
Citations: 22
References: 15
Year: 2019
Venue: Unknown
Topics: Engineering, Field Robotics, Wearable Technology, Flying Robot, Motor Control, Gazebo Simulator, Kinesiology, Motion Capture, Unmanned System, Raspberry Pi Zero, Robot Learning, Kinematics, Gesture Processing, Health Sciences, Intuitive Movement Control, Gesture Recognition, Aerial Robotics, Aerospace Engineering, Radio Control, Human Movement, Robotics
Although the Radio Control (RC) transmitter has been the dominant device for controlling a drone, a fair amount of training is known to be required to master it. One way to sidestep an RC-based control scheme is to use a Kinect or Leap Motion sensor, which lets the user interact with a drone more naturally. In such cases, however, the pilot has to stay near the sensor, since the operating range of these sensors is rather short. In this study, we propose a new wearable human-drone interface embedded on a Raspberry Pi Zero, with which even a novice can make the drone take off, land, and fly in the intended direction according to hand-pose gestures, and can also produce diverse flight trajectories such as circles, squares, and spirals using a sequence of hand gestures. Results from the Gazebo simulator and several field experiments, combined with a personalized calibration program, demonstrate the feasibility of commercial application.
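The abstract describes two modes of control: single hand poses trigger basic maneuvers (take-off, land, directional flight), while short gesture sequences select preset trajectories (circle, square, spiral). A minimal sketch of such a gesture-interpretation layer is shown below; all gesture names, commands, and velocity values are illustrative assumptions, not the paper's actual vocabulary.

```python
# Hypothetical gesture-to-command mapping, sketched from the abstract's
# description. Gesture labels and velocity values are invented for
# illustration; the paper does not specify them.

# Single-pose commands: (command, (vx, vy, vz)) body-frame set-points in m/s.
POSE_COMMANDS = {
    "open_palm":   ("takeoff", (0.0, 0.0, 0.5)),
    "fist":        ("land",    (0.0, 0.0, -0.5)),
    "point_left":  ("move",    (0.0, -1.0, 0.0)),
    "point_right": ("move",    (0.0, 1.0, 0.0)),
    "point_up":    ("move",    (1.0, 0.0, 0.0)),
}

# Gesture sequences that select a preset flight trajectory.
TRAJECTORY_SEQUENCES = {
    ("open_palm", "point_right", "fist"): "circle",
    ("open_palm", "point_up", "fist"):    "square",
    ("open_palm", "point_left", "fist"):  "spiral",
}

def interpret(gesture_buffer):
    """Return a preset trajectory if the buffered gestures match a known
    sequence; otherwise return the command for the most recent pose,
    defaulting to hover for an unrecognized pose."""
    key = tuple(gesture_buffer)
    if key in TRAJECTORY_SEQUENCES:
        return ("trajectory", TRAJECTORY_SEQUENCES[key])
    return POSE_COMMANDS.get(gesture_buffer[-1], ("hover", (0.0, 0.0, 0.0)))
```

In a real system the buffer would be fed by the gesture recognizer and cleared on a timeout, so that a slow sequence degrades gracefully into single-pose commands.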