Technical Papers
Flycon: Real-time Environment-independent Multi-view Human Pose Estimation with Aerial Vehicles
Event Type: Technical Papers
Time: Wednesday, 5 December 2018, 4:41pm - 5:07pm
Description: We propose a real-time method for the infrastructure-free estimation of
articulated human motion. The approach leverages a swarm of camera-equipped
flying robots and jointly optimizes the swarm’s and skeletal states,
which include the 3D joint positions and a set of bones. Our method can track
the motion of human subjects, for example an athlete, over long time
horizons and long distances, in challenging settings and at large scale, where
fixed infrastructure approaches are not applicable. The proposed algorithm
uses active infrared markers, runs in real time, and accurately estimates
robot and human pose parameters online, without the need for accurately
calibrated or statically mounted cameras. Our method i) estimates a global
coordinate frame for the MAV swarm, ii) jointly optimizes the human pose
and relative camera positions, and iii) estimates the length of the human
bones. The entire swarm is then controlled via a model predictive controller
to maximize visibility of the subject from multiple viewpoints even under
fast motion such as jumping or jogging. We demonstrate our method in a number of difficult scenarios including capture of long locomotion sequences at the scale of a triplex gym, in non-planar terrain, while climbing and in outdoor scenarios.