Abstract
Pursuit robots (autonomous robots tasked with tracking and pursuing a moving target) require accurate tracking of the target's position over time. One promising pursuit platform is a quadcopter equipped with basic sensors and a monocular camera. However, the combined noise of the quadcopter's sensors causes large disturbances in the target's 3D position estimate. To address this problem, we propose a novel method for the joint localization of a quadcopter pursuer with a monocular camera and an arbitrary target. Our method localizes both the pursuer and the target with respect to a common reference frame by fusing the quadcopter's kinematics and the target's dynamics in a joint state-space model. We show that predicting and correcting the pursuer and target trajectories simultaneously produces better results than standard approaches that estimate relative target trajectories in a 3D coordinate system. Our method also includes a computationally efficient visual tracker capable of redetecting a temporarily lost target. We demonstrate the proposed method in a series of experiments with a real quadcopter pursuing a human. The results show that the visual tracker deals effectively with target occlusions and that joint localization outperforms standard localization methods.
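The abstract's central idea, stacking the pursuer and target states in one model and running the predict/correct cycle jointly, can be illustrated with a linear Kalman filter. The sketch below is a minimal illustration, not the paper's actual method: it assumes constant-velocity models for both agents, a pursuer-position measurement from onboard sensors, a relative target position backprojected from the monocular camera, and illustrative noise values.

```python
import numpy as np

# Minimal sketch of joint pursuer/target state estimation with a linear
# Kalman filter. State x = [p_pursuer, v_pursuer, p_target, v_target],
# each a 3-vector, expressed in a common world frame. All model matrices
# and noise levels are illustrative assumptions, not taken from the paper.

dt = 1.0 / 30.0                       # assumed filter rate (30 Hz)
I3 = np.eye(3)

# Constant-velocity dynamics for both pursuer and target (block-diagonal).
F_cv = np.block([[I3, dt * I3],
                 [np.zeros((3, 3)), I3]])
F = np.block([[F_cv, np.zeros((6, 6))],
              [np.zeros((6, 6)), F_cv]])

# Measurements: (a) pursuer position from onboard sensors,
# (b) target position relative to the pursuer from the camera.
H = np.zeros((6, 12))
H[0:3, 0:3] = I3                      # z_pursuer  = p_pursuer
H[3:6, 0:3] = -I3                     # z_relative = p_target - p_pursuer
H[3:6, 6:9] = I3

Q = 1e-3 * np.eye(12)                 # assumed process noise
R = np.diag([0.05] * 3 + [0.5] * 3)   # camera-derived term is noisier

def predict(x, P):
    """Propagate the joint pursuer/target state one step."""
    return F @ x, F @ P @ F.T + Q

def correct(x, P, z):
    """Fuse pursuer-position and relative-target measurements jointly."""
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.solve(S, np.eye(6))  # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(12) - K @ H) @ P
    return x, P

# Usage: one predict/correct cycle with dummy measurements.
x, P = np.zeros(12), np.eye(12)
z = np.array([0.0, 0.0, 1.5,          # pursuer position (e.g. odometry)
              2.0, 0.5, -1.5])        # target position relative to pursuer
x, P = predict(x, P)
x, P = correct(x, P, z)
print("estimated target world position:", x[6:9])
```

Because the relative camera measurement couples the two position blocks through H, each correction step refines the pursuer and target estimates simultaneously, which is the joint-localization benefit the abstract describes.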
| Original language | English |
|---|---|
| Pages (from-to) | 613-630 |
| Number of pages | 18 |
| Journal | Journal of Intelligent and Robotic Systems: Theory and Applications |
| Volume | 78 |
| Issue number | 3-4 |
| DOIs | |
| Publication status | Published - 16 Jun 2015 |
| Externally published | Yes |
Keywords
- AR.Drone
- Backprojection
- Joint localization
- Monocular cues
- Pursuit robot
- Quadcopters
- Redetection
- State estimation filters
- Visual tracking