UKF-based visual tracking with eye-in-hand camera

Yiping CHEN, Xiang LI, Hesheng WANG, Qingyun LI

Research output: Book Chapters | Papers in Conference Proceedings › Conference paper (refereed) › Research › peer-review

Abstract

This paper presents a new image-based visual tracking method for a robot manipulator to trace a moving target using a monocular camera mounted on the end-effector. The intrinsic and extrinsic parameters of the camera are assumed to be known in advance. To estimate the 3D position of the rigid-body target, three non-collinear points on the rigid body are selected as reference points, under the assumption that these points are never occluded by the rigid body itself. Based on this assumption, a new point-adaptive Unscented Kalman Filter (UKF) algorithm is employed to track the rigid body and estimate its motion in real time. Simulation and experimental results are included to illustrate the performance of the proposed method on a 3-degree-of-freedom (DOF) robot manipulator.
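
The following is a minimal sketch of the general idea of unscented Kalman filtering for a camera-tracked point, not the authors' algorithm: it propagates an assumed constant-velocity state through sigma points and corrects it with a pinhole-projection measurement. The intrinsics (fx, fy, cx, cy), sample time, noise covariances, and the single-point state are illustrative assumptions only.

    # Minimal UKF sketch for tracking one 3D point from its 2D image projection.
    # All parameters below are assumed for illustration, not taken from the paper.
    import numpy as np

    fx, fy, cx, cy = 500.0, 500.0, 320.0, 240.0   # assumed camera intrinsics
    dt = 0.05                                      # assumed sample time [s]

    def f(x):
        """Constant-velocity process model: state = [px, py, pz, vx, vy, vz]."""
        F = np.eye(6)
        F[:3, 3:] = dt * np.eye(3)
        return F @ x

    def h(x):
        """Pinhole projection of the 3D point onto the image plane."""
        px, py, pz = x[:3]
        return np.array([fx * px / pz + cx, fy * py / pz + cy])

    def sigma_points(mean, cov, kappa=1.0):
        """Standard unscented-transform sigma points and weights."""
        n = mean.size
        S = np.linalg.cholesky((n + kappa) * cov)
        pts = [mean] + [mean + S[:, i] for i in range(n)] + [mean - S[:, i] for i in range(n)]
        w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
        w[0] = kappa / (n + kappa)
        return np.array(pts), w

    def ukf_step(mean, cov, z, Q, R):
        """One predict/update cycle of the unscented Kalman filter."""
        # Predict: propagate sigma points through the motion model.
        X, w = sigma_points(mean, cov)
        Xp = np.array([f(x) for x in X])
        mp = w @ Xp
        Pp = Q + sum(wi * np.outer(xi - mp, xi - mp) for wi, xi in zip(w, Xp))

        # Update: project the predicted sigma points into the image.
        X, w = sigma_points(mp, Pp)
        Z = np.array([h(x) for x in X])
        zm = w @ Z
        Pzz = R + sum(wi * np.outer(zi - zm, zi - zm) for wi, zi in zip(w, Z))
        Pxz = sum(wi * np.outer(xi - mp, zi - zm) for wi, xi, zi in zip(w, X, Z))
        K = Pxz @ np.linalg.inv(Pzz)
        return mp + K @ (z - zm), Pp - K @ Pzz @ K.T

    # Example use: a point assumed to start 1 m in front of the camera.
    mean = np.array([0.0, 0.0, 1.0, 0.0, 0.0, 0.0])
    cov = np.eye(6) * 0.1
    Q = np.eye(6) * 1e-4
    R = np.eye(2) * 1.0
    measurement = np.array([330.0, 250.0])         # example pixel observation
    mean, cov = ukf_step(mean, cov, measurement, Q, R)
    print(mean[:3])                                 # estimated 3D position

In the paper's setting, three such non-collinear reference points would be tracked jointly to recover the rigid body's pose rather than a single point's position.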

Original language: English
Title of host publication: 38th International Conference on Computers and Industrial Engineering 2008
Publisher: Curran Associates Inc
Pages: 2599-2605
Number of pages: 7
ISBN (Print): 9781627486828
Publication status: Published - 2008
Externally published: Yes

Keywords

  • Monocular camera
  • UKF
  • Visual tracking
