Interactive markerless articulated hand motion tracking using RGB and depth data
Proceedings of the IEEE International Conference on Computer Vision (ICCV), 2013
Abstract
Tracking the articulated 3D motion of the hand has important applications, for example, in human-computer interaction and teleoperation. We present a novel method that can capture a broad range of articulated hand motions at interactive rates. Our hybrid approach combines, in a voting scheme, a discriminative, part-based pose retrieval method with a generative pose estimation method based on local optimization. Color information from a multiview RGB camera setup, along with a person-specific hand model, is used by the generative method to find the pose that best explains the observed images. In parallel, our discriminative pose estimation method uses fingertips detected on depth data to estimate a complete or partial pose of the hand by adopting a part-based pose retrieval strategy. This part-based strategy drastically reduces the search space in comparison to a global pose retrieval strategy. Quantitative results show that our method achieves state-of-the-art accuracy on challenging sequences and near-real-time performance of 10 fps on a desktop computer.
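The abstract describes a hybrid pipeline: one pose candidate comes from a generative local optimizer that refines the previous frame's pose against the multiview RGB evidence, further candidates come from a discriminative, part-based retrieval driven by fingertips detected in the depth data, and a voting step keeps whichever candidate best explains the images. The Python sketch below only illustrates that voting idea on toy data; the pose representation, the error function, and every name in it (image_error, generative_candidate, discriminative_candidates, vote) are hypothetical stand-ins, not the authors' implementation.

```python
# Toy sketch of a hybrid generative/discriminative voting scheme for hand pose
# estimation. All quantities are synthetic; no rendering or depth processing is done.
import numpy as np

rng = np.random.default_rng(0)
POSE_DIM = 26  # typical articulated hand parameterization (global pose + joint angles)

def image_error(pose, observation):
    """Stand-in for the generative objective: how poorly the hand model in this
    pose explains the multiview RGB observations (lower is better)."""
    return float(np.sum((pose - observation) ** 2))

def generative_candidate(prev_pose, observation, steps=50, sigma=0.05):
    """Generative candidate: simple stochastic local optimization around the
    previous frame's pose (a placeholder for the paper's local optimizer)."""
    best = prev_pose
    for _ in range(steps):
        proposal = best + rng.normal(0.0, sigma, POSE_DIM)
        if image_error(proposal, observation) < image_error(best, observation):
            best = proposal
    return best

def discriminative_candidates(fingertips, database):
    """Discriminative candidates: part-based retrieval keyed on detected
    fingertips, which narrows the search compared to whole-pose lookup.
    Here: just return the few database poses closest to the fingertip evidence."""
    dists = np.linalg.norm(database[:, :fingertips.size] - fingertips, axis=1)
    return [database[i] for i in np.argsort(dists)[:3]]

def vote(candidates, observation):
    """Voting: score every candidate with the generative error, keep the best."""
    return min(candidates, key=lambda p: image_error(p, observation))

# Toy usage on synthetic data.
true_pose = rng.normal(size=POSE_DIM)              # pose the "images" show
observation = true_pose                            # stand-in for multiview RGB evidence
prev_pose = true_pose + rng.normal(0, 0.3, POSE_DIM)
database = rng.normal(size=(200, POSE_DIM))        # pre-recorded pose exemplars
fingertips = true_pose[:5]                         # stand-in for depth fingertip detections

candidates = [generative_candidate(prev_pose, observation)]
candidates += discriminative_candidates(fingertips, database)
estimate = vote(candidates, observation)
print("final pose error:", image_error(estimate, observation))
```

The design point the sketch captures is that the two candidate sources fail differently: local optimization can drift or get stuck after fast motion, while retrieval is noisy but pose-independent, and scoring both against the same image-explanation error lets the better hypothesis win each frame.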