Dense multibody motion estimation and reconstruction from a handheld camera

Date

2012

Authors

Roussos, A.
Russell, C.
Garg, R.
Agapito, L.

Type

Conference paper

Citation

2012 IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2012), 2012, pp. 31-40

Statement of Responsibility

Anastasios Roussos, Chris Russell, Ravi Garg, Lourdes Agapito

Conference Name

2012 IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2012) (5 Nov 2012 - 8 Nov 2012: Atlanta, Georgia)

Abstract

Existing approaches to camera tracking and reconstruction from a single handheld camera for Augmented Reality (AR) focus on the reconstruction of static scenes. However, most real-world scenarios are dynamic and contain multiple independently moving rigid objects. This paper addresses the problem of simultaneous segmentation, motion estimation and dense 3D reconstruction of dynamic scenes. We propose a dense solution to all three elements of this problem: depth estimation, motion label assignment and rigid transformation estimation directly from the raw video, by optimizing a single cost function with a hill-climbing approach. We do not require prior knowledge of the number of objects present in the scene: the number of independent motion models and their parameters are estimated automatically. The resulting inference method combines the best techniques in discrete and continuous optimization: a state-of-the-art variational approach estimates the dense depth maps, while the motion segmentation is obtained by discrete graph-cut optimization. For the rigid motion estimation of the independently moving objects, we propose a novel tracking approach designed to cope with their agile motion and the small fields of view they induce. Our experimental results on real sequences show that accurate segmentations and dense depth maps can be obtained fully automatically and used in marker-free AR applications.
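
The abstract describes an alternating scheme: continuous variational depth estimation, discrete graph-cut label assignment, and per-object rigid motion refinement, all driven by a single cost that is only ever decreased (hill climbing), with the number of motion models discovered automatically. The Python sketch below illustrates that control flow only; every helper name (estimate_depth_variational, segment_labels_graph_cut, refine_rigid_motions, propose_motion_models, joint_cost and the initialisers) is a hypothetical placeholder, not the authors' implementation.

def dense_multibody_reconstruction(frames, max_iters=20, tol=1e-4):
    """Structural sketch of the alternating hill-climbing optimisation (hypothetical helpers)."""
    # Start from a single motion model (the static-scene hypothesis);
    # further models are kept only if they lower the joint cost.
    motions = initialise_motion_models(frames)    # rigid transforms, one per object
    labels = initialise_labels(frames, motions)   # per-pixel motion-model assignment
    depth = initialise_depth(frames)              # dense depth maps

    prev_cost = float("inf")
    for _ in range(max_iters):
        # Continuous step: variational dense depth estimation given the
        # current segmentation and rigid motions.
        depth = estimate_depth_variational(frames, labels, motions)

        # Discrete step: per-pixel motion-label assignment via graph cuts.
        labels = segment_labels_graph_cut(frames, depth, motions)

        # Continuous step: re-estimate each object's rigid transform with a
        # tracker robust to agile motion and small effective fields of view.
        motions = refine_rigid_motions(frames, depth, labels, motions)

        # Propose splitting or merging motion models so the number of
        # independently moving objects need not be known in advance.
        motions, labels = propose_motion_models(frames, depth, labels, motions)

        cost = joint_cost(frames, depth, labels, motions)
        if prev_cost - cost < tol:  # hill climbing: stop once the cost no longer drops
            break
        prev_cost = cost

    return depth, labels, motions

Each block update is accepted only if it decreases the joint cost, which is what makes the procedure a hill climb rather than a fixed-point iteration.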

Rights

©2012 IEEE
