Divide and conquer: efficient density-based tracking of 3D sensors in Manhattan worlds
Date
2017
Authors
Zhou, Y.
Kneip, L.
Rodriguez Opazo, C.
Li, H.
Editors
Lai, S.H.
Lepetit, V.
Nishino, K.
Sato, Y.
Type
Conference paper
Citation
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2017 / Lai, S.H., Lepetit, V., Nishino, K., Sato, Y. (ed./s), vol.10115 LNCS, pp.3-19
Statement of Responsibility
Shang-Hong Lai, Vincent Lepetit, Ko Nishino, Yoichi Sato
Conference Name
13th Asian Conference on Computer Vision (20 Nov 2016 - 24 Nov 2016 : Taipei)
Abstract
3D depth sensors such as LIDARs and RGB-D cameras have
become a popular choice for indoor localization and mapping. However,
due to the lack of direct frame-to-frame correspondences, the tracking
traditionally relies on the iterative closest point technique which does
not scale well with the number of points. In this paper, we build on top
of more recent and efficient density distribution alignment methods, and
notably push the idea towards a highly efficient and reliable solution
for full 6DoF motion estimation with only depth information. We propose
a divide-and-conquer technique during which the estimation of the
rotation and the three degrees of freedom of the translation are all decoupled
from one another. The rotation is estimated absolutely and driftfree
by exploiting the orthogonal structure in man-made environments.
The underlying algorithm is an efficient extension of the mean-shift paradigm
to manifold-constrained multiple-mode tracking. Dedicated projections
subsequently enable the estimation of the translation through
three simple 1D density alignment steps that can be executed in parallel.
An extensive evaluation on both simulated and publicly available real
datasets comparing several existing methods demonstrates outstanding
performance at low computational cost.
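For intuition only, a minimal sketch of the translation step described in the abstract is given below: once the rotation has been resolved against the Manhattan frame, points are projected onto a single axis and the 1D shift that best aligns the two projected densities is recovered. The function name `align_1d_density`, the histogram binning, and the overlap score are illustrative assumptions made for this sketch and do not reproduce the authors' algorithm.

```python
import numpy as np

def align_1d_density(p_ref, p_cur, bin_width=0.05, search_range=1.0):
    """Estimate the 1D translation that best aligns the density of
    `p_cur` (current-frame coordinates along one Manhattan axis) with
    `p_ref` (reference-frame coordinates along the same axis).

    Illustrative only: a simple histogram-overlap search, not the
    paper's exact density-alignment objective.
    """
    lo = min(p_ref.min(), p_cur.min()) - search_range
    hi = max(p_ref.max(), p_cur.max()) + search_range
    edges = np.arange(lo, hi + bin_width, bin_width)

    h_ref, _ = np.histogram(p_ref, bins=edges, density=True)
    best_t, best_score = 0.0, -np.inf
    # Exhaustive 1D search over candidate shifts; a gradient or
    # mean-shift style refinement could replace this loop.
    for t in np.arange(-search_range, search_range, bin_width):
        h_cur, _ = np.histogram(p_cur + t, bins=edges, density=True)
        score = np.dot(h_ref, h_cur)  # density overlap
        if score > best_score:
            best_score, best_t = score, t
    return best_t

# Toy usage: two planar "walls" along one axis, shifted by 0.3 m.
rng = np.random.default_rng(0)
ref = np.concatenate([rng.normal(0.0, 0.02, 500), rng.normal(2.0, 0.02, 500)])
cur = ref - 0.3 + rng.normal(0.0, 0.005, ref.size)
print(align_1d_density(ref, cur))  # approximately 0.3
```

Because the three translation components are decoupled after the rotation step, this kind of 1D alignment can be run independently (and in parallel) for each Manhattan axis.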
Rights
© Springer International Publishing AG 2017