Please use this identifier to cite or link to this item:
https://hdl.handle.net/2440/88025
Type: | Conference paper |
Title: | A dynamic programming approach to reconstructing building interiors |
Author: | Flint, A.; Mei, C.; Murray, D.; Reid, I. |
Citation: | Lecture Notes in Artificial Intelligence, 2010 / Daniilidis, K., Maragos, P., Paragios, N. (ed./s), vol.6315 LNCS, iss.PART 5, pp.394-407 |
Publisher: | Springer |
Publisher Place: | Berlin, Heidelberg |
Issue Date: | 2010 |
Series/Report no.: | Lecture Notes in Computer Science; 6315 |
ISBN: | 3642155545; 9783642155543 |
ISSN: | 0302-9743; 1611-3349 |
Conference Name: | European Conference on Computer Vision (ECCV) (5 Sep 2010 - 11 Sep 2010 : Heraklion, Crete, Greece) |
Editor: | Daniilidis, K.; Maragos, P.; Paragios, N. |
Statement of Responsibility: | Alex Flint, Christopher Mei, David Murray, and Ian Reid |
Abstract: | A number of recent papers have investigated reconstruction under the Manhattan world assumption, in which surfaces in the world are assumed to be aligned with one of three dominant directions [1,2,3,4]. In this paper we present a dynamic programming solution to the reconstruction problem for “indoor” Manhattan worlds (a sub-class of Manhattan worlds). Our algorithm deterministically finds the global optimum and exhibits computational complexity linear in both model complexity and image size. This is an important improvement over previous methods, which were either approximate [3] or exponential in model complexity [4]. We present results for a new dataset containing several hundred manually annotated images, which is released in conjunction with this paper. |
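To illustrate the kind of linear-time dynamic program the abstract refers to, the sketch below shows a generic column-wise DP, not the paper's actual algorithm: each image column is assigned one of `K` hypothetical wall labels from a per-column cost table `unary`, with a fixed `corner_penalty` charged at each label change. All names and the cost model are assumptions for illustration only; the runtime is O(N·K²), i.e. linear in image width for fixed K.

```python
def reconstruct_columns(unary, corner_penalty):
    """Minimum-cost labeling of image columns via dynamic programming.

    unary[c][k] is the cost of assigning label k to column c;
    each label change between adjacent columns costs corner_penalty.
    Returns (optimal cost, list of per-column labels).
    """
    n = len(unary)       # number of image columns
    k = len(unary[0])    # number of candidate labels
    INF = float("inf")

    # best[j] = minimum cost of labeling columns 0..c with column c labeled j
    best = list(unary[0])
    back = [[0] * k for _ in range(n)]  # backpointers for path recovery

    for c in range(1, n):
        new_best = [INF] * k
        for j in range(k):
            for i in range(k):
                cost = best[i] + (0 if i == j else corner_penalty) + unary[c][j]
                if cost < new_best[j]:
                    new_best[j] = cost
                    back[c][j] = i
        best = new_best

    # backtrack the optimal label sequence from the cheapest final label
    j = min(range(k), key=lambda x: best[x])
    labels = [j]
    for c in range(n - 1, 0, -1):
        j = back[c][j]
        labels.append(j)
    labels.reverse()
    return min(best), labels
```

For example, with two labels and costs favoring label 0 in the first two columns and label 1 in the last two, the DP pays exactly one corner penalty at the switch: `reconstruct_columns([[0, 5], [0, 5], [5, 0], [5, 0]], 1.0)` returns `(1.0, [0, 0, 1, 1])`.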
Rights: | © Springer-Verlag Berlin Heidelberg 2010 |
DOI: | 10.1007/978-3-642-15555-0_29 |
Published version: | http://dx.doi.org/10.1007/978-3-642-15555-0_29 |
Appears in Collections: | Aurora harvest 2 Computer Science publications |
Files in This Item:
There are no files associated with this item.