Geometrically consistent plane extraction for dense indoor 3D maps segmentation

Files

RA_hdl_105568.pdf (5.24 MB)
  (Restricted access)

Date

2016

Authors

Pham, T.
Eich, M.
Reid, I.
Wyeth, G.

Type

Conference paper

Citation

Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE/RSJ International Conference on Intelligent Robots and Systems, 2016, vol.2016-November, pp.4199-4204

Statement of Responsibility

Trung T. Pham, Markus Eich, Ian Reid, Gordon Wyeth

Conference Name

IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2016) (9 Oct 2016 - 14 Oct 2016 : Daejeon, SOUTH KOREA)

Abstract

Modern SLAM systems with a depth sensor are able to reliably reconstruct dense 3D geometric maps of indoor scenes. Representing these maps in terms of meaningful entities is a step towards building semantic maps for autonomous robots. One approach is to segment the 3D maps into semantic objects using Conditional Random Fields (CRF), which requires large 3D ground truth datasets to train the classification model. Additionally, the CRF inference is often computationally expensive. In this paper, we present an unsupervised geometric-based approach for the segmentation of 3D point clouds into objects and meaningful scene structures. We approximate an input point cloud by an adjacency graph over surface patches, whose edges are then classified as being either on or off. We devise an effective classifier which utilises both global planar surfaces and local surface convexities for edge classification. More importantly, we propose a novel global plane extraction algorithm for robustly discovering the underlying planes in the scene. Our algorithm is able to constrain the extracted planes to be mutually orthogonal or parallel, which usually conforms to the structure of human-made indoor environments. We reconstruct 654 3D indoor scenes from NYUv2 sequences to validate the efficiency and effectiveness of our segmentation method.
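The pipeline the abstract describes — an adjacency graph over surface patches whose edges are classified as on or off, with the resulting connected components forming the segments — can be sketched with a toy local-convexity edge test. This is only an illustration under simplifying assumptions: the paper's classifier also uses globally extracted planes, and the function name, inputs, and threshold below are hypothetical, not taken from the paper.

```python
import numpy as np

def segment_patches(normals, centroids, edges, convexity_tol=0.0):
    """Toy graph-based segmentation: classify each edge of the patch
    adjacency graph as 'on' (same object) or 'off', then take the
    connected components of the 'on' edges via union-find.

    An edge (i, j) is kept 'on' when the two patches meet at a locally
    convex junction, using the common normals-vs-displacement test:
    n_i . d - n_j . d <= tol, where d is the unit vector from
    centroid i to centroid j. Concave junctions (e.g. floor meeting a
    wall) fail the test and stay 'off'.
    """
    parent = list(range(len(normals)))

    def find(x):
        # Union-find root lookup with path halving.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for i, j in edges:
        d = centroids[j] - centroids[i]
        d = d / np.linalg.norm(d)
        if np.dot(normals[i], d) - np.dot(normals[j], d) <= convexity_tol:
            parent[find(i)] = find(j)  # merge the two patches

    # Return a component label per patch.
    return [find(i) for i in range(len(normals))]
```

For example, two faces of a box (a convex junction) end up in one segment, while a floor patch and a wall patch (a concave junction) remain separate.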

Rights

©2016 IEEE
