Type: Conference paper
Title: Place categorization using sparse and redundant representations
Author: Carrillo, H.
Latif, Y.
Neira, J.
Castellanos, J.
Citation: Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE/RSJ International Conference on Intelligent Robots and Systems, 2014, pp.4950-4957
Publisher: IEEE
Issue Date: 2014
Series/Report no.: IEEE International Conference on Intelligent Robots and Systems
ISBN: 978-1-4799-6934-0
ISSN: 2153-0858
Conference Name: 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2014) (14 Sep 2014 - 18 Sep 2014 : Chicago, IL, USA)
Statement of Responsibility: Henry Carrillo, Yasir Latif, José Neira and José A. Castellanos
Abstract: Place categorization addresses the problem of determining the semantic label of the current position of a robot, given a snapshot of the environment as well as previously labeled information about different places that the robot has already seen. State-of-the-art approaches use machine learning techniques that require extensive and often time-consuming training. This work proposes a novel formulation by posing place categorization as an efficient ℓ1-minimization problem, leading to both a faster training phase and performance comparable to state-of-the-art methods. The formulation allows online robot operation, particularly when training has to be carried out on-the-fly and in an active manner. To validate the performance of the proposed method, extensive experimental results on real data under different lighting conditions as well as structural changes in the environment are provided.
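The abstract poses place categorization as an ℓ1-minimization problem over a dictionary of labeled snapshots. As a rough illustration of that idea (not the authors' implementation), the sketch below solves the sparse-coding step with plain ISTA (iterative soft-thresholding) in NumPy and then assigns the class whose atoms best reconstruct the query, in the style of sparse-representation classification; the function names, the solver choice, and the toy data are all assumptions for illustration.

```python
import numpy as np

def ista(D, y, lam=0.1, n_iter=200):
    """Approximately solve min_x 0.5*||D x - y||^2 + lam*||x||_1
    via ISTA (iterative soft-thresholding)."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ x - y)           # gradient of the quadratic term
        z = x - grad / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

def classify(D, labels, y, lam=0.1):
    """Assign y to the class whose dictionary atoms best reconstruct it
    using only that class's sparse coefficients (residual rule)."""
    x = ista(D, y, lam)
    residuals = {}
    for c in np.unique(labels):
        mask = labels == c
        residuals[c] = np.linalg.norm(y - D[:, mask] @ x[mask])
    return min(residuals, key=residuals.get)

# Toy example: columns of D are (normalized) labeled snapshots.
D = np.column_stack([[1.0, 0.0, 0.0], [0.9, 0.1, 0.0],
                     [0.0, 1.0, 0.0], [0.1, 0.9, 0.0]])
D /= np.linalg.norm(D, axis=0)
labels = np.array([0, 0, 1, 1])
```

In practice the dictionary columns would be image descriptors of previously labeled places; adding a new labeled snapshot is just appending a column, which is what makes the training phase cheap compared to retraining a conventional classifier.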
Keywords: Dictionaries, vectors, robots
Rights: © 2014 IEEE
DOI: 10.1109/IROS.2014.6943266
Appears in Collections:Aurora harvest 8
Computer Science publications

Files in This Item:
File: Restricted Access (367.52 kB, Adobe PDF)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.