Type: Journal article
Title: Efficiently learning a detection cascade with sparse eigenvectors
Author: Shen, C.
Paisitkriangkrai, S.
Zhang, J.
Citation: IEEE Transactions on Image Processing, 2011; 20(1):22-35
Publisher: IEEE-Inst Electrical Electronics Engineers Inc
Issue Date: 2011
ISSN: 1057-7149
Statement of Responsibility:
Chunhua Shen, Sakrapee Paisitkriangkrai and Jian Zhang
Abstract: Real-time object detection has many computer vision applications. Since Viola and Jones [1] proposed the first real-time AdaBoost-based face detection system, much effort has been spent on improving the boosting method. In this work, we first show that feature selection methods other than boosting can also be used for training an efficient object detector. In particular, we introduce greedy sparse linear discriminant analysis (GSLDA) [2] for its conceptual simplicity and computational efficiency, and achieve slightly better detection performance compared with [1]. Moreover, we propose a new technique, termed boosted greedy sparse linear discriminant analysis (BGSLDA), to efficiently train a detection cascade. BGSLDA exploits the sample reweighting property of boosting and the class-separability criterion of GSLDA. Experiments in the domain of highly skewed data distributions (e.g., face detection) demonstrate that classifiers trained with the proposed BGSLDA outperform AdaBoost and its variants. This finding shows that AdaBoost and similar approaches are not the only methods that can achieve high detection results for real-time object detection.
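The core idea of GSLDA described in the abstract, greedily selecting a sparse subset of features that maximizes the LDA class-separability criterion, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names, the synthetic data, and the simple two-class Fisher criterion are assumptions for demonstration purposes.

```python
import numpy as np

def fisher_score(X, y, idx):
    # Two-class Fisher criterion for the feature subset `idx`:
    # max_w (w^T S_b w) / (w^T S_w w) = d^T S_w^{-1} d,
    # where d is the difference of class means and S_w the within-class scatter.
    Xs = X[:, idx]
    X0, X1 = Xs[y == 0], Xs[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = (np.cov(X0, rowvar=False, bias=True) * len(X0)
          + np.cov(X1, rowvar=False, bias=True) * len(X1))
    Sw = np.atleast_2d(Sw) + 1e-6 * np.eye(len(idx))  # regularize for stability
    d = m0 - m1
    return float(d @ np.linalg.solve(Sw, d))

def greedy_sparse_lda(X, y, k):
    """Forward-select k features that greedily maximize the Fisher criterion."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(k):
        best = max(remaining, key=lambda j: fisher_score(X, y, selected + [j]))
        selected.append(best)
        remaining.remove(best)
    return selected

# Synthetic data: only features 0 and 3 separate the two classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = np.repeat([0, 1], 100)
X[y == 1, 0] += 3.0
X[y == 1, 3] += 3.0
print(greedy_sparse_lda(X, y, 2))  # picks the two discriminative features
```

In the paper's setting, each candidate feature would be a weak classifier (e.g., a Haar-like feature response), and BGSLDA would additionally reweight the training samples between rounds, in the manner of boosting, before each greedy selection step.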
Keywords: AdaBoost
cascade classifier
greedy sparse linear discriminant analysis (GSLDA)
object detection
Rights: © 2010 IEEE
DOI: 10.1109/TIP.2010.2055880
Grant ID: ARC
Published version:
Appears in Collections:Aurora harvest
Computer Science publications

Files in This Item:
There are no files associated with this item.
