|Title:||Incremental Kernel PCA for Efficient Non-linear Feature Extraction|
|Citation:||Proceedings of the 17th British Machine Vision Conference, Edinburgh, U.K., 2006: pp.939-948|
|Publisher:||The British Machine Vision Association|
|Conference Name:||British Machine Vision Conference (17th : 2006 : Edinburgh)|
|Authors:||Tat-Jun Chin and David Suter|
|Abstract:||The Kernel Principal Component Analysis (KPCA) has been effectively applied as an unsupervised non-linear feature extractor in many machine learning applications. However, with a time complexity of O(n³), the practicality of KPCA on large datasets is minimal. In this paper, we propose an approximate incremental KPCA algorithm which allows efficient processing of large datasets. We extend a linear PCA updating algorithm to the non-linear case by utilizing the kernel trick, and apply a reduced set construction method to compress expressions for the derived KPCA basis at each update. In addition, we show how multiple feature space vectors can be compressed efficiently, and how approximated KPCA bases can be re-orthogonalized using the kernel trick. The proposed method is justified through experimental validations.|
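For context, a minimal sketch of the batch KPCA baseline the abstract refers to (the O(n³) eigendecomposition of the centred kernel matrix) is shown below. This is an illustrative NumPy implementation of standard KPCA, not the authors' incremental algorithm; all function names and parameters (`rbf_kernel`, `gamma`, `n_components`) are assumptions for illustration.

```python
import numpy as np

def rbf_kernel(X, gamma=0.5):
    """Gaussian (RBF) kernel matrix: k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=0.5):
    """Batch KPCA: project training points onto the leading feature-space PCs."""
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    # Centre the implicit feature-space data: K' = K - 1K - K1 + 1K1
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one
    # Full eigendecomposition costs O(n^3) -- the bottleneck that motivates
    # the paper's incremental, reduced-set approximation
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Scale eigenvectors so each feature-space principal axis has unit norm
    alphas = vecs / np.sqrt(np.maximum(vals, 1e-12))
    # Non-linear features: kernel projections onto the KPCA basis
    return Kc @ alphas

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
features = kernel_pca(X, n_components=2)
```

Because the kernel matrix is centred before the eigendecomposition, the extracted features have (numerically) zero mean, and distinct components are mutually orthogonal.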
|Appears in Collections:||Computer Science publications|