BoostML: An adaptive metric learning for nearest neighbor classification
Date
2010
Authors
Zaidi, N.
Squire, D.
Suter, D.
Editors
Zaki, M.J.
Yu, J.X.
Ravindran, B.
Pudi, V.
Type
Conference paper
Citation
Proceedings of the 14th Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD 2010), held in Hyderabad, India, 21-24 June 2010: pp.142-149
Statement of Responsibility
Nayyar Abbas Zaidi, David McG. Squire and David Suter
Conference Name
Pacific-Asia Conference on Knowledge Discovery and Data Mining (14th : 2010 : Hyderabad, India)
Abstract
A Nearest Neighbor (NN) classifier assumes that class-conditional probabilities are locally smooth. This assumption is often invalid in high dimensions, and significant bias can be introduced when the nearest neighbor rule is applied. This effect can be mitigated to some extent by using a locally adaptive metric. In this work we propose an adaptive metric learning algorithm that learns an optimal metric at the query point. We learn the distance metric using a feature relevance measure inspired by boosting. The modified metric results in a smooth neighborhood that leads to better classification results. We tested our technique on major UCI machine learning databases and compared the results to state-of-the-art techniques. Our method yields significant improvements in the performance of the K-NN classifier and also performs better than other techniques on major databases.
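The abstract describes the approach only at a high level: learn a per-query (locally adaptive) metric from a boosting-inspired feature relevance measure, then classify with K-NN under that metric. The sketch below is purely illustrative of this general idea and is not the BoostML algorithm from the paper; the function name weighted_knn_predict, the variance-ratio relevance score, and the neighborhood size k_local are placeholders introduced here for the example.

```python
# Illustrative sketch of a per-query weighted K-NN classifier (not BoostML).
# A crude local feature-relevance score stands in for the paper's
# boosting-based relevance measure.
import numpy as np

def weighted_knn_predict(X, y, query, k=5, k_local=50):
    """Classify `query` with K-NN under a per-query diagonal metric."""
    # 1. Take a coarse neighborhood of the query under plain Euclidean distance.
    d0 = np.linalg.norm(X - query, axis=1)
    local = np.argsort(d0)[:k_local]
    X_loc, y_loc = X[local], y[local]

    # 2. Score each feature's local relevance: spread of class-conditional
    #    means relative to the overall variance of that feature (placeholder
    #    for the boosting-inspired relevance measure).
    weights = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        col = X_loc[:, j]
        class_means = [col[y_loc == c].mean() for c in np.unique(y_loc)]
        weights[j] = np.var(class_means) / (col.var() + 1e-12)
    weights /= weights.sum() + 1e-12

    # 3. K-NN majority vote under the weighted (diagonal) metric.
    d = np.sqrt(((X - query) ** 2 * weights).sum(axis=1))
    nn = np.argsort(d)[:k]
    vals, counts = np.unique(y[nn], return_counts=True)
    return vals[np.argmax(counts)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: the class depends only on the first feature; the rest are noise,
    # so the local weighting should emphasize feature 0.
    X = rng.normal(size=(200, 5))
    y = (X[:, 0] > 0).astype(int)
    print(weighted_knn_predict(X, y, np.array([1.0, 0.0, 0.0, 0.0, 0.0])))
```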
Rights
Copyright Springer-Verlag Berlin Heidelberg 2010