Please use this identifier to cite or link to this item: https://hdl.handle.net/2440/85599
Type: Journal article
Title: Efficient dual approach to distance metric learning
Author: Shen, C.
Kim, J.
Liu, F.
Wang, L.
van den Hengel, A.
Citation: IEEE Transactions on Neural Networks and Learning Systems, 2014; 25(2):394-406
Publisher: Institute of Electrical and Electronics Engineers
Issue Date: 2014
ISSN: 2162-237X; 2162-2388
Statement of Responsibility: Chunhua Shen, Junae Kim, Fayao Liu, Lei Wang and Anton van den Hengel
Abstract: Distance metric learning is of fundamental interest in machine learning because the employed distance metric can significantly affect the performance of many learning methods. Quadratic Mahalanobis metric learning is a popular approach to the problem, but typically requires solving a semidefinite programming (SDP) problem, which is computationally expensive. The worst-case complexity of solving an SDP problem involving a matrix variable of size D × D with O(D) linear constraints is about O(D^6.5) using interior-point methods, where D is the dimension of the input data. Thus, interior-point methods can practically solve only problems with fewer than a few thousand variables. Because the number of variables is D(D + 1)/2, this limits the problems that can practically be solved to around a few hundred dimensions. The complexity of the popular quadratic Mahalanobis metric learning approach thus limits the size of problem to which metric learning can be applied. Here, we propose a significantly more efficient and scalable approach to the metric learning problem based on the Lagrange dual formulation of the problem. The proposed formulation is much simpler to implement, and therefore allows much larger Mahalanobis metric learning problems to be solved. The time complexity of the proposed method is roughly O(D^3), which is significantly lower than that of the SDP approach. Experiments on a variety of data sets demonstrate that the proposed method achieves an accuracy comparable with the state of the art, but is applicable to significantly larger problems. We also show that the proposed method can be applied to solve more general Frobenius norm regularized SDP problems approximately.
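A minimal sketch of the ideas the abstract describes (not the authors' code): the squared Mahalanobis distance under a learned positive semidefinite (PSD) matrix M is (x − y)ᵀM(x − y), and the O(D^3) bottleneck of Frobenius-norm regularized formulations is typically a projection of a symmetric matrix onto the PSD cone via a single eigen-decomposition, rather than a full interior-point SDP solve. The function names and the example matrix below are illustrative assumptions.

```python
import numpy as np

def project_psd(A):
    """Nearest PSD matrix to the symmetric matrix A in Frobenius norm.

    One eigen-decomposition, roughly O(D^3): clip negative eigenvalues to 0.
    """
    w, V = np.linalg.eigh(A)
    return (V * np.maximum(w, 0.0)) @ V.T

def mahalanobis_sq(x, y, M):
    """Squared Mahalanobis distance (x - y)^T M (x - y) under a PSD matrix M."""
    d = x - y
    return float(d @ M @ d)

# Illustrative example: an indefinite symmetric matrix ...
A = np.array([[2.0, 0.0],
              [0.0, -1.0]])
M = project_psd(A)  # ... projects to diag(2, 0) on the PSD cone
print(mahalanobis_sq(np.array([1.0, 1.0]), np.zeros(2), M))  # prints 2.0
```

With M equal to the identity, the metric reduces to the squared Euclidean distance; learning M amounts to reweighting and correlating feature dimensions.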
Keywords: Convex optimization; Lagrange duality; Mahalanobis distance; metric learning; semidefinite programming (SDP)
Rights: © 2013 IEEE
DOI: 10.1109/TNNLS.2013.2275170
Grant ID: http://purl.org/au-research/grants/arc/LP120200485
http://purl.org/au-research/grants/arc/FT120100969
Published version: http://dx.doi.org/10.1109/tnnls.2013.2275170
Appears in Collections: Aurora harvest 7
Computer Science publications

Files in This Item:
There are no files associated with this item.

