Please use this identifier to cite or link to this item:
Type: Conference paper
Title: Proximal Riemannian pursuit for large-scale trace-norm minimization
Author: Tan, M.
Xiao, S.
Gao, J.
Xu, D.
Van Den Hengel, A.
Shi, Q.
Citation: Proceedings of the 29th IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2016), 2016 / vol.2016-December, pp.5877-5886
Publisher: IEEE
Issue Date: 2016
Series/Report no.: IEEE Conference on Computer Vision and Pattern Recognition
ISBN: 9781467388511
ISSN: 1063-6919
Conference Name: 29th IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2016) (26 Jun 2016 - 01 Jul 2016 : Las Vegas, NV)
Statement of Responsibility: Mingkui Tan, Shijie Xiao, Junbin Gao, Dong Xu, Anton van den Hengel, Qinfeng Shi
Abstract: Trace-norm regularization plays an important role in many areas such as computer vision and machine learning. When solving general large-scale trace-norm regularized problems, existing methods may be computationally expensive due to many high-dimensional truncated singular value decompositions (SVDs) or the unawareness of matrix ranks. In this paper, we propose a proximal Riemannian pursuit (PRP) paradigm which addresses a sequence of trace-norm regularized subproblems defined on nonlinear matrix varieties. To address the subproblem, we extend the proximal gradient method on vector space to nonlinear matrix varieties, in which the SVDs of intermediate solutions are maintained by cheap low-rank QR decompositions, therefore making the proposed method more scalable. Empirical studies on several tasks, such as matrix completion and low-rank representation based subspace clustering, demonstrate the competitive performance of the proposed paradigms over existing methods.
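The abstract's core building block, the proximal operator of the trace (nuclear) norm, is the well-known singular value thresholding step: shrink each singular value by the regularization weight and truncate at zero. A minimal NumPy sketch of that standard operator (not the paper's full PRP algorithm, which additionally works on fixed-rank matrix varieties and avoids full SVDs via low-rank QR updates) is:

```python
import numpy as np

def svt(X, lam):
    """Singular value thresholding: prox of lam * ||.||_* at X.

    Computes U diag(max(s - lam, 0)) V^T, i.e. each singular value
    is shrunk by lam and clipped at zero, which lowers the rank
    once lam exceeds the smallest singular values.
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_shrunk = np.maximum(s - lam, 0.0)
    return U @ np.diag(s_shrunk) @ Vt
```

This full-SVD version costs O(mn·min(m, n)) per call, which is exactly the bottleneck the paper targets by maintaining SVDs of intermediate iterates with cheap low-rank QR decompositions instead.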
Rights: © 2016 IEEE
RMID: 0030056381
DOI: 10.1109/CVPR.2016.633
Grant ID:
Appears in Collections:Computer Science publications

Files in This Item:
File: RA_hdl_105570.pdf | Description: Restricted Access | Size: 278.5 kB | Format: Adobe PDF

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.