Please use this identifier to cite or link to this item: http://hdl.handle.net/2440/115580
Type: Journal article
Title: Embedding based on function approximation for large scale image search
Author: Do, T.
Cheung, N.
Citation: IEEE Transactions on Pattern Analysis and Machine Intelligence, 2018; 40(3):626-638
Publisher: IEEE
Issue Date: 2018
ISSN: 0162-8828; 2160-9292
Statement of Responsibility: Thanh-Toan Do and Ngai-Man Cheung
Abstract: The objective of this paper is to design an embedding method that maps local features describing an image (e.g., SIFT) to a higher dimensional representation useful for the image retrieval problem. First, motivated by the relationship between the linear approximation of a nonlinear function in high dimensional space and the state-of-the-art feature representation used in image retrieval, i.e., VLAD, we propose a new approach for the approximation. The embedded vectors resulting from the function approximation process are then aggregated to form a single representation for image retrieval. Second, in order to make the proposed embedding method applicable to large scale problems, we further derive a fast version in which the embedded vectors can be computed efficiently, i.e., in closed form. We compare the proposed embedding methods with the state of the art in the context of image search under various settings: when the images are represented by medium length vectors, short vectors, or binary vectors. The experimental results show that the proposed embedding methods outperform the existing state of the art on standard public image retrieval benchmarks.
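For context, the abstract refers to VLAD as the baseline representation that the proposed embedding generalizes. A minimal toy sketch of VLAD-style aggregation (nearest-codeword assignment, residual accumulation, concatenation, L2-normalization) might look as follows; this illustrates the baseline only, not the paper's proposed method, and the function name, codebook, and descriptors are hypothetical toy values:

```python
# Toy sketch of VLAD-style aggregation: for each local descriptor, find the
# nearest codeword, accumulate the residual (descriptor - codeword), then
# concatenate the per-codeword accumulators and L2-normalize the result.
# This is the baseline representation the abstract mentions, not the
# paper's proposed embedding.

def vlad(descriptors, codebook):
    d = len(codebook[0])
    # one residual accumulator per codeword
    acc = [[0.0] * d for _ in codebook]
    for x in descriptors:
        # nearest codeword by squared Euclidean distance
        k = min(range(len(codebook)),
                key=lambda i: sum((a - b) ** 2 for a, b in zip(x, codebook[i])))
        for j in range(d):
            acc[k][j] += x[j] - codebook[k][j]
    v = [t for row in acc for t in row]          # concatenate: length K * d
    norm = sum(t * t for t in v) ** 0.5 or 1.0   # L2-normalize
    return [t / norm for t in v]

# Hypothetical toy data: K = 2 codewords, d = 2 dimensional descriptors.
codebook = [[0.0, 0.0], [1.0, 1.0]]
descriptors = [[0.1, 0.2], [0.9, 1.1], [1.2, 0.8]]
print(len(vlad(descriptors, codebook)))  # embedding dimension = K * d = 4
```

The key property illustrated is that the output dimension is K * d regardless of how many local descriptors the image has, which is what makes the representation comparable across images.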
Keywords: Image search; function approximation; feature embedding
Rights: © 2017 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission.
RMID: 0030083955
DOI: 10.1109/TPAMI.2017.2686861
Appears in Collections: Electrical and Electronic Engineering publications

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.