|Title:||Density maximization for improving graph matching with its applications|
|Citation:||IEEE Transactions on Image Processing, 2015; 24(7):2110-2123|
|Publisher:||Institute of Electrical and Electronics Engineers Inc.|
|Author:||Chao Wang, Lei Wang, and Lingqiao Liu|
|Abstract:||Graph matching has been widely used in both image processing and computer vision domain due to its powerful performance for structural pattern representation. However, it poses three challenges to image sparse feature matching: 1) the combinatorial nature limits the size of the possible matches; 2) it is sensitive to outliers because its objective function prefers more matches; and 3) it works poorly when handling many-to-many object correspondences, due to its assumption of one single cluster of true matches. In this paper, we address these challenges with a unified framework called density maximization (DM), which maximizes the values of a proposed graph density estimator both locally and globally. DM leads to the integration of feature matching, outlier elimination, and cluster detection. Experimental evaluation demonstrates that it significantly boosts the true matches and enables graph matching to handle both outliers and many-to-many object correspondences. We also extend it to dense correspondence estimation and obtain large improvement over the state-of-the-art methods. We further demonstrate the usefulness of our methods using three applications: 1) instance-level image retrieval; 2) mask transfer; and 3) image enhancement.|
|Keywords:||Graph matching; sparse feature matching; dense correspondence; image retrieval; mask transfer; image enhancement|
|Rights:||© 2015 IEEE. Personal use is permitted, but republication or redistribution requires IEEE permission.|
|Appears in Collections:||Aurora harvest 7|
Computer Science publications
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.