Type: Journal article
Title: Sampling minimal subsets with large spans for robust estimation
Author: Tran, Q.H.
Chin, T.
Chojnacki, W.
Suter, D.
Citation: International Journal of Computer Vision, 2014; 106(1):93-112
Publisher: Springer Verlag
Issue Date: 2014
ISSN: 0920-5691
Statement of Responsibility: Quoc Huy Tran, Tat-Jun Chin, Wojciech Chojnacki, David Suter
Abstract: When sampling minimal subsets for robust parameter estimation, it is commonly known that obtaining an all-inlier minimal subset is not sufficient; the points therein should also have a large spatial extent. This paper investigates a theoretical basis behind this principle, based on a little-known result which expresses the least squares regression as a weighted linear combination of all possible minimal subset estimates. It turns out that the weight of a minimal subset estimate is directly related to the span of the associated points. We then derive an analogous result for total least squares which, unlike ordinary least squares, corrects for errors in both dependent and independent variables. We establish the relevance of our result to computer vision by relating total least squares to geometric estimation techniques. As practical contributions, we explain why naive distance-based sampling fails as a strategy to maximise the span of all-inlier minimal subsets produced. In addition, we propose a novel method which, unlike previous methods, can consciously target all-inlier minimal subsets with large spans.
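The weighted-combination result the abstract refers to can be checked numerically for the simplest case, fitting a line y = a*x + b: the ordinary least squares estimate equals a weighted average of the exact fits through every pair of points, with each pair weighted by its squared span (x_i - x_j)^2. The sketch below uses illustrative synthetic data (the point values are assumptions, not from the paper):

```python
import itertools
import numpy as np

# Illustrative data: five points near y = 2x + 1 with small noise.
rng = np.random.default_rng(0)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0 + rng.normal(scale=0.1, size=x.size)

X = np.column_stack([x, np.ones_like(x)])  # design matrix, rows [x_i, 1]

# Ordinary least squares estimate of (a, b).
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Weighted combination of all minimal-subset (two-point) estimates.
# The weight of pair (i, j) is det([x_i 1; x_j 1])^2 = (x_i - x_j)^2,
# i.e. proportional to the squared span of the pair.
num = np.zeros(2)
den = 0.0
for i, j in itertools.combinations(range(x.size), 2):
    Xs = X[[i, j]]
    d = np.linalg.det(Xs)                     # = x_i - x_j
    beta_s = np.linalg.solve(Xs, y[[i, j]])   # exact line through the pair
    num += d**2 * beta_s
    den += d**2
beta_combo = num / den

assert np.allclose(beta_ols, beta_combo)
```

Pairs with a small span contribute almost nothing to the combination, which is one way to see why all-inlier minimal subsets with small spatial extent yield poor hypotheses.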
Keywords: Least squares
Total least squares
Minimal subsets
Robust fitting
Hypothesis sampling
Rights: © Springer Science+Business Media New York 2013
DOI: 10.1007/s11263-013-0643-y
Published version:
Appears in Collections:Aurora harvest 2
Computer Science publications

Files in This Item:
File: Restricted Access, 1.79 MB, Adobe PDF

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.