Please use this identifier to cite or link to this item: https://hdl.handle.net/2440/108759
Type: Conference paper
Title: Long-term correlation tracking
Author: Ma, C.
Yang, X.
Zhang, C.
Yang, M.-H.
Citation: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2015, pp. 5388-5396
Publisher: IEEE
Issue Date: 2015
ISBN: 9781467369657
ISSN: 1063-6919
Conference Name: 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2015) (7 Jun 2015 - 12 Jun 2015 : Boston, MA)
Statement of Responsibility: Chao Ma, Xiaokang Yang, Chongyang Zhang, and Ming-Hsuan Yang
Abstract: In this paper, we address the problem of long-term visual tracking where the target objects undergo significant appearance variation due to deformation, abrupt motion, heavy occlusion and out-of-view. In this setting, we decompose the task of tracking into translation and scale estimation of objects. We show that the correlation between temporal context considerably improves the accuracy and reliability for translation estimation, and it is effective to learn discriminative correlation filters from the most confident frames to estimate the scale change. In addition, we train an online random fern classifier to re-detect objects in case of tracking failure. Extensive experimental results on large-scale benchmark datasets show that the proposed algorithm performs favorably against state-of-the-art methods in terms of efficiency, accuracy, and robustness.
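Note: The abstract above describes translation estimation with discriminative correlation filters. The following is a minimal NumPy sketch of the basic correlation-filter idea (a MOSSE-style ridge regression learned and applied in the Fourier domain); it is illustrative only and is not the authors' implementation, which additionally exploits temporal context, a separate scale filter, and a random-fern re-detector. All function names and parameters here are hypothetical.

# Minimal correlation-filter sketch (illustrative; not the paper's tracker).
import numpy as np

def gaussian_response(shape, sigma=2.0):
    """Desired correlation output: a Gaussian peak centred on the patch."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    return np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2.0 * sigma ** 2))

def train_filter(patch, target, lam=1e-2):
    """Closed-form ridge regression in the Fourier domain:
    H* = (Y . conj(X)) / (X . conj(X) + lambda)."""
    X = np.fft.fft2(patch)
    Y = np.fft.fft2(target)
    return (Y * np.conj(X)) / (X * np.conj(X) + lam)

def detect(filt, patch):
    """Correlate the filter with a new patch; the response peak gives the translation."""
    response = np.real(np.fft.ifft2(filt * np.fft.fft2(patch)))
    dy, dx = np.unravel_index(np.argmax(response), response.shape)
    return response, (dy, dx)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    patch = rng.standard_normal((64, 64))                # stand-in for an image patch
    filt = train_filter(patch, gaussian_response(patch.shape))
    shifted = np.roll(patch, shift=(5, 3), axis=(0, 1))  # simulate target motion
    _, peak = detect(filt, shifted)
    print("response peak at", peak)                      # peak shifts with the target

In the full method described in the abstract, a filter of this kind would be updated only from confident frames, with a classifier re-detecting the target after tracking failure.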
Keywords: Target tracking; correlation; context; detectors; context modeling; estimation
Rights: © 2015 IEEE
DOI: 10.1109/CVPR.2015.7299177
Published version: http://dx.doi.org/10.1109/cvpr.2015.7299177
Appears in Collections: Aurora harvest 8
Computer Science publications

Files in This Item:
File: RA_hdl_108759.pdf
Description: Restricted Access
Size: 3.05 MB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.