Please use this identifier to cite or link to this item: https://hdl.handle.net/2440/116294
Type: Journal article
Title: Part-based robust tracking using online latent structured learning
Author: Yao, R.
Shi, Q.
Shen, C.
Zhang, Y.
van den Hengel, A.
Citation: IEEE Transactions on Circuits and Systems for Video Technology, 2017; 27(6):1235-1248
Publisher: IEEE
Issue Date: 2017
ISSN: 1051-8215 (print); 1558-2205 (online)
Statement of Responsibility: Rui Yao, Qinfeng Shi, Chunhua Shen, Yanning Zhang and Anton van den Hengel
Abstract: Despite many advances in the area, deformable targets and partial occlusions remain key problems in visual tracking. Structured learning has shown good results when applied to tracking whole targets, but applying this approach to a part-based target model is complicated by the need to model the relationships between parts and to avoid lengthy initialization processes. We therefore propose a method that models the unknown parts using latent variables. In doing so, we extend the online algorithm Pegasos to the structured prediction case (i.e., predicting the location of the bounding boxes) with latent part variables. We also incorporate recently proposed spatial constraints that preserve distances between parts. To better estimate the parts, and to avoid the overfitting caused by the extra model complexity/capacity that the parts introduce, we propose a two-stage training process based on the primal rather than the dual form. Extensive experiments show that the method outperforms the state of the art.
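The abstract describes an online latent structured learning loop. The sketch below shows the general shape of one such update in Python, assuming a latent structured Pegasos-style step of the kind described: impute the latent part configuration for the annotated box, run loss-augmented inference to find the most violated (box, parts) pair, then take a projected subgradient step in the primal. All names here (phi, loss, boxes, parts) are hypothetical stand-ins, not the authors' code, and the sketch omits the paper's spatial part constraints and two-stage training.

```python
import numpy as np

def pegasos_latent_step(w, x, y_true, t, lam, phi, loss, boxes, parts):
    """One projected-subgradient step of a latent structured Pegasos update.

    w       : current weight vector (numpy array)
    x       : current frame
    y_true  : annotated bounding box for this frame
    t       : iteration counter (1-based)
    lam     : regularization parameter lambda
    phi     : joint feature map phi(x, y, h) -> numpy array (assumed)
    loss    : box-overlap loss(y_true, y) -> float (assumed)
    boxes   : candidate boxes for frame x (assumed)
    parts   : candidate part layouts for a given box (assumed)
    """
    # 1) Impute the latent parts: best-scoring layout for the annotated box.
    h_true = max(parts(x, y_true), key=lambda h: w @ phi(x, y_true, h))

    # 2) Loss-augmented inference: the most violated (box, parts) pair.
    y_hat, h_hat = max(
        ((y, h) for y in boxes(x) for h in parts(x, y)),
        key=lambda yh: loss(y_true, yh[0]) + w @ phi(x, yh[0], yh[1]),
    )

    # 3) Check the hinge at the current w, then step with rate 1/(lam * t).
    violated = (loss(y_true, y_hat) + w @ phi(x, y_hat, h_hat)
                > w @ phi(x, y_true, h_true))
    eta = 1.0 / (lam * t)
    w = (1.0 - eta * lam) * w           # subgradient of the L2 regularizer
    if violated:
        w = w + eta * (phi(x, y_true, h_true) - phi(x, y_hat, h_hat))

    # 4) Optional Pegasos projection onto the ball of radius 1/sqrt(lam).
    norm = np.linalg.norm(w)
    radius = 1.0 / np.sqrt(lam)
    if norm > radius:
        w = w * (radius / norm)
    return w
```

One reason a primal formulation suits online tracking, as the abstract suggests, is that each new frame contributes a single cheap subgradient step rather than requiring a dual problem to be re-solved as the training set grows.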
Keywords: Online latent structured learning; part-based model; visual tracking
Rights: © 2016 IEEE
DOI: 10.1109/TCSVT.2016.2527358
Published version: http://dx.doi.org/10.1109/tcsvt.2016.2527358
Appears in Collections: Aurora harvest 3
Australian Institute for Machine Learning publications
Computer Science publications

Files in This Item:
There are no files associated with this item.