Please use this identifier to cite or link to this item: https://hdl.handle.net/2440/44768
Type: Conference paper
Title: Adaptive multiple object tracking using colour and segmentation cues
Author: Kumar, P.
Brooks, M.
Dick, A.
Citation: Computer Vision – ACCV 2007 / David Hutchison ... [et al.] (eds.): 853-863
Publisher: Springer
Publisher Place: Germany
Issue Date: 2007
Series/Report no.: Lecture Notes in Computer Science ; 4843/2007
ISBN: 9783540763857
ISSN: 0302-9743 (print); 1611-3349 (electronic)
Conference Name: Asian Conference on Computer Vision (8th : 2007 : Tokyo, Japan)
Editor: Yasushi Yagi
Statement of Responsibility: Pankaj Kumar, Michael J. Brooks and Anthony Dick
Abstract: We consider the problem of reliably tracking multiple objects in video, such as people moving through a shopping mall or airport. In order to mitigate difficulties arising as a result of object occlusions, mergers and changes in appearance, we adopt an integrative approach in which multiple cues are exploited. Object tracking is formulated as a Bayesian parameter estimation problem. The object model used in computing the likelihood function is incrementally updated. Key to the approach is the use of a background subtraction process to deliver foreground segmentations. This enables the object colour model to be constructed using weights derived from a distance transform operating over foreground regions. Results from foreground segmentation are also used to gain improved localisation of the object within a particle filter framework. We demonstrate the effectiveness of the approach by tracking multiple objects through videos obtained from the CAVIAR dataset.
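As an illustrative sketch of the approach the abstract describes, the Python fragment below builds a colour histogram in which each foreground pixel is weighted by its distance-transform value (so interior pixels count more than boundary pixels likely to be contaminated by background), scores a candidate region with a Bhattacharyya-based likelihood suitable for weighting particles, and refreshes the model incrementally. This is a plausible reconstruction under stated assumptions, not the authors' implementation: the function names, the 8-bin quantisation, and the sigma and alpha parameters are illustrative choices, and OpenCV's distance transform stands in for whatever implementation the paper used.

import cv2
import numpy as np

def weighted_colour_model(patch_bgr, fg_mask, bins=8):
    """Colour histogram of an object patch, with each foreground pixel
    weighted by its distance to the nearest background pixel.
    patch_bgr and fg_mask must have the same height and width."""
    # Distance transform over the binary foreground mask: nonzero pixels
    # receive their L2 distance to the nearest zero (background) pixel.
    dist = cv2.distanceTransform((fg_mask > 0).astype(np.uint8), cv2.DIST_L2, 5)
    weights = dist.ravel()
    # Quantise each 8-bit BGR channel into `bins` levels and form a
    # joint bin index per pixel.
    q = (patch_bgr.reshape(-1, 3).astype(np.int32) * bins) // 256
    idx = q[:, 0] * bins * bins + q[:, 1] * bins + q[:, 2]
    # Accumulate distance-transform weights into the joint histogram.
    hist = np.bincount(idx, weights=weights, minlength=bins ** 3)
    s = hist.sum()
    return hist / s if s > 0 else hist

def bhattacharyya_likelihood(h_ref, h_cand, sigma=0.2):
    """Colour-cue likelihood of a candidate histogram given the reference
    model, via the Bhattacharyya distance (sigma is an assumed scale)."""
    bc = np.sum(np.sqrt(h_ref * h_cand))   # Bhattacharyya coefficient
    d2 = max(1.0 - bc, 0.0)                # squared Bhattacharyya distance
    return np.exp(-d2 / (2.0 * sigma ** 2))

def update_model(h_ref, h_obs, alpha=0.1):
    """Incremental appearance update, as mentioned in the abstract;
    alpha is an assumed learning rate."""
    h = (1.0 - alpha) * h_ref + alpha * h_obs
    return h / h.sum()

In a particle filter, each particle would propose a candidate region, its weight would be set from bhattacharyya_likelihood applied to that region's weighted histogram, and update_model would refresh the reference model after a confident track update.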
Description: The original publication can be found at www.springerlink.com
DOI: 10.1007/978-3-540-76386-4_81
Published version: http://www.springerlink.com/content/7964xp087207k311/
Appears in Collections: Aurora harvest
Computer Science publications

Files in This Item:
There are no files associated with this item.

