Type: Conference paper
Title: Mitigating the effects of variable illumination for tracking across disjoint camera views
Author: Cheng, E.
Madden, C.
Piccardi, M.
Citation: IEEE International Conference on Video and Signal Based Surveillance, Nov. 2006: pp. 1-6
Publisher: IEEE
Publisher Place: Online
Issue Date: 2006
ISBN: 0769526888
Conference Name: IEEE Conference on Video and Signal Based Surveillance (2006 : Sydney, Australia)
Statement of Responsibility: E.D. Cheng, C. Madden and M. Piccardi
Abstract: Tracking people by their appearance across disjoint camera views is challenging since appearance may vary significantly across such views. This problem has been tackled in the past by computing intensity transfer functions between each camera pair during an initial training stage. However, in real-life situations, intensity transfer functions depend not only on the camera pair, but also on the actual illumination at pixel-wise resolution, and may prove impractical to estimate to a satisfactory extent. For this reason, in this paper we propose an appearance representation for people tracking capable of coping with the typical illumination changes occurring in a surveillance scenario. Our appearance representation is based on an online K-means color clustering algorithm, a fixed, data-dependent intensity transformation, and the incremental use of frames. Moreover, a similarity measurement is proposed to match the appearance representations of any two given moving objects along sequences of frames. Experimental results presented in this paper show that the proposed method provides a viable and effective approach for tracking people across disjoint camera views in typical surveillance scenarios.
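The abstract's appearance representation rests on an online K-means color clustering step. The paper itself is not attached to this record, so the sketch below shows only the generic online (sequential) K-means update applied to pixel colors, with hypothetical parameter names (`k`, `seed`); it is not the authors' exact formulation, intensity transformation, or similarity measure.

```python
import numpy as np

def online_kmeans_colors(pixels, k=5, seed=0):
    """Cluster pixel colors with a simple online (sequential) K-means.

    pixels: (N, 3) array of RGB values; k: number of color clusters.
    Returns (centroids, counts). A generic sketch only, not the paper's
    exact appearance-representation algorithm.
    """
    rng = np.random.default_rng(seed)
    pixels = np.asarray(pixels, dtype=float)
    # Initialise centroids from k distinct random pixels.
    centroids = pixels[rng.choice(len(pixels), size=k, replace=False)].copy()
    counts = np.zeros(k, dtype=int)
    for p in pixels:
        # Assign each incoming pixel to its nearest centroid ...
        j = int(np.argmin(np.linalg.norm(centroids - p, axis=1)))
        counts[j] += 1
        # ... and move that centroid toward the pixel with a 1/n step,
        # so each centroid tracks the running mean of its assigned pixels.
        centroids[j] += (p - centroids[j]) / counts[j]
    return centroids, counts
```

The 1/n step size makes each centroid the exact running mean of the pixels assigned to it, which is why a single pass over the data can suffice in an online setting.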
Appears in Collections: Aurora harvest 5
Computer Science publications

