Mitigating the effects of variable illumination for tracking across disjoint camera views
IEEE International Conference on Video and Signal Based Surveillance, Nov. 2006, pp. 1-6
IEEE Conference on Video and Signal Based Surveillance (2006 : Sydney, Australia)
E.D. Cheng, C. Madden and M. Piccardi
Tracking people by their appearance across disjoint camera views is challenging since appearance may vary significantly across such views. This problem has been tackled in the past by computing intensity transfer functions between each camera pair during an initial training stage. However, in real-life situations, intensity transfer functions depend not only on the camera pair, but also on the actual illumination at pixel-wise resolution, and may prove impractical to estimate to a satisfactory extent. For this reason, in this paper we propose an appearance representation for people tracking capable of coping with the typical illumination changes occurring in a surveillance scenario. Our appearance representation is based on an online K-means color clustering algorithm, a fixed, data-dependent intensity transformation, and the incremental use of frames. Moreover, a similarity measurement is proposed to match the appearance representations of any two given moving objects along sequences of frames. Experimental results presented in this paper show that the proposed method provides a viable and effective approach for tracking people across disjoint camera views in typical surveillance scenarios.
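The appearance representation described in the abstract is built on online (sequential) K-means color clustering, in which pixel colors are assigned to their nearest cluster centroid and each centroid is updated incrementally as a running mean. The sketch below illustrates that generic technique only; the function name, parameters, and toy data are illustrative and do not reproduce the authors' implementation.

```python
import numpy as np

def online_kmeans(pixels, k=2):
    """Sequential K-means over RGB pixels: each sample pulls its nearest
    centroid toward itself with a per-cluster step size 1/count, so each
    centroid tracks the running mean of its assigned samples."""
    centroids = pixels[:k].astype(float).copy()  # seed with first k samples
    counts = np.ones(k)
    for p in pixels[k:]:
        dists = np.linalg.norm(centroids - p, axis=1)
        j = int(np.argmin(dists))          # nearest centroid
        counts[j] += 1
        centroids[j] += (p - centroids[j]) / counts[j]  # running-mean update
    return centroids, counts

# Toy usage: two well-separated color populations (dark and light pixels).
dark = np.full((50, 3), 10.0)
light = np.full((50, 3), 240.0)
# Place one sample of each population first so the k seeds are distinct.
pixels = np.vstack([dark[:1], light[:1], dark[1:], light[1:]])
cents, counts = online_kmeans(pixels, k=2)
```

After one pass, the two centroids sit near the two color populations, giving a compact color-cluster summary of a person's appearance that can later be compared between camera views.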
Appears in Collections:
Aurora harvest 5
Computer Science publications
Files in This Item:
There are no files associated with this item.