Full metadata record
DC Field: Value

dc.contributor.author: Tissainayagam, P.
dc.contributor.author: Suter, D.
dc.identifier.citation: Pattern Recognition, 2005; 38(1):105-113
dc.description: Copyright © 2005 Pattern Recognition Society. Published by Elsevier B.V.
dc.description.abstract: This paper presents an object tracking technique based on the Bayesian multiple hypothesis tracking (MHT) approach. Two algorithms, both based on the MHT technique, are combined to form an object tracker. The first MHT algorithm is employed for contour segmentation: the segmentation of contours is based on an edge map, and the segmented contours are then merged to form recognisable objects. The second MHT algorithm is used for the temporal tracking of an object selected from the initial frame. An object is represented by key feature points extracted from it. The key points (mostly corner points) are detected using information obtained from the edge map and are then tracked through the sequence. To confirm the correctness of the tracked key points, their locations along the trajectory are verified against the segmented object identified in each frame. If an acceptable number of key points lie on or near the contour of the object in a particular frame (the n-th frame), we conclude that the selected object has been tracked (identified) successfully in frame n.
dc.description.statementofresponsibility: P. Tissainayagam and D. Suter
dc.publisher: Pergamon-Elsevier Science Ltd
dc.subject: Object tracking
dc.subject: Key points
dc.subject: Multiple Hypothesis Tracking
dc.subject: Contour segmentation
dc.subject: Edge grouping
dc.title: Object tracking in image sequences using point features
dc.type: Journal article
dc.identifier.orcid: Suter, D. [0000-0001-6306-3023]
Appears in Collections:
Aurora harvest 5
Computer Science publications

Files in This Item:
There are no files associated with this item.
