|Title:||Coarse-to-fine: A RNN-based hierarchical attention model for vehicle re-identification|
|Citation:||Proceedings of the 14th Asian Conference on Computer Vision (ACCV 2018), as published in Lecture Notes in Computer Science, 2019 / vol.11362, pp.575-591|
|Series/Report no.:||Lecture Notes in Computer Science; 11362|
|Conference Name:||Asian Conference on Computer Vision (ACCV) (02 Dec 2018 - 06 Dec 2018 : Perth, Australia)|
|Author:||Xiu-Shen Wei, Chen-Lin Zhang, Lingqiao Liu, Chunhua Shen, and Jianxin Wu|
|Abstract:||Vehicle re-identification is an important problem that has become increasingly relevant with the rapid expansion of video surveillance and intelligent transportation applications. Recalling how human vision performs identification, we observe a natural hierarchical dependency when humans distinguish different vehicles. Specifically, humans first determine a vehicle’s coarse-grained category, i.e., the car model/type. Then, within the branch of the predicted car model/type, they identify specific vehicles by relying on subtle visual cues at the fine-grained level, e.g., customized paintings and windshield stickers. Inspired by this coarse-to-fine hierarchical process, we propose an end-to-end RNN-based Hierarchical Attention (RNN-HA) classification model for vehicle re-identification. RNN-HA consists of three mutually coupled modules: the first module generates image representations for vehicle images, the second hierarchical module models the aforementioned hierarchical dependency, and the last attention module captures the subtle visual information that distinguishes specific vehicles from one another. Through comprehensive experiments on two vehicle re-identification benchmark datasets, VeRi and VehicleID, we demonstrate that the proposed model achieves superior performance over state-of-the-art methods.|
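The coarse-to-fine idea in the abstract (coarse car-model prediction first, then attention-guided fine-grained identification) can be illustrated with a toy NumPy sketch. This is not the paper's RNN-HA architecture: the dimensions, the linear classifiers, and the way the coarse prediction conditions the attention are all illustrative assumptions, standing in for the CNN backbone, RNN hierarchy, and learned attention described above.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical dimensions (not from the paper):
# R spatial regions with D-dim features each, n_models coarse classes
# (car model/type), n_ids fine-grained vehicle identities.
R, D = 49, 64
n_models, n_ids = 10, 100

# Stand-in for region features produced by a CNN backbone.
feats = rng.standard_normal((R, D))

# Step 1 (coarse): predict the car model/type from globally pooled features.
W_coarse = rng.standard_normal((D, n_models))
coarse_probs = softmax(feats.mean(axis=0) @ W_coarse)

# Step 2 (attention): score each region against a context vector derived
# from the coarse prediction, so fine-grained cues are weighted within the
# predicted coarse branch. (Conditioning scheme is an assumption here.)
context = coarse_probs @ W_coarse.T      # back-project into feature space
attn = softmax(feats @ context)          # one weight per region, sums to 1
attended = attn @ feats                  # attention-weighted feature vector

# Step 3 (fine): identify the specific vehicle from the attended features.
W_fine = rng.standard_normal((D, n_ids))
fine_probs = softmax(attended @ W_fine)
```

The two softmax distributions model the two levels of the hierarchy: `coarse_probs` over car models, `fine_probs` over individual vehicle identities, with the attention weights linking them.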
|Keywords:||Vehicle re-identification; Hierarchical dependency; Attention mechanism; Deep learning|
|Rights:||© Springer Nature Switzerland AG 2019|
|Appears in Collections:||Computer Science publications|
Files in This Item:
There are no files associated with this item.