Please use this identifier to cite or link to this item:
Type: Journal article
Title: LOW: Training deep neural networks by learning optimal sample weights
Author: Santiago, C.
Barata, C.
Sasdelli, M.
Carneiro, G.
Nascimento, J.C.
Citation: Pattern Recognition, 2021; 110:1-12
Publisher: Elsevier
Issue Date: 2021
ISSN: 0031-3203
Statement of Responsibility:
Carlos Santiago, Catarina Barata, Michele Sasdelli, Gustavo Carneiro, Jacinto C. Nascimento
Abstract: The performance of deep learning (DL) models is highly dependent on the quality and size of the training data, whose annotations are often expensive and hard to obtain. This work proposes a new strategy to train DL models by Learning Optimal sample Weights (LOW), making better use of the available data. LOW determines how much each sample in a batch should contribute to the training process by automatically estimating its weight in the loss function. This effectively forces the model to focus on more relevant samples. Consequently, the models exhibit faster convergence and better generalization, especially on imbalanced data sets where the class distribution is long-tailed. LOW can be easily integrated into the training of any DL model and can be combined with any loss function, while adding only marginal computational overhead to the training process. Additionally, the analysis of how sample weights change during training provides insight into what the model is learning and which samples or classes are more challenging. Results on popular computer vision benchmarks and on medical data sets show that DL models trained with LOW perform better than with other state-of-the-art strategies.
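The abstract describes weighting each sample's contribution to the batch loss. The sketch below is a hypothetical illustration of that general idea, not the paper's actual method: LOW solves an optimization problem to find the weights, whereas here the weights are simply a softmax over the per-sample losses (the function name, temperature parameter, and weighting rule are all assumptions for illustration).

```python
import numpy as np

def weighted_batch_loss(per_sample_losses, temperature=1.0):
    """Illustrative sample-weighted loss: harder samples get larger weights.

    NOTE: hypothetical sketch, not the LOW solver from the paper.
    Weights are a softmax over per-sample losses and sum to 1.
    """
    losses = np.asarray(per_sample_losses, dtype=float)
    z = losses / temperature
    z = z - z.max()                   # subtract max for numerical stability
    w = np.exp(z) / np.exp(z).sum()   # normalized weights, sum to 1
    return float((w * losses).sum()), w

# Example: the sample with the largest loss dominates the weighted sum.
batch_losses = [0.2, 1.5, 0.7, 3.0]
total, weights = weighted_batch_loss(batch_losses)
```

Under this weighting, the weighted loss is pulled above the unweighted mean, which mimics the "focus on more relevant samples" behavior the abstract describes.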
Keywords: Deep learning; sample weighting; imbalanced data sets
Rights: © 2020 Elsevier Ltd. All rights reserved.
DOI: 10.1016/j.patcog.2020.107585
Grant ID:
Published version:
Appears in Collections:Aurora harvest 4
Computer Science publications

Files in This Item:
There are no files associated with this item.

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.