Unshuffling Data for Improved Generalization in Visual Question Answering
Date
2021
Authors
Teney, D.
Abbasnejad, E.
van den Hengel, A.
Type
Conference paper
Citation
Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2021, pp. 1397-1407
Statement of Responsibility
Damien Teney, Ehsan Abbasnejad, Anton van den Hengel
Conference Name
IEEE/CVF International Conference on Computer Vision (ICCV), 11-17 Oct 2021, virtual/online
Abstract
Generalization beyond the training distribution is a core challenge in machine learning. The common practice of mixing and shuffling examples when training neural networks may not be optimal in this regard. We show that partitioning the data into well-chosen, non-i.i.d. subsets treated as multiple training environments can guide the learning of models with better out-of-distribution generalization. We describe a training procedure to capture the patterns that are stable across environments while discarding spurious ones. The method takes a step beyond correlation-based learning: the choice of the partitioning allows injecting information about the task that cannot otherwise be recovered from the joint distribution of the training data. We demonstrate multiple use cases with the task of visual question answering, which is notorious for dataset biases. We obtain significant improvements on VQA-CP, using environments built from prior knowledge, existing metadata, or unsupervised clustering. We also obtain improvements on GQA using annotations of "equivalent questions", and on multi-dataset training (VQA v2 / Visual Genome) by treating the datasets as distinct environments.
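The abstract describes a training procedure that treats non-i.i.d. partitions of the data as separate environments and retains only the patterns that are stable across them. Below is a minimal sketch of that general idea using an IRM-style invariance penalty (Arjovsky et al., 2019) on a toy model; the model, synthetic data, and penalty weight are illustrative placeholders, not the authors' VQA architecture or exact objective.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical setup: a toy classifier and two "environments", i.e. non-i.i.d.
# partitions of the training data. In the paper's setting these partitions are
# built from prior knowledge, metadata, or unsupervised clustering; here they
# are random tensors for illustration only.
torch.manual_seed(0)
model = nn.Linear(16, 3)  # stand-in for a VQA model producing answer logits
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

def make_environment(n=64):
    """Fake (features, labels) pair standing in for one data partition."""
    x = torch.randn(n, 16)
    y = torch.randint(0, 3, (n,))
    return x, y

environments = [make_environment(), make_environment()]

def irm_penalty(logits, y):
    """Gradient-norm penalty w.r.t. a dummy scale, as in IRM (Arjovsky et al.)."""
    scale = torch.tensor(1.0, requires_grad=True)
    loss = F.cross_entropy(logits * scale, y)
    grad = torch.autograd.grad(loss, [scale], create_graph=True)[0]
    return (grad ** 2).sum()

penalty_weight = 1.0  # hyperparameter; would be tuned in practice

for step in range(100):
    risk, penalty = 0.0, 0.0
    for x, y in environments:  # environments are kept separate, not shuffled together
        logits = model(x)
        risk = risk + F.cross_entropy(logits, y)
        penalty = penalty + irm_penalty(logits, y)
    loss = (risk + penalty_weight * penalty) / len(environments)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

The key design point this sketch illustrates is that the per-environment losses are computed on unshuffled partitions, so the penalty can discourage predictors that rely on correlations holding in only some environments.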
Rights
©2021 IEEE