Multi-modal few-shot learning for anthesis prediction of individual wheat plants
Date
2025
Authors
Xie, Y.
Roy, S.J.
Schilling, R.K.
Liu, H.
Type
Journal article
Citation
Plant Phenomics, 2025; 7(3):100091-1-100091-18
Statement of Responsibility
Yiting Xie, Stuart J. Roy, Rhiannon K. Schilling, Huajian Liu
Abstract
Anthesis prediction is crucial for wheat breeding. While current tools estimate average anthesis at the field scale, they do not meet the needs of breeders who require accurate predictions for individual plants. Hybrid breeders must finalize pollination plans at least 10 days before flowering, and biotechnology field trials in the United States and Australia must report to regulators 7–14 days before the first plant flowers. At present, predicting the anthesis of individual wheat plants is a labour-intensive, inefficient, and costly process. Individual wheat plants of the same cultivar within the same field may show substantial variation in anthesis timing owing to differences in their immediate surroundings. In this study, we developed an efficient and cost-effective machine vision approach to predict the anthesis of individual wheat plants. By integrating RGB imagery with in-situ meteorological data, our multimodal framework reduces the anthesis prediction problem to binary or three-class classification tasks, aligning with breeders' requirements for individual-plant flowering prediction in the crucial days before anthesis. Furthermore, we incorporated a few-shot learning method to improve the model's adaptability across different growth environments and to address the challenge of limited training data. The model achieved an F1 score above 0.8 in all planting settings.
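The article itself does not include code; the following is a minimal PyTorch sketch of the kind of two-branch multimodal fusion the abstract describes, where an RGB image embedding is concatenated with in-situ meteorological features before a binary or three-class head. The backbone choice, feature dimensions, weather variables, and fusion-by-concatenation strategy are all illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a multimodal anthesis classifier: an RGB image
# branch fused with in-situ meteorological features. All architecture
# details here are assumptions, not the published method.
import torch
import torch.nn as nn
from torchvision.models import resnet18

class AnthesisClassifier(nn.Module):
    def __init__(self, num_weather_features: int = 6, num_classes: int = 3):
        super().__init__()
        # Image branch: a small CNN backbone producing a 512-d embedding.
        self.image_branch = resnet18(weights=None)
        self.image_branch.fc = nn.Identity()  # keep the 512-d features
        # Weather branch: an MLP over in-situ meteorological readings
        # (e.g. temperature, humidity, solar radiation).
        self.weather_branch = nn.Sequential(
            nn.Linear(num_weather_features, 64),
            nn.ReLU(),
            nn.Linear(64, 64),
            nn.ReLU(),
        )
        # Fusion by concatenation, then a binary or three-class head.
        self.head = nn.Linear(512 + 64, num_classes)

    def forward(self, rgb: torch.Tensor, weather: torch.Tensor) -> torch.Tensor:
        img_feat = self.image_branch(rgb)       # (B, 512)
        wx_feat = self.weather_branch(weather)  # (B, 64)
        return self.head(torch.cat([img_feat, wx_feat], dim=1))

# Example: a batch of 4 plants, 224x224 RGB crops plus 6 weather features.
model = AnthesisClassifier()
logits = model(torch.randn(4, 3, 224, 224), torch.randn(4, 6))
print(logits.shape)  # torch.Size([4, 3])
```

In a few-shot setting, a model like this could be adapted to a new growth environment by fine-tuning on a handful of labeled plants; the paper's specific few-shot method is not reproduced here.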
Rights
© 2025 The Authors. Published by Elsevier B.V. on behalf of Nanjing Agricultural University. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).