
Deep Data Augmentation for Weed Recognition Enhancement: A Diffusion Probabilistic Model and Transfer Learning Based Approach

Published by the American Society of Agricultural and Biological Engineers, St. Joseph, Michigan www.asabe.org

Citation: 2023 ASABE Annual International Meeting, Paper No. 2300108 (doi:10.13031/aim.202300108)
Authors: Dong Chen, Xinda Qi, Yu Zheng, Yuzhen Lu, Yanbo Huang, Zhaojian Li
Keywords: computer vision, data augmentation, deep learning, generative modeling, precision agriculture

Abstract. Weed management plays an important role in protecting crop yield and quality. Conventional weed control relies largely on herbicide application, which incurs significant management costs and poses hazards to the environment and human health. Machine vision-based automated weeding has gained increasing attention as a route to sustainable weed management through weed identification and site-specific treatment. However, reliably recognizing weeds under variable field conditions remains challenging, largely because of the difficulty of curating large-scale, expert-labeled weed image datasets collected from diverse field conditions. Data augmentation methods, including traditional geometric/color transformations and more advanced generative adversarial networks (GANs), can reduce data collection and labeling effort by algorithmically expanding the scale of datasets. Recently, diffusion models have emerged in the field of image synthesis, providing a new means of augmenting image datasets to power machine vision systems. This study presents the first investigation of the efficacy of diffusion models for generating weed images to enhance weed identification. In experiments on a large 15-class weed dataset (CottonWeedID15), diffusion models achieved the best trade-off between sample fidelity and diversity and obtained the lowest Fréchet Inception Distance, compared to GANs (BigGAN, StyleGAN2, StyleGAN3).
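The abstract references three concrete pieces of machinery: traditional geometric/color augmentation, diffusion-based image synthesis, and the Fréchet Inception Distance (FID) used to compare generators. The sketch below illustrates each in Python under stated assumptions: torchvision for the classic transforms, the Hugging Face diffusers DDPM pipeline as a stand-in for the paper's diffusion model (the public google/ddpm-cifar10-32 checkpoint is only a placeholder; a model trained or fine-tuned on weed images would take its place), and torchmetrics for FID with random tensors as stand-in image batches. This is not the authors' published code.

```python
# Illustrative sketch only; checkpoint names and parameters are assumptions,
# not the paper's actual experimental configuration.
import torch
from torchvision import transforms
from diffusers import DDPMPipeline
from torchmetrics.image.fid import FrechetInceptionDistance

# 1) Traditional geometric/color augmentation, the baseline the
#    abstract contrasts with generative approaches.
classic_augment = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),               # geometric: mirror
    transforms.RandomRotation(degrees=15),                # geometric: small rotation
    transforms.ColorJitter(brightness=0.2, contrast=0.2,  # color: lighting variation
                           saturation=0.2),
    transforms.RandomResizedCrop(224, scale=(0.8, 1.0)),  # geometric: crop/zoom
    transforms.ToTensor(),
])

# 2) Diffusion-based synthesis: sample new images from a trained denoising
#    diffusion model. The CIFAR-10 checkpoint is a public placeholder; a
#    weed-image checkpoint is assumed in practice.
pipeline = DDPMPipeline.from_pretrained("google/ddpm-cifar10-32")
synthetic = pipeline(batch_size=16).images  # list of PIL images to add to training data

# 3) Fréchet Inception Distance between real and generated batches.
#    Random uint8 tensors stand in for actual image data here.
fid = FrechetInceptionDistance(feature=2048)
real_batch = torch.randint(0, 256, (16, 3, 299, 299), dtype=torch.uint8)
fake_batch = torch.randint(0, 256, (16, 3, 299, 299), dtype=torch.uint8)
fid.update(real_batch, real=True)
fid.update(fake_batch, real=False)
print(fid.compute())  # lower FID means generated samples closer to the real distribution
```

Because FID measures the distance between real and generated feature distributions, lower values are better, which is why the comparison with BigGAN, StyleGAN2, and StyleGAN3 is reported as the diffusion models obtaining the lowest FID.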
