DeepCottonWeeds (DCW): A Novel Benchmark of YOLO Object Detectors for Weed Detection in Cotton Production Systems
Published by the American Society of Agricultural and Biological Engineers, St. Joseph, Michigan (www.asabe.org)
Citation: 2022 ASABE Annual International Meeting, 2200214 (doi: 10.13031/aim.202200214)
Authors: Fengying Dang, Dong Chen, Yuzhen Lu, Zhaojian Li, Yu Zheng
Keywords: Cotton, computer vision, deep learning, precision agriculture, robotic weeding, weed detection.
Abstract. Weeds are among the major threats to cotton production. Overreliance on herbicides for weed control has accelerated the evolution of herbicide resistance in weeds and raised growing concerns about the environment, food safety, and human health. Machine vision systems for automated or robotic weeding have received significant interest for integrated, sustainable weed management. However, given unstructured field conditions and the significant biological variability of weeds, developing robust weed identification and detection systems remains a challenging task. Two critical obstacles are the lack of dedicated, large-scale image datasets of weeds specific to cotton production and the need for robust machine learning models for weed detection. This study presents a new dataset of weeds important to U.S. cotton production systems, consisting of 5648 images of 12 weed classes with a total of 9370 bounding box annotations, collected under natural light conditions and at varied weed growth stages in cotton fields. Furthermore, a comprehensive benchmark of 18 selected state-of-the-art YOLO object detectors, spanning YOLOv3, YOLOv4, Scaled-YOLOv4, and YOLOv5, was established for weed detection on the dataset. Detection accuracy in terms of mAP@50 ranged from 88.14% (YOLOv3-tiny) to 95.22% (YOLOv4), and accuracy in terms of mAP@[0.5:0.95] ranged from 68.18% (YOLOv3-tiny) to 89.72% (Scaled-YOLOv4). All the YOLO models, especially YOLOv5n and YOLOv5s, showed great potential for real-time weed detection, and data augmentation could increase weed detection accuracy. Both the code and the weed dataset for model benchmarking are publicly available (https://github.com/Derekabc/DCW) and are expected to be valuable resources for future research on weed detection and beyond.
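As an illustration of the detection workflow behind this benchmark, the minimal sketch below runs one of the benchmarked model families (YOLOv5) through the public Ultralytics torch.hub interface. It is a generic example, not the authors' benchmarking code (which is in the linked repository); the image path is a placeholder, and the pretrained COCO weights would need fine-tuning on the 12 DCW weed classes before they could actually detect weeds.

```python
# Minimal YOLOv5 inference sketch (assumes: torch installed, internet
# access for torch.hub, and a local test image). NOT the authors'
# pipeline; the pretrained weights cover the 80 COCO classes, not the
# 12 DCW weed classes, so this only demonstrates the detection workflow.
import torch

# Load a pretrained YOLOv5s model, one of the 18 variants benchmarked.
model = torch.hub.load('ultralytics/yolov5', 'yolov5s', pretrained=True)

# Placeholder path: substitute an image from the DCW dataset.
img_path = 'dcw_weed_image.jpg'

results = model(img_path)         # run detection on the image
results.print()                   # summary of detected classes/confidences
boxes = results.pandas().xyxy[0]  # bounding boxes as a pandas DataFrame
print(boxes[['name', 'confidence', 'xmin', 'ymin', 'xmax', 'ymax']])
```

Fine-tuning on DCW would follow the standard YOLOv5 training procedure (a dataset YAML listing the 12 weed classes plus the annotated images), after which the same inference code applies to the custom weights.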