Deep Neural Networks for Weed Detections Towards Precision Weeding

Published by the American Society of Agricultural and Biological Engineers, St. Joseph, Michigan www.asabe.org

Citation: 2022 ASABE Annual International Meeting, 2200845 (doi:10.13031/aim.202200845)

Authors: Abdur Rahman, Yuzhen Lu, Haifeng Wang

Keywords: Cotton Weed Control, Deep Learning, Precision Agriculture, Weed Dataset, Weed Detection

ABSTRACT. Alternative non-chemical or reduced-chemical weed control methods, especially for herbicide-resistant weeds, are critical for long-term and integrated weed management. Through weed detection and localization, machine vision technology has the potential to enable site- and species-specific treatments targeting individual weed plants. However, because of unstructured field conditions and the large biological variability of weeds, robust and accurate weed detection remains a challenging endeavor. Deep learning (DL) algorithms, powered by large-scale image data, promise to achieve the detection performance required for precision weeding. In this study, a three-class weed dataset with bounding box annotations was curated, consisting of 848 color images collected in cotton fields under variable field conditions. A set of weed detection models was built using DL-based one-stage and two-stage object detectors, including YOLOv5, RetinaNet, EfficientDet, and Faster RCNN, by transferring pretrained object detection models to the weed dataset. RetinaNet (R101-FPN), despite its longer inference time, achieved the highest overall detection accuracy with a mean average precision (mAP@0.50) of 79.98%. YOLOv5n showed potential for real-time deployment on resource-constrained devices because of its smallest number of model parameters (1.8 million) and fastest inference (17 ms on Google Colab), while achieving comparable detection accuracy (76.58% mAP@0.50). Data augmentation through geometric and color transformations improved the accuracy of the weed detection models by up to 4.2%. The software programs and the weed dataset used in this study are made publicly available (https://github.com/abdurrahman1828/DNNs-for-Weed-Detections; www.kaggle.com/yuzhenlu/cottonweeddet3).
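The abstract describes building weed detectors by transferring pretrained object detection models to the three-class weed dataset. As an illustrative sketch only, not the authors' actual pipeline, the snippet below shows one common way to do this kind of transfer learning with torchvision's Faster R-CNN: load COCO-pretrained weights, swap the box-classification head for one with background plus three weed classes, and fine-tune. The data loader and training schedule are assumptions.

```python
# Sketch: fine-tuning a COCO-pretrained Faster R-CNN (torchvision) for
# 3 weed classes plus background. This is an assumed setup for illustration,
# not the study's exact code; `train_loader` is a hypothetical DataLoader
# yielding (image, target) pairs in torchvision's detection format.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 1 + 3  # background + 3 weed classes (per the abstract's dataset)

# Start from pretrained detection weights and replace the classification head.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)

optimizer = torch.optim.SGD(model.parameters(), lr=0.005,
                            momentum=0.9, weight_decay=5e-4)

def train_one_epoch(model, train_loader):
    """Run one pass over the weed dataset, minimizing the detection losses."""
    model.train()
    for images, targets in train_loader:
        images = [img.to(device) for img in images]
        targets = [{k: v.to(device) for k, v in t.items()} for t in targets]
        loss_dict = model(images, targets)  # dict of RPN and ROI-head losses
        loss = sum(loss_dict.values())
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

The same head-replacement idea carries over to the other detectors mentioned (RetinaNet, EfficientDet, YOLOv5), though each framework exposes its own fine-tuning interface.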
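The abstract also reports that geometric and color augmentation improved detection accuracy by up to 4.2%, without listing the specific transforms. The sketch below, using the albumentations library as an assumed tooling choice, shows how a bounding-box-aware pipeline of this kind is typically composed.

```python
# Sketch of bounding-box-aware geometric and color augmentation with
# albumentations. The specific transforms and parameters are assumptions
# for illustration; the abstract does not enumerate the ones used.
import albumentations as A

augment = A.Compose(
    [
        # Geometric transformations (boxes are adjusted automatically)
        A.HorizontalFlip(p=0.5),
        A.Rotate(limit=15, p=0.5),
        # Color transformations
        A.RandomBrightnessContrast(p=0.5),
        A.HueSaturationValue(p=0.5),
    ],
    bbox_params=A.BboxParams(format="pascal_voc", label_fields=["labels"]),
)

# Usage: `image` is an HxWx3 numpy array; `boxes` are [x_min, y_min, x_max, y_max].
# out = augment(image=image, bboxes=boxes, labels=class_ids)
# aug_image, aug_boxes = out["image"], out["bboxes"]
```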