


UAS and UGV-Based Disease Management System for Diagnosing Corn Diseases Above and Below the Canopy Using Deep Learning

Published by the American Society of Agricultural and Biological Engineers, St. Joseph, Michigan. www.asabe.org

Citation: Journal of the ASABE. 67(1): 161-179. (doi: 10.13031/ja.15625) ©2024
Authors: Aanis Ahmad, Varun Aggarwal, Dharmendra Saraswat, Gurmukh S. Johal
Keywords: Deep learning, Disease identification, Disease management, Object detection, UAS, UGV, YOLOv7.

Highlights

Generated a custom imagery dataset using UAS, UGV, and handheld sensors separately in diseased corn fields.

Proposed a data pipeline utilizing the Google Sheets API to establish communication between each platform and enable access through a web application.

Developed a deep learning-based disease management system for above- and below-the-canopy corn disease diagnosis.

Trained and evaluated disease detection models for each platform separately to provide management recommendations.

Abstract. Early disease management following the onset of symptoms is crucial for controlling disease spread. Heterogeneous collaboration between unmanned aerial systems (UAS) and unmanned ground vehicles (UGV) for field scouting and disease diagnosis is a potential approach for developing automated disease management solutions. However, automating crop-specific disease identification requires above- and below-canopy sensors and properly trained deep learning (DL) models. This research develops a novel disease management system for diagnosing corn diseases from above and below the canopy by collaboratively using edge devices mounted on a UAS and a UGV, respectively. Three separate datasets were acquired within diseased corn fields using a UAS above the canopy, a UGV below the canopy, and handheld imaging platforms. DL-based image classification models were first trained to identify common corn diseases under field conditions, reaching up to 95.04% testing accuracy with the DenseNet169 architecture. After creating bounding box annotations for the disease images, You Only Look Once (YOLO)v7 DL-based object detection models were trained to identify diseases from each platform separately. The trained YOLOv7 models achieved the highest mAP@IoU=0.5 of 37.6%, 46.4%, and 72.2% for locating and identifying diseases above the canopy using the UAS, below the canopy using the UGV, and with handheld sensors, respectively. A client/server architecture was developed to establish communication between the UAS, UGV, and Google Spreadsheets via the Wi-Fi communication protocol, and the coordinates of diseased regions and the distinct disease types were recorded on Google Spreadsheets through this architecture. A web application used the data from the Google Spreadsheet to help users diagnose diseases in real time and to provide recommendations for implementing appropriate disease management practices. This study reports findings from independently operated UAS and UGV platforms whose combined information can potentially map disease spread from below and above the corn canopy.
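
For readers interested in the classification stage described in the abstract, the following minimal sketch (illustrative only, not the authors' training code) shows how an ImageNet-pretrained DenseNet169 from torchvision can be adapted to a corn-disease dataset by replacing its classifier head; the class count is a placeholder assumption.

```python
# Illustrative transfer-learning sketch (not the paper's code): adapt an
# ImageNet-pretrained DenseNet169 to a corn-disease classification task.
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 4  # placeholder: e.g., healthy leaves plus three corn diseases

model = models.densenet169(weights="DEFAULT")  # ImageNet-pretrained backbone
# Swap the 1000-way ImageNet head for a corn-disease classifier.
model.classifier = nn.Linear(model.classifier.in_features, NUM_CLASSES)
```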
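The client/server data pipeline can likewise be illustrated. The sketch below is a hypothetical client-side example rather than the authors' implementation: it assumes a custom-trained YOLOv7 checkpoint loaded through the yolov7 repository's torch.hub entry point (with its YOLOv5-style results interface), a gspread service-account credential, and placeholder file names and GPS coordinates; each detection's location and disease class are appended as a spreadsheet row.

```python
# Hypothetical edge-device sketch (not the authors' code): run a custom
# YOLOv7 model on one frame and log detections to a Google Spreadsheet.
import torch
import gspread

# Load custom YOLOv7 weights via the repository's torch.hub entry point;
# "best.pt" is a placeholder for a trained checkpoint.
model = torch.hub.load("WongKinYiu/yolov7", "custom", "best.pt")

# Authenticate with a Google service account; the credentials file and
# spreadsheet name are placeholders.
gc = gspread.service_account(filename="credentials.json")
sheet = gc.open("corn_disease_log").sheet1

def log_detections(image_path, lat, lon, platform):
    """Run inference on one image and append each detection to the sheet."""
    results = model(image_path)
    # One row per detection: xmin, ymin, xmax, ymax, confidence, class, name.
    for _, det in results.pandas().xyxy[0].iterrows():
        sheet.append_row([platform, lat, lon, det["name"],
                          round(float(det["confidence"]), 3)])

# Example call with placeholder UGV GPS coordinates.
log_detections("frame_0001.jpg", 40.4237, -86.9212, "UGV")
```

On the web-application side, the same worksheet could then be read back, for example with gspread's get_all_records(), to display diagnosed locations and the associated management recommendations in real time.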
