

Published by the American Society of Agricultural and Biological Engineers, St. Joseph, Michigan

Citation: 2021 ASABE Annual International Virtual Meeting, Paper No. 2100566. (doi:10.13031/aim.202100566)
Authors: Aanis Ahmad, Dharmendra Saraswat, Aly El Gamal, Gurmukh S. Johal
Keywords: Corn Diseases, Datasets, Deep Learning, Disease Identification, Disease Spread Tracking, Image Classification, Object Detection, Severity Estimation, UAV Imagery, YOLOv4

Abstract. Accurately locating disease outbreak sites in corn fields is the first step toward an effective disease management system. Such a system would enable accurate disease identification and tracking of disease severity. In an effort to develop an improved disease management system, the research community has increasingly relied on images acquired by Unmanned Aerial Vehicles (UAVs) and/or handheld sensors instead of manual scouting. However, most reported studies have used publicly available datasets consisting of images acquired under uniform indoor settings. The current study is based on images taken by a UAV and a handheld sensor in actual corn fields at Purdue's Agronomy Center for Research and Education (ACRE) in summer 2020. A total of 55 UAV flights were conducted from June 20 through September 29 over 3 different corn fields, resulting in a collection of approximately 59,000 images. Three neural networks, namely VGG16, ResNet50, and InceptionV3, were used to train image classification models for diseased area identification, disease type identification, and severity estimation. The UAV-acquired images were split into 250 × 250 pixel patches to identify the presence of diseased patches within fields with near-perfect accuracy. Different disease types were identified using a total of 1,500 handheld images of Northern Leaf Blight (NLB), Gray Leaf Spot (GLS), and Northern Leaf Spot (NLS), and accuracies up to 98.34% were obtained. The disease severity levels of NLS were estimated with accuracies up to 94.02% using a total of 1,200 handheld images acquired at severity levels corresponding to low, medium, and high. Finally, an object-detection algorithm, YOLOv4, was used to identify and locate multiple instances of disease lesions within the NLB, GLS, and NLS handheld images with a mean Average Precision (mAP) score of 57.93%.
This study presents details about four elements of a disease management system and provides promising results for realizing a working system.
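The paper itself includes no code, but the tiling step described in the abstract, splitting each UAV frame into non-overlapping 250 × 250 pixel patches before classification, can be sketched as below. This is a minimal NumPy-based illustration under assumed conventions (row-major H × W × C arrays, edge remainders dropped); `tile_image` is a hypothetical helper, not a function from the study.

```python
import numpy as np

def tile_image(img: np.ndarray, tile: int = 250) -> list[np.ndarray]:
    """Split an H x W x C image array into non-overlapping tile x tile patches.

    Edge regions smaller than the tile size are dropped here, a common
    choice when a classifier requires fixed-size inputs. (Assumed
    convention; the study does not specify how edges were handled.)
    """
    h, w = img.shape[:2]
    patches = []
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            patches.append(img[y:y + tile, x:x + tile])
    return patches

# Example: a 1000 x 750 px frame yields 4 x 3 = 12 patches of 250 x 250.
frame = np.zeros((750, 1000, 3), dtype=np.uint8)
print(len(tile_image(frame)))  # 12
```

Each resulting patch could then be fed to a classifier such as VGG16, ResNet50, or InceptionV3 to flag diseased areas, as the study describes.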
