
Deep Learning-Based Disease Identification and Severity Estimation Tool for Tar Spot in Corn

Published by the American Society of Agricultural and Biological Engineers, St. Joseph, Michigan

Citation: 2022 ASABE Annual International Meeting, 2201193 (doi:10.13031/aim.202201193)
Authors:   Aanis Ahmad, Varun Aggarwal, Dharmendra Saraswat, Aly El Gamal, Guri Johal
Keywords:   Deep Learning, Computer Vision, Tar Spot, Disease Identification, Severity Estimation

Abstract. Management of tar spot disease in corn has traditionally relied on manual field scouting and visual analysis since the disease was first detected in the United States in 2015. Recent studies have reported computer vision-based applications for diagnosing tar spot disease lesions; however, human raters outperformed these traditional computer vision approaches. In the past five to six years, deep learning techniques have shown promising results for different precision agriculture applications, including fruit detection and counting, disease identification in various crops, and weed detection. However, few studies have developed deep learning-based disease identification and severity estimation tools for tar spot in corn. Therefore, in this study, a custom dataset of 455 handheld images of tar spot disease was acquired in a greenhouse under complex background conditions and used to train deep learning-based disease identification models. The dataset was enhanced by combining it with the publicly available CD&S dataset, which consists of images of other corn diseases, namely Northern Leaf Blight (NLB), Gray Leaf Spot (GLS), and Northern Leaf Spot (NLS). Image classification models were first trained to identify tar spot, NLB, GLS, and NLS on diseased corn foliage. To accurately locate and identify tar spot disease lesions, the YOLOv4 object detection model was then trained. In addition, semantic segmentation models were trained using the UNet architecture for leaf and lesion segmentation. Among the image classification models, the highest testing accuracy of 99.41% was achieved with the DenseNet169 model. For YOLOv4 object detection, the highest mAP of 41% was achieved for locating and identifying tar spot disease lesions. Finally, for UNet semantic segmentation, the mIoU values for leaf and lesion segmentation were 0.80 and 0.28, respectively.
Therefore, traditional color histogram thresholding was used for segmenting the tar spot lesions, along with deep learning techniques, to develop a novel severity estimation framework. After evaluating the three models, the image classification model was deployed on a web application. Additionally, the model was deployed in a smartphone application to enable real-time analysis. Overall, in this study, different deep learning models were trained to accurately identify tar spot disease and estimate its severity using images acquired under complex background conditions. In addition, a disease diagnosis tool was developed to help farmers accurately diagnose tar spot disease of corn.
