Using images from a handheld camera to detect wheat bacterial leaf streak disease severities
Published by the American Society of Agricultural and Biological Engineers, St. Joseph, Michigan www.asabe.org
Citation: 2021 ASABE Annual International Virtual Meeting, Paper No. 2100112. (doi:10.13031/aim.202100112)
Authors: Nusrat Jahan, Zhao Zhang, Zhaohui Liu, Andrew Friskop, Paulo Flores, Jithin Jose Mathew, Anup Kumar Das
Keywords: deep learning, GoogLeNet, handheld camera, ResNet, segmentation, VGG16, wheat bacterial leaf streak
Abstract. North Dakota ranks among the top states in U.S. wheat production. Wheat diseases can significantly decrease crop yield and grain quality, negatively impacting growers' income. Among these diseases, wheat bacterial leaf streak (WBLS), caused by Xanthomonas translucens, deserves special attention. Despite several disadvantages (e.g., inefficiency, subjectivity, and errors associated with evaluator fatigue), the current mainstream approach to WBLS monitoring remains manual/visual assessment. A desirable alternative would be an automated method, based on images and algorithms, that can accurately identify and classify WBLS severities; recent developments in deep learning (DL) make it a potential candidate for this purpose. This study explored the performance of different DL models on WBLS severity evaluation. High-resolution RGB images of wheat plots were collected on seven days (Rounds), and an automatic procedure was implemented to segment the wheat crop rows from the noisy background (e.g., soil). Plant pathologists walked through the field and visually graded disease severity (Scores 1 to 5), which served as the ground truth. Three DL models (GoogLeNet, ResNet, and VGG16) were tested. GoogLeNet achieved the highest test accuracy, 98% (Round 5); the highest accuracies for ResNet and VGG16 were 95% (Round 4) and 84% (Round 6), respectively. Comparing these three results, GoogLeNet secured the most satisfactory outcome. Hence, images collected by a handheld camera coupled with GoogLeNet constitute a potential automatic method for WBLS severity monitoring, with an accuracy of 98%.