Pest-infested Soybean Leaf Image Classification with Deep Learning Techniques for Integrated Pest Management (IPM)
Published by the American Society of Agricultural and Biological Engineers, St. Joseph, Michigan www.asabe.org
Citation: 2022 ASABE Annual International Meeting, Paper No. 2201096 (doi: 10.13031/aim.202201096)
Authors: Chetan Badgujar, Hasib Mansur, Daniel Flippo
Keywords: Deep learning, Image classification, Integrated Pest Management, Machine learning, Soybean, Transfer learning.
Abstract. Approximately 45% of annual food production is lost to pest infestation. To alleviate this, pesticides are used extensively in modern agriculture. Soybean is a major crop in the USA, accounting for 22% of total pesticide consumption. Current U.S. agricultural practice applies pesticides uniformly over the entire field, and this excessive use places an unsustainable chemical load on the environment. Reducing pesticide use is therefore essential to preserve human health and the environment while maintaining the necessary level of food production. Excessive pesticide use can potentially be averted with integrated pest management (IPM); however, crop monitoring, insect pest identification, and severity assessment are critical to IPM success. Identifying the pest type and locating pest outbreak sites in soybean fields is thus the first step in developing an effective pest control management system. This study uses an open-source dataset of insect-damaged soybean leaves collected under realistic weather conditions with cell phone and UAV cameras. The dataset includes 6410 images divided into three classes: healthy plants, plants infested by caterpillars, and plants infested by Diabrotica speciosa. Pretrained deep learning models, namely 1) VGG16, 2) ResNet50, 3) InceptionV3, and 4) DenseNet201, were applied with a transfer learning approach to classify the pest-damaged images accurately. Model performance was evaluated with classification accuracy and confusion matrices. The overall classification accuracy was 88%, 86%, 84%, and 74% for DenseNet201, VGG16, InceptionV3, and ResNet50, respectively. DenseNet201 outperformed the other models in the study and is recommended for the soybean image classification problem.
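To make the transfer-learning setup described in the abstract concrete, the sketch below shows one way to adapt an ImageNet-pretrained DenseNet201 to the three-class soybean leaf problem. The framework (TensorFlow/Keras), input resolution, dropout rate, and optimizer settings are illustrative assumptions, not the configuration reported in the paper.

    # Minimal transfer-learning sketch (assumed framework: TensorFlow/Keras).
    import tensorflow as tf
    from tensorflow.keras import layers, models
    from tensorflow.keras.applications import DenseNet201

    NUM_CLASSES = 3          # healthy, caterpillar-infested, Diabrotica speciosa-infested
    IMG_SIZE = (224, 224)    # assumed input resolution

    # Load DenseNet201 pretrained on ImageNet, without its original classifier head.
    base = DenseNet201(weights="imagenet", include_top=False,
                       input_shape=IMG_SIZE + (3,))
    base.trainable = False   # freeze the convolutional backbone for transfer learning

    # Attach a small classification head for the three soybean leaf classes.
    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dropout(0.3),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])

    model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])

The same pattern applies to VGG16, ResNet50, and InceptionV3 by swapping the imported base model; the confusion-matrix evaluation mentioned in the abstract can be computed from the model's predictions on a held-out test split (e.g., with scikit-learn's confusion_matrix).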