A novel weed and crop recognition technique for robotic weed control in a lettuce field with high weed densities
Published by the American Society of Agricultural and Biological Engineers, St. Joseph, Michigan (www.asabe.org)
Citation: 2019 ASABE Annual International Meeting 1900029 (doi:10.13031/aim.201900029)
Authors: Dr. R. Raja, Dr. David C. Slaughter, Dr. Steve Fennimore
Keywords: Machine vision, robotic weed control, weed vs crop.
Abstract. This paper presents a novel crop-signaling technique to classify weeds and crop plants for robotic weed control in leafy vegetables. Distinguishing weeds from crop plants is a major challenge for automatic weed control in fields with high weed densities, because high levels of visual occlusion impair visibility. In this work, we developed an advanced computer vision algorithm to detect and classify in-row weeds and crop plants using crop signaling, a technique developed at the University of California, Davis. It is a simple, low-cost approach in which the crop plants are marked with a unique, machine-readable signaling compound before planting. The crop signal is then used for the rest of the season by the smart machine to automatically detect and distinguish crop plants from weeds for automatic weed control. Although the technique can be adopted for other crops, the algorithm presented in this paper was developed specifically for a vision-based weed-spraying control system for lettuce fields. The algorithm is capable of recognizing and distinguishing weeds from crop plants. Crop detection accuracy was 99%, 98.11% of sprayable weeds were detected, and detection time was 1.2 seconds per pair of images. In this situation, the technique is highly accurate, reliable, and robust compared to other sensor-based techniques.
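The core idea of crop signaling, as described in the abstract, is that only crop plants carry a machine-readable marker, so plant pixels bearing the signal can be labeled crop and unmarked plant pixels labeled weed. The sketch below illustrates that decision logic on a synthetic image; the function name, channel thresholds, and the magenta stand-in for the signaling compound are illustrative assumptions, not the authors' published algorithm or parameters.

```python
import numpy as np

def classify_plants(rgb, signal_thresh=180, green_thresh=100):
    """Toy crop-signaling classifier (thresholds are illustrative assumptions).

    Plant pixels are found with a crude green test; among those, pixels whose
    red and blue channels both exceed `signal_thresh` (a stand-in for a
    fluorescent crop-signal compound) are labeled crop, the rest weed.
    """
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    plant = g > green_thresh                             # vegetation mask
    signal = (r > signal_thresh) & (b > signal_thresh)   # simulated crop signal
    crop_mask = plant & signal                           # marked plant -> crop
    weed_mask = plant & ~signal                          # unmarked plant -> weed
    return crop_mask, weed_mask

# Synthetic 4x4 scene: one "signaled" crop pixel, one unmarked weed pixel.
img = np.zeros((4, 4, 3), dtype=np.uint8)
img[1, 1] = (200, 150, 200)   # crop: green foliage plus magenta-like signal
img[2, 2] = (30, 150, 30)     # weed: green foliage only
crop, weed = classify_plants(img)
print(crop[1, 1], weed[2, 2])  # True True
```

A real system would of course work on calibrated camera pairs under controlled illumination, but the crop/weed split still reduces to this per-pixel signal test.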