Active-Laser Scanning and Intelligent Picking for Automated Loading of Agricultural Commodities to Processing Machines
Published by the American Society of Agricultural and Biological Engineers, St. Joseph, Michigan (www.asabe.org)
Citation: 2022 ASABE Annual International Meeting, Paper No. 2200459 (doi: 10.13031/aim.202200459)
Authors: Mohamed Amr Ali, Benjamin Wu, Caleb Wheeler, Dongyi Wang, Yang Tao
Keywords: active laser scanning, artificial intelligence, computer vision, depth imaging, machine vision, range imaging, robotics, smart food manufacturing.
Abstract. Robotic pick-and-place tasks are standard procedures in automated industrial lines, but the operation generally requires prior knowledge of the localization, orientation, and orderly discretization of products. Industrial lines for agricultural commodities are typically random and chaotic, with items arriving in disordered piles, which favors manual labor over automation; the result is low-throughput lines and laborious work for operators. This research develops a novel intelligent vision-guided robotics framework to load agricultural products into a processing machine. Our approach comprises three components: 1) a novel active dual-laser scanning, triangulation-based image acquisition system; 2) intelligent computer vision (CV) algorithms to detect each product instance within a pile; and 3) a 6-DOF robotic arm equipped with a pneumatically operated soft gripper for grasping items. The acquisition system captures color images and reconstructs high-resolution depth maps with sub-millimeter resolution (accuracy of 0.024 mm and standard deviation of 0.104 mm). We show how the unique geometric configuration enables the system to resolve convex piles and concavities. Next, we present CV algorithms for localization, pose estimation, and discretization of objects in disordered piles. We experiment on Chesapeake blue crabs, one of Maryland's most highly valued marine products. For RGB-D instance segmentation, we utilize Mask R-CNN with keypoint detection. Our network achieved high COCO evaluation metrics of 83.4, 100, and 100 for bounding box AP, AP50, and AP75, and 61.20, 100.0, and 78.70 for mask segmentation AP, AP50, and AP75, respectively. The detected coordinates are relayed to the robot for automated loading.
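As an illustration of the triangulation principle behind laser-scanning depth reconstruction (not the paper's actual implementation), the sketch below recovers a per-column height profile from the displacement of a laser stripe in the image. The function and parameter names, and the single-laser, nadir-camera geometry, are assumptions made for illustration; the dual-laser configuration described in the abstract is more involved.

```python
import numpy as np

def stripe_heights(laser_channel, mm_per_px, laser_angle_deg, baseline_row):
    """
    Minimal single-laser triangulation sketch (illustrative only).

    Assumes the camera looks straight down at the conveyor, the laser sheet
    is tilted by `laser_angle_deg` from vertical, and the stripe falls on
    image row `baseline_row` when it hits the empty belt (height = 0).
    `laser_channel` is a 2-D array holding the laser-line intensity image.
    """
    # Locate the stripe: brightest row in each image column (sub-pixel
    # refinement, e.g. a centroid fit, would improve accuracy).
    stripe_rows = np.argmax(laser_channel, axis=0).astype(float)

    # Pixel displacement of the stripe relative to its empty-belt position.
    disp_px = stripe_rows - baseline_row

    # Triangulation: a surface at height h shifts the stripe by
    # h * tan(theta) in world units, so h = displacement / tan(theta).
    disp_mm = disp_px * mm_per_px
    heights_mm = disp_mm / np.tan(np.radians(laser_angle_deg))
    return heights_mm  # one height estimate per image column
```

In a dual-laser arrangement such as the one described, a second stripe viewed from the opposite side would typically supply height estimates for points occluded from the first, which is how convex piles and concavities can be resolved.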