

An Unmanned Aerial Vehicle for Greenhouse Navigation and Video-Based Tomato Phenotypic Data Collection

Published by the American Society of Agricultural and Biological Engineers, St. Joseph, Michigan

Citation:  2021 ASABE Annual International Virtual Meeting, 2100556. (doi:10.13031/aim.202100556)
Authors:   Jing-Heng Lin, Ta-Te Lin
Keywords:   automation, unmanned aerial vehicle, navigation, localization and mapping, phenotyping

Abstract. Despite the rapid development of plant genotyping, progress in plant phenotyping remains the bottleneck for the precise and accurate recording of important agronomic traits. In plant phenotyping, a monitoring system plays a key role in optimizing crop growth and yield. Unlike fixed-point sensors, an unmanned aerial vehicle (UAV) can serve as a complete solution for precise data collection in a greenhouse because of its mobility. However, unlike outdoor agricultural fields, greenhouses are enclosed and usually organized in a complex layout, so GPS data are not precise enough for route planning; in addition, many obstacles are present. A UAV must therefore address these issues with the support of an indoor UAV navigation system. In this paper, a UAV that can perform automatic tomato phenotypic data collection with greenhouse flight navigation support was developed. The UAV consists of a quadcopter equipped with a single-board computer, a depth camera, a Time-of-Flight (ToF) sensor, an optical flow sensor, and a 2-dimensional (2D) Lidar. The flight of the UAV was stabilized using Pixhawk 4, an open-source flight controller with low-noise onboard IMUs. The navigation method comprises two parts: route planning and obstacle avoidance. The single-board computer receives 2D distance scans from the Lidar to carry out simultaneous localization and mapping (SLAM) and obstacle-distance control. The data of the infrared ToF sensor and the optical flow sensor were fused to measure travel speed and visual motion for stabilization purposes. The depth camera was used for phenotypic data collection via depth video streaming; the depth data were used to extract distance information, estimate fruit size, and ignore fruits that were out of range.
The presented approach can be applied not only to greenhouse monitoring and phenotyping, but also to other applications such as farm management and unmanned warehouse automation.
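The abstract notes that depth data were used to estimate fruit size and to ignore fruits that were out of range. A minimal sketch of how such an estimate can be computed is shown below, using the standard pinhole-camera relation between pixel size and real-world size; the focal length and depth thresholds are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch of depth-based fruit-size estimation.
# Assumed values (not from the paper): the depth camera's focal length in
# pixels and the usable working depth range for detections.
FOCAL_LENGTH_PX = 615.0   # assumed focal length of the depth camera, in pixels
MIN_DEPTH_M = 0.3         # assumed minimum usable sensing distance (m)
MAX_DEPTH_M = 2.0         # assumed maximum usable sensing distance (m)

def estimate_fruit_diameter(pixel_diameter, depth_m):
    """Estimate a fruit's real-world diameter (m) from its pixel diameter
    and measured depth. Returns None for detections outside the usable
    depth range, mirroring the 'ignore fruits out of range' step."""
    if not (MIN_DEPTH_M <= depth_m <= MAX_DEPTH_M):
        return None
    # Pinhole model: world_size = pixel_size * depth / focal_length
    return pixel_diameter * depth_m / FOCAL_LENGTH_PX

# Example: a tomato spanning 80 px at 0.6 m depth -> roughly 0.078 m
diameter = estimate_fruit_diameter(80, 0.6)
```

In practice the pixel diameter would come from a fruit detector applied to the color stream, with the depth looked up at the detection's centroid in the aligned depth frame.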
