
Tree Canopy Differentiation Using Instance-aware Semantic Segmentation

Published by the American Society of Agricultural and Biological Engineers, St. Joseph, Michigan www.asabe.org

Citation:  2018 ASABE Annual International Meeting 1801699. (doi:10.13031/aim.201801699)
Authors:   Tiebiao Zhao, Haoyu Niu, Erick de la Rosa, David Doll, Dong Wang, YangQuan Chen
Keywords:   Unmanned aerial vehicles, Canopy classification, Canopy segmentation, Instance-aware semantic segmentation

Abstract. The development of sensors and cameras has made it possible to obtain high-resolution images at very low cost for precision agriculture applications, enabling improved high-throughput phenotyping. For perennial crops, canopy size can help estimate water use, yield, and pesticide requirements. A widely used method estimates canopy size from 3D point-cloud data collected with Light Detection and Ranging (LIDAR). However, LIDAR's use is limited by its high cost and complicated post-processing procedures. Another approach estimates canopy size from 2D images, with canopy classification and detection based on hand-crafted features such as thresholds, shape, and compactness. These features are highly specific, however, which makes the approach sensitive to small changes in the objects, background, or camera settings. To overcome these limitations, we propose classifying and differentiating canopy pixels of pomegranate trees with a fully convolutional neural network for instance-aware semantic segmentation. It not only separates canopy pixels from soil and grass pixels, but also differentiates canopy pixels between neighboring trees. Tests on the validation set showed precision above 90%, and the method is robust to changes in camera settings, lighting conditions, canopy development, and background.
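The key distinction the abstract draws is between semantic segmentation (canopy vs. non-canopy pixels) and instance-aware segmentation (which tree each canopy pixel belongs to). The sketch below is not the paper's network; it is a minimal, assumption-laden illustration of that second step, using connected-component labeling on a synthetic binary canopy mask (as a stand-in for a classifier's output) to assign per-tree instance IDs. A learned instance-segmentation model would additionally handle touching or overlapping canopies, which simple labeling cannot.

```python
import numpy as np
from collections import deque

def label_instances(mask):
    """Label 4-connected components of a binary canopy mask.

    Returns (labels, n): labels[i, j] is the instance id of pixel (i, j)
    (0 = background), and n is the number of distinct canopies found.
    """
    labels = np.zeros(mask.shape, dtype=int)
    n = 0
    h, w = mask.shape
    for i in range(h):
        for j in range(w):
            if mask[i, j] and labels[i, j] == 0:
                # New canopy instance: flood-fill it via BFS.
                n += 1
                labels[i, j] = n
                queue = deque([(i, j)])
                while queue:
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny, nx] and labels[ny, nx] == 0):
                            labels[ny, nx] = n
                            queue.append((ny, nx))
    return labels, n

# Synthetic "classifier output": two separated tree crowns on a 10x10 grid.
mask = np.zeros((10, 10), dtype=bool)
mask[1:4, 1:4] = True   # hypothetical tree 1 (3x3 = 9 pixels)
mask[6:9, 5:9] = True   # hypothetical tree 2 (3x4 = 12 pixels)

labels, n = label_instances(mask)
print(n)                                      # -> 2 distinct canopies
print([(labels == k).sum() for k in (1, 2)])  # -> [9, 12] pixels per canopy
```

Per-instance pixel counts like these are what make downstream canopy-size estimates (and hence water use or yield proxies) tree-specific rather than field-aggregate.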
