

Scale-Aware Pomegranate Yield Prediction Using UAV Imagery and Machine Learning

Haoyu Niu1, Dong Wang2, Reza Ehsani3, YangQuan Chen3,*


Published in Journal of the ASABE 66(5): 1331-1340 (doi: 10.13031/ja.15041). © 2023 American Society of Agricultural and Biological Engineers.


1Electrical Engineering and Computer Science, University of California, Merced, California, USA.

2Water Management Research Unit, USDA ARS, Parlier, California, USA.

3Mechanical Engineering, University of California, Merced, California, USA.

*Correspondence: yqchen@ieee.org, ychen53@ucmerced.edu

The authors have paid for open access for this article. This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License: https://creativecommons.org/licenses/by-nc-nd/4.0/

Submitted for review on 29 January 2022 as manuscript number ITSC 15041; approved for publication as a Research Article by Associate Editor Dr. Joshua Peschel and Community Editor Dr. Naiqian Zhang of the Information Technology, Sensors, & Control Systems Community of ASABE on 9 April 2023.

Mention of company or trade names is for description only and does not imply endorsement by the USDA. The USDA is an equal opportunity provider and employer.

Highlights

Abstract. Accurately estimating yield is one of the most important research topics in orchard management. Growers need to estimate tree yield at an early stage to make smart field management decisions. However, methods to predict yield at the individual tree level are currently not available because of the complexity and variability of each tree. To improve the accuracy of yield prediction, the authors evaluated the performance of an unmanned aerial vehicle (UAV)-based remote sensing system and machine learning (ML) approaches for tree-level pomegranate yield estimation. A lightweight multispectral camera was mounted on the UAV platform to acquire high-resolution images. Eight features were extracted from the UAV images: the normalized difference vegetation index (NDVI), the green normalized difference vegetation index (GNDVI), the red-edge normalized difference vegetation index (NDVIre), the red-edge triangulated vegetation index (RTVIcore), individual tree canopy size, the modified triangular vegetation index (MTVI2), the chlorophyll index-green (CIg), and the chlorophyll index-rededge (CIre). Coefficients of determination (R2) were calculated between these vegetation indices and tree yield. Classic machine learning approaches were applied to the extracted features to predict yield at the individual tree level. Results showed that the decision tree classifier had the best prediction performance, with an accuracy of 85%. The study demonstrated the potential of using UAV-based remote sensing methods, coupled with ML algorithms, for pomegranate yield estimation. Predicting yield at the individual tree level will enable stakeholders to manage the orchard at different scales, thus improving field management efficiency.

Keywords. Machine learning, Remote sensing, UAV, Vegetation index, Yield prediction.

Due to recurring water shortages in California, many growers have started growing crops with a degree of drought resistance and high economic value, such as pomegranate (Zhang et al., 2017). Evidence suggests that pomegranate trees have strong adaptability to a wide range of soil conditions and climates (Wang et al., 2013; Holland et al., 2009). Some studies also show that pomegranate has great potential for improving human health and preventing deadly diseases such as cancer (Sumner et al., 2005; Lansky and Newman, 2007). Because of the high value of pomegranate trees, there are around 11,000 ha of pomegranate in California, and efficient yield prediction is economically important for pomegranate production. Pomegranate yield estimation can provide critical information for stakeholders and help them make better decisions on field operations.

Determining the yield of field and woody crops is complicated and often inaccurate due to the influence of multiple factors, including genotype and environmental conditions such as soil, irrigation management, and weather (Feng et al., 2020; Zaman et al., 2008). Thus, many researchers have been working on yield prediction using a plethora of approaches (Zhang et al., 2019; Yang et al., 2019; Stateras and Kalivas, 2020; Zhang et al., 2021; Feng et al., 2019). For example, Zhang et al. (2019) developed statistical models using the stochastic gradient boosting method for early and mid-season yield prediction of almond in the Central Valley of California. Multiple features were extracted from the remote sensing images, such as canopy cover percentage (CCP) and vegetation indices (VIs). The results demonstrated the potential of automatic almond yield prediction at the individual orchard level. Yang et al. (2021) estimated corn yield using hyperspectral imagery and convolutional neural networks (CNNs). Results showed that the integrated spectral and color image-based model had a classification accuracy of 75% for corn yield prediction.

Figure 1. The pomegranate field was randomly divided into 16 equal blocks, with four replications, to test four irrigation levels. The irrigation volumes were 35%, 50%, 75%, and 100% of ETc, which was measured by the weighing lysimeter in the field.

Recently, unmanned aerial vehicles (UAVs) with lightweight payloads have been used as reliable remote sensing platforms by many researchers to monitor crop status temporally and spatially (Niu et al., 2021; Niu et al., 2020a; Turner et al., 2014; Swain et al., 2010). Equipped with lightweight payloads such as RGB cameras, multispectral cameras, and thermal cameras, UAV-based remote sensing systems can provide low-cost and high-resolution images for data analysis. For example, Stateras and Kalivas (2020) defined the geometry of individual olive tree canopies and developed a non-linear forecasting model of annual production in an olive grove. A digital terrain model (DTM) and a digital surface model (DSM) were generated from high-resolution multispectral imagery. Results showed that the forecasting model could predict the olive yield in kilograms per tree.

However, few studies have investigated the correlation between tree canopy characteristics and yield prediction at the individual tree level. Thus, this article aims to estimate pomegranate tree yield using eight different tree canopy characteristics: the normalized difference vegetation index (NDVI), the green normalized difference vegetation index (GNDVI), the red-edge normalized difference vegetation index (NDVIre), the red-edge triangulated vegetation index (RTVIcore), the canopy size, the modified triangular vegetation index (MTVI2), the chlorophyll index-green (CIg), and the chlorophyll index-rededge (CIre).

The objectives of this article were: (1) to investigate the correlation between pomegranate yield and eight different features extracted from UAV high-resolution images, and (2) to establish a scale-aware yield prediction model using machine learning approaches. Estimating the yield with scale-aware models will help stakeholders make better decisions for field management at the block or orchard level in the future.

Materials and Methods

Experimental Field and Ground Data Collection

This study was conducted in a pomegranate research field at the USDA-ARS, San Joaquin Valley Agricultural Sciences Center (36.594 °N, 119.512 °W), Parlier, California, 93648, USA. The soil type is a Hanford fine sandy loam (coarse-loamy, mixed, thermic Typic Xerorthents). The San Joaquin Valley has a Mediterranean climate with hot and dry summers. Rainfall is insignificant during the growing season, and irrigation is the only source of water for pomegranate growth (Wang et al., 2013). Pomegranate (Punica granatum L., cv ‘Wonderful’) was planted in 2010 with a 5 m spacing between rows and a 2.75 m within-row tree spacing in a 1.3 ha field (Zhang et al., 2017).

The pomegranate field was randomly divided into 16 equal blocks with four replications to test four irrigation levels (fig. 1). The irrigation volumes were 35%, 50%, 75%, and 100% of crop evapotranspiration (ETc), measured by the weighing lysimeter in the field. There were five yield sampling trees in each block, 80 sampling trees in total, marked with red labels in figure 2.

UAV Platform and Imagery Data Acquisition

The UAV-based remote sensing system consisted of a UAV platform called “Hover”, and a multispectral camera (Rededge M, Micasense, Seattle, WA, USA) (fig. 3). The Rededge M has five different bands, which are Blue (B, 475 nm), Green (G, 560 nm), Red (R, 668 nm), Red-Edge (RedEdge, 717 nm), and Near Infrared (NIR, 840 nm). With a Downwelling Light Sensor (DLS), the Rededge M can measure the ambient light during a flight mission and record the light information in the metadata of the images. After the camera calibration, the information detected by the DLS can be used to correct lighting changes during a flight, such as changes in cloud cover during a UAV flight.

Figure 2. There were five sampling trees in each block, 80 yield sampling trees in total, marked with red labels.

Flight missions were designed using Mission Planner, an open-source software program developed by the ArduPilot Development Team and Community. The UAV was set to fly at a speed of 5 m/s at a flight height of 60 m above ground level (AGL), giving a ground resolution of 4 cm/pixel. Flight missions were conducted between 12 p.m. and 1 p.m. to ensure optimal lighting conditions. To successfully stitch the UAV images together using Agisoft Metashape (Agisoft LLC, Russia), the image overlap was set to 75% forward and 70% sideward.
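As a rough illustration (not from the paper) of how these settings translate into camera trigger spacing and flight-line spacing, the sketch below assumes a 1280 × 960 pixel Rededge M frame with the 960-pixel dimension oriented along the flight track; the ground resolution, overlaps, and speed are taken from the text above.

```python
# A rough sketch (not from the paper) relating the mission settings above to
# camera trigger spacing and flight-line spacing. Assumption: the Rededge M
# frame is 1280 x 960 pixels, with the 960-pixel side along the flight track.

gsd_m = 0.04                       # ground resolution at 60 m AGL (m/pixel, from the text)
img_w_px, img_h_px = 1280, 960     # assumed frame size
forward_overlap, side_overlap = 0.75, 0.70
speed_mps = 5.0                    # UAV ground speed (m/s, from the text)

footprint_along_m = img_h_px * gsd_m     # ~38.4 m along the flight line
footprint_across_m = img_w_px * gsd_m    # ~51.2 m across the flight line

trigger_spacing_m = (1 - forward_overlap) * footprint_along_m    # ~9.6 m between photos
trigger_interval_s = trigger_spacing_m / speed_mps               # ~1.9 s between photos
line_spacing_m = (1 - side_overlap) * footprint_across_m         # ~15.4 m between flight lines

print(f"photo every {trigger_spacing_m:.1f} m (~{trigger_interval_s:.1f} s); "
      f"flight lines {line_spacing_m:.1f} m apart")
```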

(a) The UAV. (b) The Rededge M camera.
Figure 3. The UAV-based remote sensing system consisted of a UAV platform, called “Hover”, and a multispectral camera (Rededge M, Micasense, Seattle, WA, USA). The Rededge M has five different bands: Blue (B, 475 nm), Green (G, 560 nm), Red (R, 668 nm), Red-Edge (RedEdge, 717 nm), and Near Infrared (NIR, 840 nm).

UAV Image Feature Extraction

The orthomosaic image was used to extract the image features defined in table 1. Seven vegetation indices were extracted from the multispectral orthomosaic image acquired by the UAV platform; together with individual tree canopy size, they formed the eight features used in this study. All of these vegetation indices and features have been commonly used for monitoring plant health and for estimating nitrogen, biomass, and yield (Dalezios et al., 2001; Yawata et al., 2019; Schwalbert et al., 2018; Magney et al., 2017; Din et al., 2017; Gong et al., 2018).

The Normalized Difference Vegetation Index (NDVI)

The NDVI has been commonly used for vegetation monitoring, such as water stress detection (Zhao et al., 2017a), crop yield assessment (Zhao et al., 2017b), and ET estimation (Niu et al., 2020b). The NDVI is a standardized measure of healthy vegetation, allowing the generation of an image displaying greenness (relative biomass). It takes advantage of the contrast between two bands: chlorophyll pigment absorption in the red band (R) and the high reflectivity of plant material in the near-infrared band (NIR). A high NDVI indicates a high level of photosynthetic activity. The NDVI is usually calculated by

\[ \text{NDVI} = \frac{NIR - R}{NIR + R} \tag{1} \]

where NIR and R are the reflectance of near-infrared and red band, respectively.
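As a minimal illustration (not the authors' processing code), eq. 1 can be evaluated per pixel on co-registered reflectance rasters loaded as NumPy arrays:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Per-pixel NDVI (eq. 1) from co-registered reflectance arrays."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    # Background / no-data pixels (zero reflectance in both bands) stay NaN.
    out = np.full(nir.shape, np.nan)
    np.divide(nir - red, denom, out=out, where=denom > 0)
    return out
```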

The Green Normalized Difference Vegetation Index (GNDVI)

The GNDVI is commonly used to estimate photosynthetic activity and determine water and nitrogen uptake into the plant canopy (Yawata et al., 2019; Feng et al., 2020). The GNDVI is calculated by

\[ \text{GNDVI} = \frac{NIR - G}{NIR + G} \tag{2} \]

where G stands for the reflectance of the green band.

The Red-Edge Normalized Difference Vegetation Index (NDVIre)

The NDVIre is a method for estimating vegetation health using the red-edge band. The chlorophyll concentration is usually higher at the late stages of plant growth; the NDVIre can then be used to map the within-field variability of nitrogen foliage to help better understand the fertilizer requirements of crops (Schwalbert et al., 2018; Narin and Abdikan, 2020). The NDVIre is calculated by

\[ \text{NDVI}_{re} = \frac{NIR - RedEdge}{NIR + RedEdge} \tag{3} \]

where RedEdge is the reflectance of the red-edge band.

Table 1. UAV image features used in this study.
Features | Equations | Related Traits | References
NDVI | eq. 1 | Yield, leaf chlorophyll content, biomass | Dalezios et al., 2001; Ren et al., 2008; Feng et al., 2020
GNDVI | eq. 2 | Yield, leaf chlorophyll content, biomass | Yawata et al., 2019; Feng et al., 2020
NDVIre | eq. 3 | Nitrogen, yield | Schwalbert et al., 2018; Narin and Abdikan, 2020
RTVIcore | eq. 4 | Leaf area index, biomass, nitrogen | Magney et al., 2017; Khosravi and Alavipanah, 2019
MTVI2 | eq. 5 | Leaf chlorophyll content | Din et al., 2017; Haboudane et al., 2004
CIg | eq. 6 | Yield, leaf chlorophyll content | Gong et al., 2018; Duan et al., 2019
CIre | eq. 7 | Yield, leaf chlorophyll content | Duan et al., 2021; Peng et al., 2019

The Red-Edge Triangulated Vegetation Index (RTVIcore)

The RTVIcore is usually used for estimating the leaf area index and biomass (Magney et al., 2017; Khosravi and Alavipanah, 2019). It uses the reflectance in the NIR, RedEdge, and G spectral bands, calculated by

\[ \text{RTVI}_{core} = 100\,(NIR - RedEdge) - 10\,(NIR - G) \tag{4} \]

The Modified Triangular Vegetation Index (MTVI2)

The MTVI2 usually detects the leaf chlorophyll content at the canopy scale and is relatively insensitive to the leaf area index (Din et al., 2017). MTVI2 uses the reflectance in the G, R, and NIR bands, calculated by

\[ \text{MTVI2} = \frac{1.5\,[1.2\,(NIR - G) - 2.5\,(R - G)]}{\sqrt{(2\,NIR + 1)^2 - (6\,NIR - 5\sqrt{R}) - 0.5}} \tag{5} \]

The Green Chlorophyll Index (CIg)

The CIg is for estimating the chlorophyll content in leaves using the ratio of the reflectivity in the NIR and G bands (Gong et al., 2018), which is calculated by

\[ \text{CI}_g = \frac{NIR}{G} - 1 \tag{6} \]

The Red-Edge Chlorophyll Index (CIre)

The CIre is for estimating the chlorophyll content in leaves using the ratio of the reflectivity in the NIR and RedEdge bands (Gong et al., 2018), which is calculated by

\[ \text{CI}_{re} = \frac{NIR}{RedEdge} - 1 \tag{7} \]
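For reference, the compact sketch below (assumed array inputs, not the authors' code) evaluates eqs. 1-7 on co-registered reflectance bands; the small `eps` term is an assumption added to avoid division by zero on background pixels.

```python
import numpy as np

def vegetation_indices(nir, red, green, rededge):
    """Per-pixel vegetation indices (eqs. 1-7) from co-registered reflectance bands."""
    nir, red, green, rededge = (np.asarray(b, dtype=np.float64)
                                for b in (nir, red, green, rededge))
    eps = 1e-9  # guards against division by zero on background pixels
    return {
        "NDVI":     (nir - red) / (nir + red + eps),                   # eq. 1
        "GNDVI":    (nir - green) / (nir + green + eps),               # eq. 2
        "NDVIre":   (nir - rededge) / (nir + rededge + eps),           # eq. 3
        "RTVIcore": 100 * (nir - rededge) - 10 * (nir - green),        # eq. 4
        "MTVI2":    1.5 * (1.2 * (nir - green) - 2.5 * (red - green))
                    / np.sqrt((2 * nir + 1) ** 2
                              - (6 * nir - 5 * np.sqrt(red)) - 0.5),   # eq. 5
        "CIg":      nir / (green + eps) - 1,                           # eq. 6
        "CIre":     nir / (rededge + eps) - 1,                         # eq. 7
    }
```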

The Machine Learning Methods

For simplicity, the authors treated the yield estimation problem in this article as a classification problem. Based on the individual tree level yield, all the trees were classified into different yield classes. Then, several ML classifiers were adopted to evaluate the performance of pomegranate yield estimation, such as “Random Forest” (Breiman, 2001), “AdaBoost” (Kingma and Ba, 2014), “Nearest Neighbors” (Goldberger et al., 2004), and “Decision Tree” (Loh, 2011).

The “Random Forest” classifier is a meta-estimator that fits several decision tree classifiers on various sub-samples of the dataset and uses averaging to improve predictive accuracy and control overfitting. An “AdaBoost” classifier is also a meta-estimator: it begins by fitting a classifier on the original dataset and then fits additional copies of the classifier on the same dataset, but with the weights of incorrectly classified instances adjusted so that subsequent classifiers focus more on difficult cases.

The “Nearest Neighbors” method finds a predefined number of training samples closest in distance to a new point and predicts its label from them. The number of samples can be a user-defined constant (k-nearest neighbor learning) or can vary based on the local density of points (radius-based neighbor learning). Despite its simplicity, the nearest neighbors method has been successfully applied to many research problems, such as handwritten digit classification. As a non-parametric method, it is often successful in classification situations where the decision boundary is very irregular.

“Decision Trees” are also non-parametric supervised learning methods commonly adopted for classification problems. The objective is to create a model that predicts the value of a target variable by learning simple decision rules inferred from the data features. A tree can be seen as a piecewise constant approximation. “Decision Trees” use a white-box model: if a given situation is observable in the model, the condition that produced it can be explained by simple Boolean logic. In contrast, results from a black-box model, such as an artificial neural network, may be more challenging to interpret.
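A minimal sketch of how such a set of scikit-learn classifiers can be assembled for comparison is shown below; the hyperparameters are illustrative assumptions, not the authors' settings.

```python
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Illustrative hyperparameters only; the authors' exact settings are not given here.
classifiers = {
    "Decision Trees": DecisionTreeClassifier(max_depth=5, random_state=0),
    "k-nearest neighbor": KNeighborsClassifier(n_neighbors=3),
    "Support Vector Classification": SVC(kernel="rbf", C=1.0),
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "AdaBoost": AdaBoostClassifier(n_estimators=50, random_state=0),
    "Gaussian Process": GaussianProcessClassifier(random_state=0),
    "Gaussian Naive Bayes": GaussianNB(),
    "Quadratic Discriminant Analysis": QuadraticDiscriminantAnalysis(),
}
```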

Results and Discussion

The Pomegranate Yield Performance in 2019

The pomegranate fruit was harvested from the 80 sampling trees in 2019. As mentioned earlier, there were four different irrigation levels in the field: 35%, 50%, 75%, and 100% of ETc. The total fruit weight per tree (kg) was recorded, and a boxplot was generated for each irrigation level (fig. 4). For the 35% irrigation treatment, the total fruit weight per tree was 23.92 kg, the lowest yield. For the 50% irrigation treatment, the total fruit weight per tree was 27.63 kg. For the 75% and 100% irrigation treatments, the total fruit weight per tree was 29.84 kg and 34.85 kg, respectively. The pomegranate yield performance at the USDA field is consistent with previous research (Zhang et al., 2017). Since the authors have the yield data for each sampling tree, machine learning algorithms were used for individual tree level yield estimation with the eight image features mentioned earlier.

Figure 4. The pomegranate yield performance at the individual tree level in 2019. For the 35% irrigation treatment, the total fruit weight per tree was 23.92 kg, which produced the lowest yield. For the 50% irrigation treatment, the total fruit weight per tree was 27.63 kg. For 75% and 100% irrigation treatment, the total fruit weight per tree was 29.84 kg and 34.85 kg, respectively.

The Correlation Between the Image Features and Pomegranate Yield

Before the vegetation indices were used as input features for the ML algorithms, the authors first investigated the coefficient of determination (R2) between each vegetation index and the pomegranate yield (fig. 5). Each dot represents a mean value of 20 sampling trees. According to the results, the NDVIre and CIre had relatively higher R2 values of 0.6963 and 0.6772, respectively. The NDVI and the pomegranate yield had an R2 of 0.6273. The GNDVI and the yield had an R2 of 0.5166. The MTVI2 and CIg had R2 values of 0.4293 and 0.5059, respectively. The RTVIcore had the lowest R2 of 0.1216. The canopy size had an R2 of 0.6192. These findings emphasize the value of vegetation indices for yield estimation; combining different vegetation indices can enhance certain characteristics or details of the trees. Several researchers have reported that vegetation indices can be used for yield estimation (Dalezios et al., 2001; Ren et al., 2008; Yawata et al., 2019; Schwalbert et al., 2018; Magney et al., 2017; Din et al., 2017; Duan et al., 2019, 2021; Feng et al., 2020). The performance of the ML algorithms on yield prediction is discussed in the following section.
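As an illustration of this step, the sketch below (hypothetical column names, not the authors' code) computes R2 between each image feature and yield from a simple linear fit.

```python
import pandas as pd
from scipy.stats import linregress

def feature_yield_r2(df: pd.DataFrame, features, yield_col="yield_kg"):
    """R2 of a simple linear fit between each image feature and tree yield."""
    return {f: linregress(df[f], df[yield_col]).rvalue ** 2 for f in features}

# Hypothetical usage, assuming one row per tree (or per group of sampling trees):
# r2 = feature_yield_r2(df, ["NDVI", "GNDVI", "NDVIre", "RTVIcore",
#                            "MTVI2", "CIg", "CIre", "canopy_size"])
```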

The ML Algorithm Performance on Yield Estimation

The pomegranate yield data (80 sampling trees) were split into 75% for training and 25% for testing using the train_test_split method. Considering that the dataset was relatively small, the authors used K-fold cross-validation, splitting the training dataset into K folds, then making predictions and evaluating each fold using an ML model trained on the remaining folds (Geron, 2019). For yield prediction, the classes were defined as low yield and high yield based on a threshold value of 25 kg per tree. To evaluate the trained models, a confusion matrix was used to compare the performance of the different classifiers. A confusion matrix is a summary of prediction results on a classification problem: correct and incorrect predictions are tallied with count values and broken down by class. The confusion matrix provides insight not only into the errors being made by a classifier but, more importantly, into the types of errors being made. “True label” means the ground truth of the yield. “Predicted label” identifies the individual tree yield predicted by the trained model.
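The following is a minimal sketch of this evaluation workflow with scikit-learn, using random placeholder data in place of the real 80-tree feature table; the threshold, split ratio, and metrics follow the text above, while the specific arrays and settings are assumptions.

```python
import numpy as np
from sklearn.metrics import classification_report, confusion_matrix
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.random((80, 8))              # placeholder for the 80 trees x 8 image features
y_kg = rng.uniform(15.0, 40.0, 80)   # placeholder per-tree yield (kg)
y = (y_kg >= 25.0).astype(int)       # 0 = low yield, 1 = high yield (25 kg threshold)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=0.75, random_state=0)

clf = DecisionTreeClassifier(random_state=0)
cv_scores = cross_val_score(clf, X_train, y_train, cv=5)   # K-fold cross-validation
clf.fit(X_train, y_train)
y_pred = clf.predict(X_test)

print("mean CV accuracy:", cv_scores.mean())
print(confusion_matrix(y_test, y_pred))        # rows: true label, columns: predicted label
print(classification_report(y_test, y_pred))   # precision, recall, F1 per class (cf. table 2)
```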

The trained ML classifiers had distinct test performance for individual tree level yield prediction. The “Decision Trees” classifier had the highest accuracy of 0.85. Table 2 and figure 6 show the details of the “Decision Trees” method, a non-parametric supervised learning method commonly adopted for classification problems. The objective was to create an ML model that predicted the value of a target variable by learning simple decision rules inferred from the data features (fig. 6). A tree can be seen as a piecewise constant approximation. “Decision Trees” usually use a white-box model, which means that, if a given situation is observable in the model, the condition that produced it can be explained by simple Boolean logic. As shown in figure 6, the “Decision Trees” ML model started at the root node; if the NDVIre value was less than 0.334, the prediction process moved to the leaf child node. In this case, the model predicted that the input was a low-yield pomegranate tree. A node’s gini attribute measures its impurity: a node is “pure” (gini = 0) if all the training instances it applies to belong to the same class.
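For reference, the learned split rules of a fitted scikit-learn tree (such as the NDVIre threshold in fig. 6) can be printed as text; the sketch below assumes `clf` from the previous sketch and a hypothetical feature ordering.

```python
from sklearn.tree import export_text

# Assumes `clf` is the fitted DecisionTreeClassifier from the previous sketch and
# that the feature columns follow this (hypothetical) order.
feature_names = ["NDVI", "GNDVI", "NDVIre", "RTVIcore",
                 "MTVI2", "CIg", "CIre", "canopy_size"]
print(export_text(clf, feature_names=feature_names))
# Each node line shows a split rule (e.g., "NDVIre <= 0.334" in the paper's tree);
# a leaf whose samples all share one class has gini impurity 0 and is "pure".
```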

For the other classifiers, the k-nearest neighbor had a test accuracy of 0.80. “Support Vector Classification (SVC)” had an accuracy of 0.70. The “Random Forest” had a test accuracy of 0.65. The “AdaBoost”, “Gaussian Process”, and “Gaussian Naive Bayes” had accuracies of 0.80, 0.75, and 0.60, respectively. The “Quadratic Discriminant Analysis (QDA)” also had a prediction accuracy of 0.80 (table 3 and fig. 7).

Conclusion

The aim of the present research was individual tree level yield prediction in a pomegranate field using a UAV-based remote sensing method. The authors collected yield data and calculated vegetation indices derived from the high-resolution UAV imagery. Then, machine learning algorithms were adopted for yield prediction classification. The research results showed that the best yield classification accuracy

Figure 5. The correlation between the vegetation indices and yield. NDVIre and CIre had relatively higher R2 values of 0.6963 and 0.6772, respectively. The NDVI and the pomegranate yield had an R2 of 0.6273. The GNDVI and the yield had an R2 of 0.5166. The MTVI2 and CIg had R2 values of 0.4293 and 0.5059, respectively. The RTVIcore had the lowest R2 of 0.1216. The canopy size had an R2 of 0.6192.
Figure 6. The “Decision Trees” method training process. “Decision Trees” usually use a white-box model, which means that, if a given situation is observable in the model, the condition that produced it can be explained by simple Boolean logic. As shown here, the “Decision Trees” ML model started at the root node; if the NDVIre value was less than 0.334, the prediction process moved to the leaf child node. In this case, the model predicted that the input was a low-yield pomegranate tree. A node’s gini attribute measures its impurity: a node is “pure” (gini = 0) if all the training instances it applies to belong to the same class.
Table 2. The “Decision Tree” performance on yield prediction. “NA” stands for “Not Available”.
Yield Prediction | Precision | Recall | F1-Score
Low yield | 0.92 | 0.85 | 0.88
High yield | 0.75 | 0.86 | 0.80
Accuracy | NA | NA | 0.85
Macro avg | 0.83 | 0.85 | 0.84
Weighted avg | 0.86 | 0.85 | 0.85

Table 3. The performance of ML methods on yield prediction.
ML Methods | Prediction Accuracy
“Decision Trees” | 0.85
k-nearest neighbor | 0.80
“Support Vector Classification” | 0.70
“Random Forest” | 0.65
“AdaBoost” | 0.80
“Gaussian Process” | 0.75
“Gaussian Naive Bayes” | 0.60
“Quadratic Discriminant Analysis” | 0.80

was 85% when the “Decision Trees” method was adopted. For the other ML models’ test performance, the k-nearest neighbor had an accuracy of 0.80. “Support Vector Classification (SVC)” had an accuracy of 0.70. The “Random Forest” had a test accuracy of 0.65. The “AdaBoost”, “Gaussian Process”, and “Gaussian Naive Bayes” had accuracies of 0.80, 0.75, and 0.60, respectively. The “Quadratic Discriminant Analysis (QDA)” also had a prediction accuracy of 0.80. The pomegranate yield information was reflected in the vegetation index data. The research results support the idea that vegetation indices can be used for yield estimation. Furthermore, the findings of this research provide insights for scale-aware yield prediction using phenotyping and machine learning technology.

Figure 7. The comparison of the eight different ML classifiers on individual tree level yield prediction. A confusion matrix is a summary of prediction results on a classification problem: correct and incorrect predictions are tallied with count values and broken down by class. The confusion matrix provides insight not only into the errors being made by a classifier but, more importantly, into the types of errors being made. “True label” means the ground truth of the yield. “Predicted label” identifies the individual tree yield predicted by the trained model. The value 0 means low yield; the value 1 means high yield.

Acknowledgments

Thanks go to Stella Zambrzuski for collecting field measurements and yield data. Thanks go to Dong Sen Yan, Joshua Ahmed, and Christopher Currier for flying drones. For reproducibility, the authors shared the demo code and dataset on GitHub: https://github.com/niuhaoyu16/scale-aware-yield-prediction. Feel free to contact us if you have any questions.

References

Breiman, L. (2001). Random forests. Mach. Learn., 45, 5-32. https://doi.org/10.1023/A:1010933404324

Dalezios, N. R., Domenikiotis, C., Loukas, A., Tzortzios, S. T., & Kalaitzidis, C. (2001). Cotton yield estimation based on NOAA/AVHRR produced NDVI. Phys. Chem. Earth Part B, 26(3), 247-251. https://doi.org/10.1016/S1464-1909(00)00247-1

Din, M., Zheng, W., Rashid, M., Wang, S., & Shi, Z. (2017). Evaluating hyperspectral vegetation indices for leaf area index estimation of Oryza sativa L. at diverse phenological stages. Front. Plant Sci., 8, 820. https://doi.org/10.3389/fpls.2017.00820

Duan, B., Fang, S., Gong, Y., Peng, Y., Wu, X., & Zhu, R. (2021). Remote estimation of grain yield based on UAV data in different rice cultivars under contrasting climatic zone. Field Crops Res., 267, 108148. https://doi.org/10.1016/j.fcr.2021.108148

Duan, B., Fang, S., Zhu, R., Wu, X., Wang, S., Gong, Y., & Peng, Y. (2019). Remote estimation of rice yield with unmanned aerial vehicle (UAV) data and spectral mixture analysis. Front. Plant Sci., 10, 204. https://doi.org/10.3389/fpls.2019.00204

Feng, A., Zhang, M., Sudduth, K. A., Vories, E. D., & Zhou, J. (2019). Cotton yield estimation from UAV-based plant height. Trans. ASABE, 62(2), 393-404. https://doi.org/10.13031/trans.13067

Feng, A., Zhou, J., Vories, E. D., Sudduth, K. A., & Zhang, M. (2020). Yield estimation in cotton using UAV-based multi-sensor imagery. Biosyst. Eng., 193, 101-114. https://doi.org/10.1016/j.biosystemseng.2020.02.014

Geron, A. (2019). Hands-on machine learning with Scikit-Learn, Keras, and TensorFlow: Concepts, tools, and techniques to build intelligent systems. Sebastopol, CA: O’Reilly Media.

Goldberger, J., Hinton, G. E., Roweis, S., & Salakhutdinov, R. R. (2004). Neighbourhood components analysis. In L. Saul, Y. Weiss, & L. Bottou (Ed.), Proc. Advances in Neural Information Processing Systems 17.

Gong, Y., Duan, B., Fang, S., Zhu, R., Wu, X., Ma, Y., & Peng, Y. (2018). Remote estimation of rapeseed yield with unmanned aerial vehicle (UAV) imaging and spectral mixture analysis. Plant Methods, 14. https://doi.org/10.1186/s13007-018-0338-z

Haboudane, D., Miller, J. R., Pattey, E., Zarco-Tejada, P. J., & Strachan, I. B. (2004). Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: Modeling and validation in the context of precision agriculture. Remote Sens. Environ., 90(3), 337-352. https://doi.org/10.1016/j.rse.2003.12.013

Holland, D., Hatib, K., & Bar-Ya’akov, I. (2009). Chapter 2 - Pomegranate: Botany, horticulture, breeding. In J. Janick (Ed.), Horticultural reviews (Vol. 35, pp. 127-191). Hoboken, NJ: Wiley. https://doi.org/10.1002/9780470593776.ch2

Khosravi, I., & Alavipanah, S. K. (2019). A random forest-based framework for crop mapping using temporal, spectral, textural and polarimetric observations. Int. J. Remote Sens., 40(18), 7221-7251. https://doi.org/10.1080/01431161.2019.1601285

Kingma, D. P., & Ba, J. (2014). Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980. https://doi.org/10.48550/arXiv.1412.6980

Lansky, E. P., & Newman, R. A. (2007). Punica granatum (pomegranate) and its potential for prevention and treatment of inflammation and cancer. J. Ethnopharmacol., 109(2), 177-206. https://doi.org/10.1016/j.jep.2006.09.006

Loh, W.-Y. (2011). Classification and regression trees. Wiley Interdiscip. Rev: Data Min. Knowl. Discov., 1(1), 14-23. https://doi.org/10.1002/widm.8

Magney, T. S., Eitel, J. U., & Vierling, L. A. (2017). Mapping wheat nitrogen uptake from RapidEye vegetation indices. Precis. Agric., 18, 429-451. https://doi.org/10.1007/s11119-016-9463-8

Narin, O. G., & Abdikan, S. (2020). Monitoring of phenological stage and yield estimation of sunflower plant using Sentinel-2 satellite images. Geocarto Int., 37(5), 1-15. https://doi.org/10.1080/10106049.2020.1765886

Niu, H., Hollenbeck, D., Zhao, T., Wang, D., & Chen, Y. (2020a). Evapotranspiration estimation with small UAVs in precision agriculture. Sensors, 20(22), 6427. https://doi.org/10.3390/s20226427

Niu, H., Wang, D., & Chen, Y. (2020b). Estimating actual crop evapotranspiration using deep stochastic configuration networks model and UAV-based crop coefficients in a pomegranate orchard. Proc. Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V.11414, p. 114140C. International Society for Optics and Photonics. https://doi.org/10.1117/12.2558221

Niu, H., Zhao, T., Wei, J., Wang, D., & Chen, Y. (2021). Reliable tree-level evapotranspiration estimation of pomegranate trees using lysimeter and UAV multispectral imagery. Proc. 2021 IEEE Conference on Technologies for Sustainability (SusTech) (pp. 1-6). IEEE. https://doi.org/10.1109/SusTech51236.2021.9467413

Peng, Y., Zhu, T., Li, Y., Dai, C., Fang, S., Gong, Y.,... Liu, K. (2019). Remote prediction of yield based on LAI estimation in oilseed rape under different planting methods and nitrogen fertilizer applications. Agric. For. Meteorol., 271, 116-125. https://doi.org/10.1016/j.agrformet.2019.02.032

Ren, J., Chen, Z., Zhou, Q., & Tang, H. (2008). Regional yield estimation for winter wheat with MODIS-NDVI data in Shandong, China. Int. J. Appl. Earth Obs. Geoinf., 10(4), 403-413. https://doi.org/10.1016/j.jag.2007.11.003

Schwalbert, R. A., Amado, T. J., Nieto, L., Varela, S., Corassa, G. M., Horbe, T. A.,... Ciampitti, I. A. (2018). Forecasting maize yield at field scale based on high-resolution satellite imagery. Biosyst. Eng., 171, 179-192. https://doi.org/10.1016/j.biosystemseng.2018.04.020

Stateras, D., & Kalivas, D. (2020). Assessment of olive tree canopy characteristics and yield forecast model using high resolution UAV imagery. Agriculture, 10(9), 385. https://doi.org/10.3390/agriculture10090385

Sumner, M. D., Elliott-Eller, M., Weidner, G., Daubenmier, J. J., Chew, M. H., Marlin, R.,... Ornish, D. (2005). Effects of pomegranate juice consumption on myocardial perfusion in patients with coronary heart disease. Am. J. Cardiol., 96(6), 810-814. https://doi.org/10.1016/j.amjcard.2005.05.026

Swain, K. C., Thomson, S. J., & Jayasuriya, H. P. (2010). Adoption of an unmanned helicopter for low-altitude remote sensing to estimate yield and total biomass of a rice crop. Trans. ASABE, 53(1), 21-27. https://doi.org/10.13031/2013.29493

Turner, D., Lucieer, A., Malenovsky, Z., King, D. H., & Robinson, S. A. (2014). Spatial co-registration of ultra-high resolution visible, multispectral and thermal images acquired with a micro-UAV over Antarctic moss beds. Remote Sens., 6(5), 4003-4024. https://doi.org/10.3390/rs6054003

Wang, D., Ayars, J. E., Tirado-Corbala, R., Makus, D., Phene, C. J., & Phene, R. (2013). Water and nitrogen management of young and maturing pomegranate trees. Proc. III Int. Symp. on Pomegranate and Minor Mediterranean Fruits 1089, (pp. 395-401). https://doi.org/10.17660/ActaHortic.2015.1089.53

Yang, Q., Shi, L., Han, J., Zha, Y., & Zhu, P. (2019). Deep convolutional neural networks for rice grain yield estimation at the ripening stage using UAV-based remotely sensed images. Field Crops Res., 235, 142-153. https://doi.org/10.1016/j.fcr.2019.02.022

Yang, W., Nigon, T., Hao, Z., Paiao, G. D., Fernandez, F. G., Mulla, D., & Yang, C. (2021). Estimation of corn yield based on hyperspectral imagery and convolutional neural network. Comput. Electron. Agric., 184, 106092. https://doi.org/10.1016/j.compag.2021.106092

Yawata, K., Yamamoto, T., Hashimoto, N., Ishida, R., & Yoshikawa, H. (2019). Mixed model estimation of rice yield based on NDVI and GNDVI using a satellite image. Proc. Remote Sensing for Agriculture, Ecosystems, and Hydrology XXI.11149, p. 1114918. International Society for Optics and Photonics. https://doi.org/10.1117/12.2532108

Zaman, Q. U., Schumann, A. W., Percival, D. C., & Gordon, R. J. (2008). Estimation of wild blueberry fruit yield using digital color photography. Trans. ASABE, 51(5), 1539-1544. https://doi.org/10.13031/2013.25302

Zhang, H., Wang, D., Ayars, J. E., & Phene, C. J. (2017). Biophysical response of young pomegranate trees to surface and sub-surface drip irrigation and deficit irrigation. Irrig. Sci., 35, 425-435. https://doi.org/10.1007/s00271-017-0551-y

Zhang, X., Toudeshki, A., Ehsani, R., Li, H., Zhang, W., & Ma, R. (2021). Yield estimation of citrus fruit using rapid image processing in natural background. Smart Agric. Technol., 2, 100027. https://doi.org/10.1016/j.atech.2021.100027

Zhang, Z., Jin, Y., Chen, B., & Brown, P. (2019). California almond yield prediction at the orchard level with a machine learning approach. Front. Plant Sci., 10, 809. https://doi.org/10.3389/fpls.2019.00809

Zhao, T., Doll, D., & Chen, Y. (2017a). Better almond water stress monitoring using fractional-order moments of non-normalized difference vegetation index, ASABE paper no. 1701593. Proc. 2017 ASABE Annu. Int. Meeting. St. Joseph, MI: ASABE. https://doi.org/10.13031/aim.201701593

Zhao, T., Niu, H., Anderson, A., Chen, Y., & Viers, J. (2018). A detailed study on accuracy of uncooled thermal cameras by exploring the data collection workflow. Proc. Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping III.10664, p. 106640F. International Society for Optics and Photonics. https://doi.org/10.1117/12.2305217

Zhao, T., Wang, Z., Yang, Q., & Chen, Y. (2017b). Melon yield prediction using small unmanned aerial vehicles. Proc. Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping II (p. 1021808). International Society for Optics and Photonics. https://doi.org/10.1117/12.2262412