<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" article-type="research-article" dtd-version="2.3" xml:lang="EN">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Plant Sci.</journal-id>
<journal-title>Frontiers in Plant Science</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Plant Sci.</abbrev-journal-title>
<issn pub-type="epub">1664-462X</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fpls.2023.1217448</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Plant Science</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Improving grain yield prediction through fusion of multi-temporal spectral features and agronomic trait parameters derived from UAV imagery</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name>
<surname>Zhou</surname>
<given-names>Hongkui</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<xref ref-type="author-notes" rid="fn001">
<sup>*</sup>
</xref>
<uri xlink:href="https://loop.frontiersin.org/people/2292552"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Yang</surname>
<given-names>Jianhua</given-names>
</name>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Lou</surname>
<given-names>Weidong</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Sheng</surname>
<given-names>Li</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Li</surname>
<given-names>Dong</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<uri xlink:href="https://loop.frontiersin.org/people/200399"/>
</contrib>
<contrib contrib-type="author" corresp="yes">
<name>
<surname>Hu</surname>
<given-names>Hao</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<xref ref-type="author-notes" rid="fn001">
<sup>*</sup>
</xref>
</contrib>
</contrib-group>
<aff id="aff1">
<sup>1</sup>
<institution>Institute of Digital Agriculture, Zhejiang Academy of Agricultural Sciences</institution>, <addr-line>Hangzhou</addr-line>, <country>China</country>
</aff>
<aff id="aff2">
<sup>2</sup>
<institution>Academy of Eco-civilization Development for Jing-Jin-Ji Megalopolis, Tianjin Normal University</institution>, <addr-line>Tianjin</addr-line>, <country>China</country>
</aff>
<author-notes>
<fn fn-type="edited-by">
<p>Edited by: Xiaohe Gu, Beijing Research Center for Information Technology in Agriculture, China</p>
</fn>
<fn fn-type="edited-by">
<p>Reviewed by: Yahui Guo, Central China Normal University, China; Zhanyou Xu, United States Department of Agriculture, United States; Yuehong Chen, Hohai University, China</p>
</fn>
<fn fn-type="corresp" id="fn001">
<p>*Correspondence: Hongkui Zhou, <email xlink:href="mailto:zhouhongkui@zaas.ac.cn">zhouhongkui@zaas.ac.cn</email>; Hao Hu, <email xlink:href="mailto:huh@zaas.ac.cn">huh@zaas.ac.cn</email>
</p>
</fn>
</author-notes>
<pub-date pub-type="epub">
<day>16</day>
<month>10</month>
<year>2023</year>
</pub-date>
<pub-date pub-type="collection">
<year>2023</year>
</pub-date>
<volume>14</volume>
<elocation-id>1217448</elocation-id>
<history>
<date date-type="received">
<day>05</day>
<month>05</month>
<year>2023</year>
</date>
<date date-type="accepted">
<day>27</day>
<month>09</month>
<year>2023</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#xa9; 2023 Zhou, Yang, Lou, Sheng, Li and Hu</copyright-statement>
<copyright-year>2023</copyright-year>
<copyright-holder>Zhou, Yang, Lou, Sheng, Li and Hu</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/">
<p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p>
</license>
</permissions>
<abstract>
<p>Rapid and accurate prediction of crop yield is particularly important for ensuring national and regional food security and for guiding the formulation of agricultural and rural development plans. Owing to their ultra-high spatial resolution, low cost, and flexibility, unmanned aerial vehicles are widely used in field-scale crop yield prediction. Most existing studies have used the spectral features of crops, especially vegetation or color indices, to predict crop yield. In recent years, agronomic trait parameters have gradually attracted researchers&#x2019; attention for use in yield prediction. In this study, the complementary advantages of multispectral and RGB images were exploited by combining crop spectral features with agronomic trait parameters (i.e., canopy height, coverage, and volume) to predict crop yield, and the effects of the agronomic trait parameters on yield prediction were investigated. The results showed that, compared with yield prediction using spectral features alone, the addition of agronomic trait parameters effectively improved the prediction accuracy. The best feature combination was the canopy height (CH), fractional vegetation cover (FVC), normalized difference red-edge index (NDVI_RE), and enhanced vegetation index (EVI). The yield prediction error was 8.34%, with an R<sup>2</sup> of 0.95. The prediction accuracies were notably greater in the jointing, booting, heading, and early grain-filling stages than in the later growth stages, with the heading stage showing the highest prediction accuracy. Predictions based on features from multiple growth stages were better than those based on a single stage. Yield prediction across different cultivars was weaker than that within a single cultivar. Nevertheless, combining agronomic trait parameters with spectral indices improved the prediction across cultivars to some extent.</p>
</abstract>
<kwd-group>
<kwd>yield prediction</kwd>
<kwd>agronomic trait</kwd>
<kwd>remote sensing</kwd>
<kwd>UAV</kwd>
<kwd>machine learning</kwd>
</kwd-group>
<contract-num rid="cn001">41907394, 41901360</contract-num>
<contract-num rid="cn002">LY21D010004</contract-num>
<contract-sponsor id="cn001">National Natural Science Foundation of China<named-content content-type="fundref-id">10.13039/501100001809</named-content>
</contract-sponsor>
<contract-sponsor id="cn002">Natural Science Foundation of Zhejiang Province<named-content content-type="fundref-id">10.13039/501100004731</named-content>
</contract-sponsor>
<counts>
<fig-count count="9"/>
<table-count count="6"/>
<equation-count count="7"/>
<ref-count count="68"/>
<page-count count="17"/>
<word-count count="8176"/>
</counts>
<custom-meta-wrap>
<custom-meta>
<meta-name>section-in-acceptance</meta-name>
<meta-value>Technical Advances in Plant Science</meta-value>
</custom-meta>
</custom-meta-wrap>
</article-meta>
</front>
<body>
<sec id="s1" sec-type="intro">
<label>1</label>
<title>Introduction</title>
<p>The growing global population has led to a rising demand for food. Meanwhile, intensifying climate change has increased the frequency of natural disasters, posing a serious threat to agricultural production, and climate change has been demonstrated to have a substantial effect on food security (<xref ref-type="bibr" rid="B41">Mora et&#xa0;al., 2018</xref>; <xref ref-type="bibr" rid="B50">Su et&#xa0;al., 2018</xref>; <xref ref-type="bibr" rid="B40">Misiou and Koutsoumanis, 2022</xref>). Comprehensive, timely, and accurate grain yield prediction of major crops is also of great significance for optimizing the structure of the agricultural industry and formulating rural development plans. Therefore, in the context of both ongoing climate change and macro-level policy, rapid and accurate crop yield estimation is essential for ensuring food security and supporting agricultural and rural development.</p>
<p>Traditionally, crop yield prediction has mainly relied on field surveys, which require considerable time, labor, and resources. Current crop yield prediction methods include statistical regression models, crop model simulations, and remote sensing (RS)-based models. The main deficiency of statistical regression models is that their prediction accuracy depends on the crop cultivar, region, and growth period, so the models are not universal (<xref ref-type="bibr" rid="B9">Fang et&#xa0;al., 2011</xref>; <xref ref-type="bibr" rid="B20">Huang et&#xa0;al., 2015</xref>). The main advantage of crop model simulation is that it can mechanistically simulate the entire process of crop growth and biomass accumulation. However, the accuracy of the yield simulation depends on the model structure and the accuracy of the model parameters, and many parameters are required (<xref ref-type="bibr" rid="B1">Asseng et&#xa0;al., 2013</xref>; <xref ref-type="bibr" rid="B7">Dong et&#xa0;al., 2020</xref>). Therefore, it remains challenging to accurately estimate yields at a large scale. RS technology has developed rapidly in recent years, and it has been widely used in crop yield prediction due to its advantages of large coverage area, low cost, and high efficiency (<xref ref-type="bibr" rid="B46">Sagan et&#xa0;al., 2021</xref>).</p>
<p>Many studies have used satellite RS images to predict crop yield and have achieved good estimation accuracy. These studies cover a variety of methods (e.g., statistical regression, machine learning, and data assimilation), various crop types (e.g., rice, wheat, cotton, and potatoes), and different RS data (from low to high resolution, and from multispectral (MS) to hyperspectral (HS) bands) (<xref ref-type="bibr" rid="B31">Lobell et&#xa0;al., 2015</xref>; <xref ref-type="bibr" rid="B27">Lambert et&#xa0;al., 2018</xref>; <xref ref-type="bibr" rid="B64">Yang et&#xa0;al., 2019</xref>; <xref ref-type="bibr" rid="B10">Filippi et&#xa0;al., 2019</xref>; <xref ref-type="bibr" rid="B47">Sakamoto, 2020</xref>; <xref ref-type="bibr" rid="B53">van Klompenburg et&#xa0;al., 2020</xref>; <xref ref-type="bibr" rid="B57">Weiss et&#xa0;al., 2020</xref>; <xref ref-type="bibr" rid="B4">Cao et&#xa0;al., 2021</xref>; <xref ref-type="bibr" rid="B46">Sagan et&#xa0;al., 2021</xref>; <xref ref-type="bibr" rid="B23">Jeong et&#xa0;al., 2022</xref>). With the continuous development of precision agriculture, the requirements for crop yield prediction in terms of spatial resolution and accuracy have increased (<xref ref-type="bibr" rid="B33">Maes and Steppe, 2019</xref>). However, satellite imagery still offers insufficient spatial resolution for small farmland plots and complex terrain. In addition, it is easily affected by cloudy and rainy weather, resulting in poor image continuity. Therefore, owing to their ultra-high spatial resolution and flexibility, unmanned aerial vehicle (UAV) RS platforms have been widely adopted in recent years for many agricultural applications, such as crop yield prediction, field management, crop phenology identification, and chlorophyll estimation (<xref ref-type="bibr" rid="B37">Maresma et&#xa0;al., 2016</xref>; <xref ref-type="bibr" rid="B33">Maes and Steppe, 2019</xref>; <xref ref-type="bibr" rid="B29">Li et&#xa0;al., 2021</xref>; <xref ref-type="bibr" rid="B16">Guo et&#xa0;al., 2022</xref>; <xref ref-type="bibr" rid="B52">Tanabe et&#xa0;al., 2023</xref>).</p>
<p>The main idea of many existing studies is to use digital cameras and MS and/or HS sensors carried by UAVs to obtain or estimate various parameters related to the crop yield and then to apply statistical or machine learning techniques to predict the yield (<xref ref-type="bibr" rid="B53">van Klompenburg et&#xa0;al., 2020</xref>; <xref ref-type="bibr" rid="B46">Sagan et&#xa0;al., 2021</xref>). Nonetheless, the accuracy and robustness of crop yield prediction still need improvement, which can be pursued by (1) optimizing the feature parameter space of the yield prediction and selecting more suitable features; (2) improving the prediction algorithms; and (3) combining other yield prediction methods (e.g., crop model simulations). This study mainly focused on the first approach. A review of the existing literature shows that most studies have used the spectral features of crops, especially vegetation or color indices, to predict crop yields. Vegetation indices exhibit a strong correlation with crop growth and development when canopy coverage is low. However, they are prone to saturation once the plant canopy closes, at which point they become less sensitive to plant growth. In addition, vertical growth information, which is strongly linked to the formation of crop biomass and yield, is difficult for vegetation indices to capture accurately during the middle and later stages of crop growth (<xref ref-type="bibr" rid="B65">Yue and Tian, 2020</xref>). Therefore, in addition to spectral features, it is necessary to enrich the feature space for yield prediction and to select optimal, readily obtainable agronomic RS features that are closely related to yield formation.</p>
<p>Agronomic trait parameters are closely linked with crop growth and yield formation, so they are considered to have great potential for improving yield prediction. Many agronomic trait parameters span all aspects of the crop growth process, and they can also be acquired through RS techniques; the agronomic trait parameters in this study specifically refer to those obtained using RS techniques. An important principle for feature selection is to choose parameters that are related to crop yield yet relatively independent of the spectral features. Many RS-based agronomic biochemical/biophysical parameters (e.g., the chlorophyll content, nitrogen content, and leaf area index) are usually derived from their relationships with vegetation indices and are hence autocorrelated with the spectral features. The fractional vegetation cover (FVC) is a crucial parameter that describes the spatial pattern of vegetation types, and it is closely related to the crop planting density, growth stage, and health status (<xref ref-type="bibr" rid="B11">Gao et&#xa0;al., 2020</xref>). The canopy height (CH) and canopy volume (VOL) can reflect the vertical growth of crops and can characterize the crop structure information (<xref ref-type="bibr" rid="B36">Maimaitijiang et&#xa0;al., 2019</xref>; <xref ref-type="bibr" rid="B67">Zhang et&#xa0;al., 2021</xref>; <xref ref-type="bibr" rid="B49">Shu et&#xa0;al., 2022</xref>). These three indicators are all agronomic structural trait parameters closely related to the yield, and all three can be obtained using a UAV. In addition, compared with spectral or color information, they are relatively independent data sources: the FVC can be calculated using image classification, while the CH and VOL are extracted from the dense photogrammetric point cloud obtained by a UAV equipped with a high-definition camera. Texture is also a frequently used RS feature that can, to a certain extent, provide insight into the spatial variations within the vegetation canopy. The abovementioned metrics have been applied to predict the nitrogen content, crop biomass, and crop yield. Nevertheless, further validation is needed of how best to integrate multi-temporal spectral features with agronomic trait parameters to enhance the accuracy of yield predictions, and the adaptability of the constructed models across different crop cultivars still requires further exploration.</p>
<p>Machine learning has become a key approach for predicting crop yield using UAV-based RS data (<xref ref-type="bibr" rid="B48">Shahhosseini et&#xa0;al., 2020</xref>; <xref ref-type="bibr" rid="B53">van Klompenburg et&#xa0;al., 2020</xref>; <xref ref-type="bibr" rid="B56">Wang et&#xa0;al., 2021</xref>; <xref ref-type="bibr" rid="B62">Xu et&#xa0;al., 2021</xref>). The random forest (RF) is a widely used machine learning algorithm with many advantages (<xref ref-type="bibr" rid="B3">Breiman, 2001</xref>; <xref ref-type="bibr" rid="B30">Li et&#xa0;al., 2020</xref>; <xref ref-type="bibr" rid="B19">He et&#xa0;al., 2021</xref>). Firstly, it is an ensemble learning algorithm that makes predictions by constructing multiple decision trees, each with a degree of independence; as a result, it is robust to noise, outliers, and missing values. Secondly, RF introduces a bootstrap sampling mechanism, which enhances the model&#x2019;s generalization ability while mitigating the risk of overfitting. Furthermore, it is relatively easy to use and does not require extensive hyperparameter tuning. Importantly, RF has been shown to perform well in many studies (<xref ref-type="bibr" rid="B30">Li et&#xa0;al., 2020</xref>; <xref ref-type="bibr" rid="B38">Marques Ramos et&#xa0;al., 2020</xref>; <xref ref-type="bibr" rid="B53">van Klompenburg et&#xa0;al., 2020</xref>; <xref ref-type="bibr" rid="B55">Wan et&#xa0;al., 2020</xref>). Therefore, we used the RF algorithm as the core algorithm and combined it with spectral features, texture features, and agronomic trait parameters derived from UAV images to predict the crop yield. The specific research goals of this study were (1) to predict the crop yield and compare the performances of the spectral, texture, and agronomic trait parameters; (2) to evaluate the impacts of the parameters in the different growth periods on the yield prediction results; and (3) to investigate the robustness of the models across different cultivars and to evaluate whether incorporating agronomic parameters can enhance the predictive capacity of the crop yield model for various cultivars. This study focuses on wheat, aiming to estimate its yield. It should be noted that in this context, &#x2018;yield&#x2019; specifically refers to grain yield rather than biomass yield.</p>
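As a minimal sketch of the RF-based modeling approach described above, the example below trains a random forest regressor on synthetic plot-level data. The feature names (CH, FVC, NDVI_RE, EVI) follow the study, but the data, coefficients, and model settings are illustrative assumptions rather than the study's actual configuration.

```python
# Illustrative RF yield-prediction sketch; all data here are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200
# Hypothetical per-plot features: CH (m), FVC, NDVI_RE, EVI
X = np.column_stack([
    rng.uniform(0.3, 1.0, n),   # canopy height
    rng.uniform(0.2, 1.0, n),   # fractional vegetation cover
    rng.uniform(0.1, 0.9, n),   # red-edge NDVI
    rng.uniform(0.1, 0.8, n),   # enhanced vegetation index
])
# Synthetic grain yield (kg/ha) loosely tied to the features, plus noise
y = 2000 + 3000 * X[:, 0] + 2500 * X[:, 1] + 1500 * X[:, 2] + rng.normal(0, 200, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
```

The fitted model's `feature_importances_` attribute can then be inspected to compare the contribution of the agronomic trait parameters against the spectral indices.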
</sec>
<sec id="s2" sec-type="materials|methods">
<label>2</label>
<title>Materials and methods</title>
<sec id="s2_1">
<label>2.1</label>
<title>Experimental design</title>
<p>The study was conducted at an experimental site in Ningbo City, Zhejiang Province (29&#xb0;18&#x2032;N, 121&#xb0;34&#x2032;E). The study area has a subtropical monsoon climate with clear seasonal variations. The average temperatures in summer and winter are approximately 27&#xb0;C and 6&#xb0;C, respectively, and the annual average temperature is approximately 16&#xb0;C. The average annual rainfall is approximately 1700 mm. Winter wheat, one of the most important crops in the study area, was selected as the research crop. The experimental period was the 2019&#x2013;2020 winter wheat growing season (planting in November 2019 to harvest in May 2020). The experimental design is shown in <xref ref-type="fig" rid="f1">
<bold>Figure&#xa0;1</bold>
</xref>. Two main wheat cultivars (JYM 1 and YM 20) were used. For each cultivar, four nitrogen fertilizer treatments with six replicates were set, i.e., 24 plots per cultivar. There were 48 plots (3 &#xd7; 13.7 m) in the entire experiment, and each plot contained a subplot (1 &#xd7; 1 m). The nitrogen fertilizer treatments were 0 kg/ha (N0), 90 kg/ha (N1), 180 kg/ha (N2), and 270 kg/ha (N3). The application rates of the phosphate and potash fertilizers were the same in each plot: 75 kg/ha of phosphate fertilizer and 120 kg/ha of potash fertilizer. Nitrogen fertilizer was applied twice: 40% of the total amount at sowing and the remaining 60% at the jointing stage. The phosphate and potash fertilizers were applied once at sowing.</p>
<fig id="f1" position="float">
<label>Figure&#xa0;1</label>
<caption>
<p>Location of the study area and experimental design.</p>
</caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fpls-14-1217448-g001.tif"/>
</fig>
</sec>
<sec id="s2_2">
<label>2.2</label>
<title>Data collection</title>
<sec id="s2_2_1">
<label>2.2.1</label>
<title>Collection and processing of UAV images</title>
<p>In this study, two UAVs (Phantom 4 RTK, SZ DJI Technology Co., Ltd., China), one equipped with a red-green-blue (RGB) camera and the other equipped with an MS camera, were employed to capture RGB and MS images during the winter wheat growing season. The basic parameters of the UAV and onboard sensors are described in <xref ref-type="table" rid="T1">
<bold>Table&#xa0;1</bold>
</xref>.</p>
<table-wrap id="T1" position="float">
<label>Table&#xa0;1</label>
<caption>
<p>Parameters of the UAV and onboard RGB and MS sensors.</p>
</caption>
<table frame="hsides">
<thead>
<tr>
<th valign="middle" align="left">Sensor</th>
<th valign="middle" align="left">Band</th>
<th valign="middle" align="left">Spectral range (nm)</th>
<th valign="middle" align="left">Resolution (pixels)</th>
<th valign="middle" align="left">Field of view (&#xb0;)</th>
<th valign="top" align="left">Positioning accuracy (cm)</th>
<th valign="top" align="left">Ground resolution at 100m height (cm)</th>
</tr>
</thead>
<tbody>
<tr>
<td valign="middle" align="left">RGB camera</td>
<td valign="middle" align="left">RGB</td>
<td valign="middle" align="left">/</td>
<td valign="middle" align="left">5472&#xd7;3648</td>
<td valign="middle" align="left">84</td>
<td valign="middle" align="left">Horizontal: 1<break/>Vertical: 1.5</td>
<td valign="top" align="left">2.74</td>
</tr>
<tr>
<td valign="top" rowspan="5" align="left">MS camera</td>
<td valign="top" align="left">Blue (B)</td>
<td valign="top" align="left">450 &#xb1; 16</td>
<td valign="top" rowspan="5" align="left">1600&#xd7;1300</td>
<td valign="top" rowspan="5" align="left">62.7</td>
<td valign="top" rowspan="5" align="left">Horizontal: 1<break/>Vertical: 1.5</td>
<td valign="top" rowspan="5" align="left">5.3</td>
</tr>
<tr>
<td valign="middle" align="left">Green (G)</td>
<td valign="middle" align="left">560 &#xb1; 16</td>
</tr>
<tr>
<td valign="middle" align="left">Red (R)</td>
<td valign="middle" align="left">650 &#xb1; 16</td>
</tr>
<tr>
<td valign="middle" align="left">Red Edge (RE)</td>
<td valign="middle" align="left">730 &#xb1; 16</td>
</tr>
<tr>
<td valign="middle" align="left">Near Infrared (NIR)</td>
<td valign="middle" align="left">840 &#xb1; 26</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>Seven UAV flight missions were conducted during the critical growth stage of the winter wheat. The flight dates and corresponding growth stages are listed in <xref ref-type="table" rid="T2">
<bold>Table&#xa0;2</bold>
</xref>. Under clear weather conditions, the RGB and MS images were collected between 10:00 and 14:00 local time. The flight height of the UAV was 30 m; the forward and side overlap ratios were set to 80% and 70%, respectively.</p>
<table-wrap id="T2" position="float">
<label>Table&#xa0;2</label>
<caption>
<p>Seven UAV flight dates and corresponding wheat growth stages.</p>
</caption>
<table frame="hsides">
<thead>
<tr>
<th valign="top" align="center">Flight date</th>
<th valign="middle" align="center">Growth stage</th>
<th valign="top" align="center">Abbr.</th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="center">Mar 16, 2020</td>
<td valign="middle" align="center">Jointing</td>
<td valign="top" align="center">JS</td>
</tr>
<tr>
<td valign="top" align="center">Mar 26, 2020</td>
<td valign="middle" align="center">Booting</td>
<td valign="top" align="center">BS</td>
</tr>
<tr>
<td valign="top" align="center">Apr 2, 2020</td>
<td valign="middle" align="center">Heading</td>
<td valign="top" align="center">HS</td>
</tr>
<tr>
<td valign="top" align="center">Apr 15, 2020</td>
<td valign="middle" align="center">Initial filling</td>
<td valign="top" align="center">IFS</td>
</tr>
<tr>
<td valign="top" align="center">Apr 24, 2020</td>
<td valign="middle" align="center">Middle filling</td>
<td valign="top" align="center">MFS</td>
</tr>
<tr>
<td valign="top" align="center">Apr 29, 2020</td>
<td valign="middle" align="center">Late filling</td>
<td valign="top" align="center">LFS</td>
</tr>
<tr>
<td valign="top" align="center">May 12, 2020</td>
<td valign="middle" align="center">Maturity</td>
<td valign="top" align="center">MS</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>After the aerial photos of the study area were obtained, they were preprocessed in two major procedures: (1) image mosaicking within a single period and (2) geometric correction between the mosaicked images of different periods. The image mosaicking included the following steps: image registration of each band, vignetting correction, distortion calibration, and radiometric correction. These steps were all performed using the DJI Terra software (SZ DJI Technology Co., Ltd., China) designed for DJI UAVs. For radiometric calibration, three calibration whiteboards with reflectance values of 25%, 50%, and 75% were placed beneath the flight path of the UAV and imaged by the MS sensor. In DJI Terra V3.5.5, the raw image&#x2019;s digital number (DN) values were transformed into surface reflectance using a linear correction method (<xref ref-type="bibr" rid="B60">Xia et&#xa0;al., 2022</xref>). The corrected images were mosaicked into multi-temporal RGB and reflectance images of the study area. All of the mosaicked images from the different periods were then resampled to a resolution of 2 cm, and geometric registration was performed on the resampled images to ensure that the pixel positions corresponded across all periods. This process was completed using the ArcGIS software (Esri, Inc., Redlands, CA, USA).</p>
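The linear DN-to-reflectance correction performed inside DJI Terra can be sketched as an empirical line fit over the three reference panels. Only the 25%/50%/75% reflectance values come from the text; the panel DN values below are hypothetical.

```python
import numpy as np

# Reference panel reflectances (from the text) and hypothetical mean DNs
# extracted from the panel pixels of one band.
panel_reflectance = np.array([0.25, 0.50, 0.75])
panel_dn = np.array([12000.0, 24500.0, 36800.0])

# Least-squares fit of reflectance = gain * DN + offset (empirical line)
gain, offset = np.polyfit(panel_dn, panel_reflectance, 1)

def dn_to_reflectance(dn):
    """Convert raw DN values of this band to surface reflectance."""
    return gain * np.asarray(dn, dtype=float) + offset

# Apply the correction to a small tile of raw DNs
tile = np.array([[15000, 30000], [22000, 8000]])
refl = dn_to_reflectance(tile)
```

In practice one such gain/offset pair would be fitted per spectral band and per flight, since illumination varies between missions.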
</sec>
<sec id="s2_2_2">
<label>2.2.2</label>
<title>Crop yield measurements</title>
<p>After the wheat matured, the 48 plots and 48 subplots were harvested to obtain yield measurements. Manual harvesting was used to reduce measurement error. The harvested wheat was threshed in the laboratory, and the grain moisture content was measured. The wheat yield was calculated as follows:</p>
<disp-formula>
<label>(1)</label>
<mml:math display="block" id="M1">
<mml:mrow>
<mml:msub>
<mml:mi>Y</mml:mi>
<mml:mi>m</mml:mi>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mn>10000</mml:mn>
<mml:mo>&#x2217;</mml:mo>
<mml:mtext>G</mml:mtext>
<mml:mo>&#xf7;</mml:mo>
<mml:mtext>A</mml:mtext>
<mml:mo>&#xd7;</mml:mo>
<mml:mo stretchy="false">(</mml:mo>
<mml:mn>1</mml:mn>
<mml:mo>&#x2212;</mml:mo>
<mml:mtext>C</mml:mtext>
<mml:mo>)</mml:mo>
<mml:mo>&#xf7;</mml:mo>
<mml:mo stretchy="false">(</mml:mo>
<mml:mn>1</mml:mn>
<mml:mo>&#x2212;</mml:mo>
<mml:mn>13</mml:mn>
<mml:mo>%</mml:mo>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:math>
</disp-formula>
<p>where <inline-formula>
<mml:math display="inline" id="im1">
<mml:mrow>
<mml:msub>
<mml:mi>Y</mml:mi>
<mml:mi>m</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> is the wheat yield (kg/ha); G is the weight of the harvested wheat seeds in each plot (kg); A is the plot area (m<sup>2</sup>); C is the grain moisture content (%); and 13% is the wheat standard moisture content (<xref ref-type="bibr" rid="B61">Xin et&#xa0;al., 2008</xref>).</p>
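Equation (1) translates directly into code. A minimal sketch; the example plot values below are illustrative, not measured data from the experiment.

```python
def wheat_yield_kg_per_ha(grain_weight_kg, plot_area_m2, moisture_frac,
                          standard_moisture=0.13):
    """Eq. (1): scale plot grain weight to kg/ha at 13% standard moisture.

    grain_weight_kg: weight G of harvested seeds in the plot (kg)
    plot_area_m2:    plot area A (m^2)
    moisture_frac:   measured grain moisture content C (fraction)
    """
    return (10000.0 * grain_weight_kg / plot_area_m2
            * (1.0 - moisture_frac) / (1.0 - standard_moisture))

# Illustrative example: a 3 x 13.7 m plot (41.1 m^2) yielding 25 kg of
# grain at 20% moisture corresponds to about 5593 kg/ha.
y_m = wheat_yield_kg_per_ha(25.0, 3 * 13.7, 0.20)
```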
</sec>
</sec>
<sec id="s2_3">
<label>2.3</label>
<title>Yield prediction model development</title>
<p>
<xref ref-type="fig" rid="f2">
<bold>Figure&#xa0;2</bold>
</xref> shows the workflow of the development of the crop yield prediction model in this study, comprising three parts: image collection and processing, feature extraction, and model construction and validation. Section 2.2 introduced the image acquisition and preprocessing. This section mainly describes the image feature extraction and model building.</p>
<fig id="f2" position="float">
<label>Figure&#xa0;2</label>
<caption>
<p>Workflow of the development of the yield prediction model.</p>
</caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fpls-14-1217448-g002.tif"/>
</fig>
<sec id="s2_3_1">
<label>2.3.1</label>
<title>Feature extraction</title>
<sec id="s2_3_1_1">
<label>2.3.1.1</label>
<title>Spectral features</title>
<p>The main variables used to represent the spectral features in this study were the original values (i.e., the band reflectance and RGB values) of the UAV MS and RGB images and the vegetation/color indices (<xref ref-type="table" rid="T3">
<bold>Table&#xa0;3</bold>
</xref>) calculated based on the original values.</p>
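The MS vegetation indices listed in Table 3 reduce to simple band arithmetic on the reflectance images. A sketch with illustrative reflectance values follows; the small epsilon guard is an added safety measure for bare-soil pixels, not part of the published formulas.

```python
import numpy as np

EPS = 1e-9  # added guard against division by zero; not in the original formulas

def ndvi(nir, r):
    """Normalized difference vegetation index: (NIR - R) / (NIR + R)."""
    return (nir - r) / (nir + r + EPS)

def gndvi(nir, g):
    """Green NDVI: (NIR - G) / (NIR + G)."""
    return (nir - g) / (nir + g + EPS)

def evi(nir, r, b):
    """Enhanced vegetation index: 2.5 (NIR - R) / (NIR + 6R - 7.5B + 1)."""
    return 2.5 * (nir - r) / (nir + 6.0 * r - 7.5 * b + 1.0)

def evi2(nir, r):
    """Two-band EVI (no blue band): 2.5 (NIR - R) / (NIR + 2.4R + 1)."""
    return 2.5 * (nir - r) / (nir + 2.4 * r + 1.0)

# Illustrative per-pixel reflectance values for a healthy canopy
nir = np.array([0.45, 0.60])
red = np.array([0.08, 0.05])
green = np.array([0.10, 0.09])
blue = np.array([0.05, 0.04])
```

Applied to the whole mosaicked reflectance stack, these functions produce one index map per date, which is then averaged per plot for model input.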
<table-wrap id="T3" position="float">
<label>Table&#xa0;3</label>
<caption>
<p>Summary of the vegetation/color indices used in this study.</p>
</caption>
<table frame="hsides">
<thead>
<tr>
<th valign="middle" align="left">Sensors</th>
<th valign="middle" align="left">Vegetation/color indices</th>
<th valign="middle" align="left">Abbreviations</th>
<th valign="middle" align="left">Equations</th>
<th valign="middle" align="left">References</th>
</tr>
</thead>
<tbody>
<tr>
<td valign="middle" align="left">MS</td>
<td valign="middle" align="left">Normalized difference vegetation index</td>
<td valign="middle" align="left">NDVI</td>
<td valign="middle" align="left">
<inline-formula>
<mml:math display="inline" id="im2">
<mml:mrow>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mtext>NIR</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mtext>R</mml:mtext>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mo stretchy="false">/</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mtext>NIR</mml:mtext>
<mml:mo>+</mml:mo>
<mml:mtext>R</mml:mtext>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>
</td>
<td valign="middle" align="left">
<xref ref-type="bibr" rid="B45">Rouse et&#xa0;al., 1974</xref>
</td>
</tr>
<tr>
<td valign="middle" align="left"/>
<td valign="middle" align="left">Green normalized difference vegetation index</td>
<td valign="middle" align="left">GNDVI</td>
<td valign="middle" align="left">
<inline-formula>
<mml:math display="inline" id="im3">
<mml:mrow>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mtext>NIR</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mtext>G</mml:mtext>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mo stretchy="false">/</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mtext>NIR</mml:mtext>
<mml:mo>+</mml:mo>
<mml:mtext>G</mml:mtext>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>
</td>
<td valign="middle" align="left">
<xref ref-type="bibr" rid="B13">Gitelson et&#xa0;al., 2003</xref>
</td>
</tr>
<tr>
<td valign="middle" align="left"/>
<td valign="middle" align="left">Enhanced vegetation index</td>
<td valign="middle" align="left">EVI</td>
<td valign="middle" align="left">
<inline-formula>
<mml:math display="inline" id="im4">
<mml:mrow>
<mml:mn>2.5</mml:mn>
<mml:mo>&#xd7;</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mtext>NIR</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mtext>R</mml:mtext>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mo stretchy="false">/</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mtext>NIR</mml:mtext>
<mml:mo>+</mml:mo>
<mml:mn>6</mml:mn>
<mml:mo>&#xd7;</mml:mo>
<mml:mtext>R</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mn>7.5</mml:mn>
<mml:mo>&#xd7;</mml:mo>
<mml:mtext>B</mml:mtext>
<mml:mo>+</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>
</td>
<td valign="middle" align="left">
<xref ref-type="bibr" rid="B22">Huete et&#xa0;al., 2002</xref>
</td>
</tr>
<tr>
<td valign="middle" align="left"/>
<td valign="middle" align="left">Enhanced vegetation index without a blue band</td>
<td valign="middle" align="left">EVI2</td>
<td valign="middle" align="left">
<inline-formula>
<mml:math display="inline" id="im5">
<mml:mrow>
<mml:mn>2.5</mml:mn>
<mml:mo>&#xd7;</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mtext>NIR</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mtext>R</mml:mtext>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mo stretchy="false">/</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mtext>NIR</mml:mtext>
<mml:mo>+</mml:mo>
<mml:mn>2.4</mml:mn>
<mml:mo>&#xd7;</mml:mo>
<mml:mtext>R</mml:mtext>
<mml:mo>+</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>
</td>
<td valign="middle" align="left">
<xref ref-type="bibr" rid="B24">Jiang et&#xa0;al., 2008</xref>
</td>
</tr>
<tr>
<td valign="middle" align="left"/>
<td valign="middle" align="left">Modified triangular vegetation index 2</td>
<td valign="middle" align="left">MTVI2</td>
<td valign="middle" align="left">
<inline-formula>
<mml:math display="inline" id="im6">
<mml:mrow>
<mml:mn>1.5</mml:mn>
<mml:mo>&#xd7;</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">[</mml:mo>
<mml:mrow>
<mml:mn>1.2</mml:mn>
<mml:mo>&#xd7;</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mtext>NIR</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mtext>G</mml:mtext>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mo>&#x2212;</mml:mo>
<mml:mn>2.5</mml:mn>
<mml:mo>&#xd7;</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mtext>R</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mtext>G</mml:mtext>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo stretchy="false">]</mml:mo>
</mml:mrow>
<mml:mo stretchy="false">/</mml:mo>
<mml:msqrt>
<mml:mrow>
<mml:msup>
<mml:mrow>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mn>2</mml:mn>
<mml:mo>&#xd7;</mml:mo>
<mml:mtext>NIR</mml:mtext>
<mml:mo>+</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
<mml:mo>&#x2212;</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mn>6</mml:mn>
<mml:mo>&#xd7;</mml:mo>
<mml:mtext>NIR</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mn>5</mml:mn>
<mml:mo>&#xd7;</mml:mo>
<mml:msqrt>
<mml:mi>R</mml:mi>
</mml:msqrt>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mo>&#x2212;</mml:mo>
<mml:mn>0.5</mml:mn>
</mml:mrow>
</mml:msqrt>
</mml:mrow>
</mml:math>
</inline-formula>
</td>
<td valign="middle" align="left">
<xref ref-type="bibr" rid="B17">Haboudane, 2004</xref>
</td>
</tr>
<tr>
<td valign="middle" align="left"/>
<td valign="middle" align="left">Soil-adjusted vegetation index</td>
<td valign="middle" align="left">SAVI</td>
<td valign="middle" align="left">
<inline-formula>
<mml:math display="inline" id="im7">
<mml:mrow>
<mml:mn>1.5</mml:mn>
<mml:mo>&#xd7;</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mtext>NIR</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mtext>R</mml:mtext>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mo stretchy="false">/</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mtext>NIR</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mtext>R</mml:mtext>
<mml:mo>+</mml:mo>
<mml:mn>0.5</mml:mn>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>
</td>
<td valign="middle" align="left">
<xref ref-type="bibr" rid="B21">Huete, 1988</xref>
</td>
</tr>
<tr>
<td valign="middle" align="left"/>
<td valign="middle" align="left">Normalized difference red-edge index</td>
<td valign="middle" align="left">NDVI_RE</td>
<td valign="middle" align="left">
<inline-formula>
<mml:math display="inline" id="im8">
<mml:mrow>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mtext>NIR</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mtext>RE</mml:mtext>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mo stretchy="false">/</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mtext>NIR</mml:mtext>
<mml:mo>+</mml:mo>
<mml:mtext>RE</mml:mtext>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>
</td>
<td valign="middle" align="left">
<xref ref-type="bibr" rid="B15">Gitelson and Merzlyak, 1994</xref>
</td>
</tr>
<tr>
<td valign="middle" align="left"/>
<td valign="middle" align="left">Modified simple ratio red-edge index</td>
<td valign="middle" align="left">MSR_RE</td>
<td valign="middle" align="left">
<inline-formula>
<mml:math display="inline" id="im9">
<mml:mrow>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mtext>NIR</mml:mtext>
<mml:mo stretchy="false">/</mml:mo>
<mml:mtext>RE</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mo stretchy="false">/</mml:mo>
<mml:msqrt>
<mml:mrow>
<mml:mtext>NIR</mml:mtext>
<mml:mo stretchy="false">/</mml:mo>
<mml:mtext>RE</mml:mtext>
<mml:mo>+</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msqrt>
</mml:mrow>
</mml:math>
</inline-formula>
</td>
<td valign="middle" align="left">
<xref ref-type="bibr" rid="B59">Wu et&#xa0;al., 2008</xref>
</td>
</tr>
<tr>
<td valign="middle" align="left"/>
<td valign="middle" align="left">Red-edge chlorophyll index</td>
<td valign="middle" align="left">CI_RE</td>
<td valign="middle" align="left">
<inline-formula>
<mml:math display="inline" id="im10">
<mml:mrow>
<mml:mtext>NIR</mml:mtext>
<mml:mo stretchy="false">/</mml:mo>
<mml:mtext>RE</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:math>
</inline-formula>
</td>
<td valign="middle" align="left">
<xref ref-type="bibr" rid="B13">Gitelson et&#xa0;al., 2003</xref>
</td>
</tr>
<tr>
<td valign="middle" align="left">RGB</td>
<td valign="middle" align="left">Normalized difference index</td>
<td valign="middle" align="left">NDI</td>
<td valign="middle" align="left">
<inline-formula>
<mml:math display="inline" id="im11">
<mml:mrow>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mtext>g</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mtext>r</mml:mtext>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mo stretchy="false">/</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mtext>g</mml:mtext>
<mml:mo>+</mml:mo>
<mml:mtext>r</mml:mtext>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>
</td>
<td valign="middle" align="left">
<xref ref-type="bibr" rid="B58">Woebbecke et&#xa0;al., 1995</xref>
</td>
</tr>
<tr>
<td valign="middle" align="left"/>
<td valign="middle" align="left">Excess green index</td>
<td valign="middle" align="left">ExG</td>
<td valign="middle" align="left">
<inline-formula>
<mml:math display="inline" id="im12">
<mml:mrow>
<mml:mn>2</mml:mn>
<mml:mo>&#xd7;</mml:mo>
<mml:mtext>g</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mtext>r</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mtext>b</mml:mtext>
</mml:mrow>
</mml:math>
</inline-formula>
</td>
<td valign="middle" align="left">
<xref ref-type="bibr" rid="B58">Woebbecke et&#xa0;al., 1995</xref>
</td>
</tr>
<tr>
<td valign="middle" align="left"/>
<td valign="middle" align="left">Excess red index</td>
<td valign="middle" align="left">ExR</td>
<td valign="middle" align="left">
<inline-formula>
<mml:math display="inline" id="im13">
<mml:mrow>
<mml:mn>1.4</mml:mn>
<mml:mo>&#xd7;</mml:mo>
<mml:mtext>r</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mtext>g</mml:mtext>
</mml:mrow>
</mml:math>
</inline-formula>
</td>
<td valign="middle" align="left">
<xref ref-type="bibr" rid="B39">Meyer and Neto, 2008</xref>
</td>
</tr>
<tr>
<td valign="middle" align="left"/>
<td valign="middle" align="left">Excess green minus excess red index</td>
<td valign="middle" align="left">ExGR</td>
<td valign="middle" align="left">
<inline-formula>
<mml:math display="inline" id="im14">
<mml:mrow>
<mml:mn>3</mml:mn>
<mml:mo>&#xd7;</mml:mo>
<mml:mtext>g</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mn>2.4</mml:mn>
<mml:mo>&#xd7;</mml:mo>
<mml:mtext>r</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mtext>b</mml:mtext>
</mml:mrow>
</mml:math>
</inline-formula>
</td>
<td valign="middle" align="left">
<xref ref-type="bibr" rid="B39">Meyer and Neto, 2008</xref>
</td>
</tr>
<tr>
<td valign="middle" align="left"/>
<td valign="middle" align="left">Visible atmospherically resistant index</td>
<td valign="middle" align="left">VARI</td>
<td valign="middle" align="left">
<inline-formula>
<mml:math display="inline" id="im15">
<mml:mrow>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mtext>g</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mtext>r</mml:mtext>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mo stretchy="false">/</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mtext>g</mml:mtext>
<mml:mo>+</mml:mo>
<mml:mtext>r</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mtext>b</mml:mtext>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>
</td>
<td valign="middle" align="left">
<xref ref-type="bibr" rid="B14">Gitelson et&#xa0;al., 2002</xref>
</td>
</tr>
<tr>
<td valign="middle" align="left"/>
<td valign="middle" align="left">Green leaf index</td>
<td valign="middle" align="left">GLI</td>
<td valign="middle" align="left">
<inline-formula>
<mml:math display="inline" id="im16">
<mml:mrow>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mn>2</mml:mn>
<mml:mo>&#xd7;</mml:mo>
<mml:mtext>g</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mtext>b</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mtext>r</mml:mtext>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mo stretchy="false">/</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mn>2</mml:mn>
<mml:mo>&#xd7;</mml:mo>
<mml:mtext>g</mml:mtext>
<mml:mo>+</mml:mo>
<mml:mtext>b</mml:mtext>
<mml:mo>+</mml:mo>
<mml:mtext>r</mml:mtext>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>
</td>
<td valign="middle" align="left">
<xref ref-type="bibr" rid="B32">Louhaichi et&#xa0;al., 2001</xref>
</td>
</tr>
<tr>
<td valign="middle" align="left"/>
<td valign="middle" align="left">Normalized difference yellowness index</td>
<td valign="middle" align="left">NDYI</td>
<td valign="middle" align="left">
<inline-formula>
<mml:math display="inline" id="im17">
<mml:mrow>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mtext>g</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mtext>b</mml:mtext>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mo stretchy="false">/</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mtext>g</mml:mtext>
<mml:mo>+</mml:mo>
<mml:mtext>b</mml:mtext>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>
</td>
<td valign="middle" align="left">
<xref ref-type="bibr" rid="B51">Sulik and Long, 2016</xref>
</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<fn>
<p>R, G, B, NIR, and RE denote the reflectance in the red, green, blue, near-infrared, and red-edge bands of the MS images, respectively; and r, g, and b are the normalized DNs of the red, green, and blue channels of the RGB images, respectively.</p>
</fn>
</table-wrap-foot>
</table-wrap>
</sec>
<sec id="s2_3_1_2">
<label>2.3.1.2</label>
<title>Image textures</title>
<p>The gray-level co-occurrence matrix (GLCM) is a widely adopted method for calculating image texture features, and it was used to represent the image texture in this study. Eight GLCM features were computed: the mean (MEA), variance (VAR), homogeneity (HOM), contrast (CON), dissimilarity (DIS), entropy (ENT), second moment (SEM), and correlation (COR). The specific calculation methods are described by <xref ref-type="bibr" rid="B18">Haralick et&#xa0;al. (1973)</xref>. In this study, a 3&#xd7;3 moving window with a co-occurrence shift of 1 pixel was used for the texture calculations. The ENVI software (L3Harris Technologies, Inc., Boulder, CO, USA) was used to calculate the GLCM features for the seven temporal MS images, generating a total of 280 texture features.</p>
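As a rough illustration of the texture calculation, the sketch below builds a normalized GLCM for a single window with a 1-pixel shift and derives the eight features named above. The quantization to a small number of gray levels and the per-window handling are simplifying assumptions for clarity; this is not the ENVI implementation.

```python
import numpy as np

def glcm_features(patch, levels=8, shift=(0, 1)):
    """Normalized GLCM for one window plus the eight Haralick-style features.
    `patch` is a 2-D integer array already quantized to `levels` gray levels;
    `shift` is the co-occurrence offset (1 pixel to the right by default)."""
    dr, dc = shift
    glcm = np.zeros((levels, levels), dtype=float)
    rows, cols = patch.shape
    for r in range(rows - dr):
        for c in range(cols - dc):
            glcm[patch[r, c], patch[r + dr, c + dc]] += 1
    p = glcm / glcm.sum()                      # joint probabilities
    i, j = np.indices(p.shape)
    mea = (i * p).sum()                        # MEA (mean)
    var = ((i - mea) ** 2 * p).sum()           # VAR (variance)
    hom = (p / (1 + (i - j) ** 2)).sum()       # HOM (homogeneity)
    con = ((i - j) ** 2 * p).sum()             # CON (contrast)
    dis = (np.abs(i - j) * p).sum()            # DIS (dissimilarity)
    nz = p[p > 0]
    ent = -(nz * np.log2(nz)).sum()            # ENT (entropy)
    sem = (p ** 2).sum()                       # SEM (second moment)
    mu_i, mu_j = (i * p).sum(), (j * p).sum()
    sd_i = np.sqrt(((i - mu_i) ** 2 * p).sum())
    sd_j = np.sqrt(((j - mu_j) ** 2 * p).sum())
    cor = ((i - mu_i) * (j - mu_j) * p).sum() / (sd_i * sd_j)  # COR
    return dict(MEA=mea, VAR=var, HOM=hom, CON=con, DIS=dis,
                ENT=ent, SEM=sem, COR=cor)
```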
</sec>
<sec id="s2_3_1_3">
<label>2.3.1.3</label>
<title>Agronomic traits</title>
<p>Many parameters characterize the growth and development of crops, including biochemical, biophysical, and structural parameters. In this study, three agronomic traits that can be readily and independently derived from RS data were selected for use in the crop yield prediction.</p>
<sec id="s2_3_1_3_1">
<label>2.3.1.3.1</label>
<title>Canopy height</title>
<p>A digital surface model (DSM) can be obtained using the photogrammetric 3-D point clouds from the UAV RGB images (<xref ref-type="bibr" rid="B5">Colomina and Molina, 2014</xref>; <xref ref-type="bibr" rid="B34">Maimaitijiang et&#xa0;al., 2017</xref>). Therefore, a DSM of the crop canopy was generated from the UAV RGB images during the crop growth and development stages. Similarly, a digital elevation model (DEM) of the bare soil surfaces in the study area was obtained from the UAV flight before wheat germination. The DEM was subtracted from the canopy DSM to obtain the wheat CH [Eq. (2)].</p>
<disp-formula>
<label>(2)</label>
<mml:math display="block" id="M2">
<mml:mrow>
<mml:mtext>CH</mml:mtext>
<mml:mo>=</mml:mo>
<mml:mtext>DSM</mml:mtext>
<mml:mo>&#x2212;</mml:mo>
<mml:mtext>DEM</mml:mtext>
</mml:mrow>
</mml:math>
</disp-formula>
<p>The specific procedure was as follows. First, the DEM and the canopy DSMs for the different periods were generated using the DJI Terra software. Second, the DSM and DEM were checked to ensure that they had the same resolution and that their pixels were co-registered. Finally, the CH was calculated pixel by pixel using Eq. (2).</p>
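Eq. (2) amounts to a per-pixel raster subtraction. A minimal sketch, assuming co-registered numpy grids; the clipping of small negative residuals to zero is our assumption for handling sensor noise over bare soil, not a step stated in the text:

```python
import numpy as np

def canopy_height(dsm, dem, min_height=0.0):
    """Eq. (2): per-pixel canopy height as the canopy DSM minus the
    bare-soil DEM. Both grids must share resolution and alignment."""
    dsm = np.asarray(dsm, dtype=float)
    dem = np.asarray(dem, dtype=float)
    if dsm.shape != dem.shape:
        raise ValueError("DSM and DEM must be co-registered (same shape)")
    # Clip small negative residuals (noise over bare soil) to min_height.
    return np.maximum(dsm - dem, min_height)
```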
</sec>
<sec id="s2_3_1_3_2">
<label>2.3.1.3.2</label>
<title>Fractional vegetation cover</title>
<p>The FVC is a crucial parameter that describes the spatial pattern of vegetation and can serve as an indicator for monitoring vegetation health (<xref ref-type="bibr" rid="B63">Yan et&#xa0;al., 2019</xref>; <xref ref-type="bibr" rid="B11">Gao et&#xa0;al., 2020</xref>). Many RS methods currently exist for estimating the FVC (<xref ref-type="bibr" rid="B11">Gao et&#xa0;al., 2020</xref>). In this study, supervised classification was used to separate the soil and crop information in the UAV MS images; specifically, the support vector machine (SVM) classifier was selected to identify the crop pixels, as previous studies have shown that the SVM achieves a high classification accuracy with relatively limited samples (<xref ref-type="bibr" rid="B42">Mountrakis et&#xa0;al., 2011</xref>; <xref ref-type="bibr" rid="B35">Maimaitijiang et&#xa0;al., 2020</xref>; <xref ref-type="bibr" rid="B55">Wan et&#xa0;al., 2020</xref>). Subsequently, the FVC was calculated using Eq. (3).</p>
<disp-formula>
<label>(3)</label>
<mml:math display="block" id="M3">
<mml:mrow>
<mml:mtext>FVC</mml:mtext>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mtext>c</mml:mtext>
<mml:mtext>n</mml:mtext>
</mml:mfrac>
<mml:mo>&#xd7;</mml:mo>
<mml:mn>100</mml:mn>
</mml:mrow>
</mml:math>
</disp-formula>
<p>where c is the number of crop pixels in the plot, and n is the total number of pixels in the plot.</p>
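Given the classification result, Eq. (3) reduces to a pixel count. A minimal sketch, where the boolean crop mask is a hypothetical stand-in for the SVM output (the classification itself is not shown):

```python
import numpy as np

def fvc_percent(crop_mask):
    """Eq. (3): fractional vegetation cover of a plot, in percent.
    `crop_mask` is a boolean array (True = crop pixel) covering the plot."""
    crop_mask = np.asarray(crop_mask, dtype=bool)
    return crop_mask.sum() / crop_mask.size * 100.0
```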
</sec>
<sec id="s2_3_1_3_3">
<label>2.3.1.3.3</label>
<title>Canopy volume</title>
<p>The canopy volume (VOL) reflects the three-dimensional structure of the crops during the growth and development stages. Existing studies have used it for crop biomass estimation (<xref ref-type="bibr" rid="B54">Walter et&#xa0;al., 2018</xref>; <xref ref-type="bibr" rid="B36">Maimaitijiang et&#xa0;al., 2019</xref>) with good results. In this study, we used the VOL as one of the features for crop yield estimation. The VOL is calculated as follows:</p>
<disp-formula>
<label>(4)</label>
<mml:math display="block" id="M4">
<mml:mrow>
<mml:mtext>VOL</mml:mtext>
<mml:mo>=</mml:mo>
<mml:mstyle displaystyle="true">
<mml:munderover>
<mml:mo>&#x2211;</mml:mo>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mi>c</mml:mi>
</mml:munderover></mml:mstyle>
<mml:mo stretchy="false">(</mml:mo>
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>&#xd7;</mml:mo>
<mml:mi>C</mml:mi>
<mml:msub>
<mml:mi>H</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:math>
</disp-formula>
<p>where VOL is the canopy volume; c is the number of crop pixels in the plot; <inline-formula>
<mml:math display="inline" id="im18">
<mml:mrow>
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> is the area of the pixel <italic>i</italic>; and <inline-formula>
<mml:math display="inline" id="im19">
<mml:mrow>
<mml:mi>C</mml:mi>
<mml:msub>
<mml:mi>H</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> is the crop height in pixel <italic>i</italic>.</p>
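Eq. (4) sums pixel area times canopy height over the crop pixels of a plot. A minimal sketch under the same assumptions as above (co-registered rasters, a hypothetical boolean crop mask, and a uniform per-pixel area):

```python
import numpy as np

def canopy_volume(ch, crop_mask, pixel_area):
    """Eq. (4): canopy volume as sum_i (A_i * CH_i) over crop pixels.
    `ch` is the canopy-height raster from Eq. (2), `crop_mask` flags crop
    pixels, and `pixel_area` is the ground area of one pixel (e.g., m^2)."""
    ch = np.asarray(ch, dtype=float)
    crop_mask = np.asarray(crop_mask, dtype=bool)
    return float((pixel_area * ch[crop_mask]).sum())
```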
</sec>
</sec>
</sec>
<sec id="s2_3_2">
<label>2.3.2</label>
<title>Yield prediction model</title>
<p>The RF algorithm (<xref ref-type="bibr" rid="B3">Breiman, 2001</xref>) was used to construct the models for wheat yield prediction. The RF is an ensemble learning algorithm that uses bootstrap sampling to build a large number of independent decision trees for classification and regression tasks. The RF is insensitive to collinearity between variables, effectively reduces overfitting, and has been proven to perform well in many applications (e.g., estimation of crop parameters, biomass, and yield, and image classification) (<xref ref-type="bibr" rid="B30">Li et&#xa0;al., 2020</xref>; <xref ref-type="bibr" rid="B55">Wan et&#xa0;al., 2020</xref>; <xref ref-type="bibr" rid="B19">He et&#xa0;al., 2021</xref>). In this study, the number of decision trees, ntree, was set to 500, and the remaining RF parameters were left at their default values. There were a total of 96 plot samples (including subplots), of which 2/3 were selected for model training, while the remaining 1/3 were independently employed for model testing.</p>
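A minimal sketch of the described setup using scikit-learn's RandomForestRegressor, assuming that library is available. The synthetic feature matrix, coefficients, and random seeds are placeholders; the study's actual features (reflectance, indices, textures, agronomic traits) are not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in data: 96 plot samples x 5 remote-sensing features.
X = rng.normal(size=(96, 5))
y = X @ np.array([1.0, 0.5, -0.3, 0.2, 0.1]) + rng.normal(scale=0.1, size=96)

# 2/3 of the plots for training, 1/3 held out for testing, as in the text.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=1 / 3, random_state=42)

# ntree = 500; all other hyperparameters left at their defaults.
rf = RandomForestRegressor(n_estimators=500, random_state=42)
rf.fit(X_train, y_train)
y_pred = rf.predict(X_test)
```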
</sec>
</sec>
<sec id="s2_4">
<label>2.4</label>
<title>Evaluation metrics</title>
<p>The evaluation metrics included Pearson&#x2019;s correlation coefficient (<italic>R</italic>), coefficient of determination (<italic>R<sup>2</sup>
</italic>), root mean square error (<italic>RMSE</italic>), and relative root mean square error (<italic>RRMSE</italic>). The <italic>R</italic> value was used to analyze the relationship between each feature and the crop yield, and the <italic>R<sup>2</sup>
</italic>, <italic>RMSE</italic>, and <italic>RRMSE</italic> values were used to measure the accuracy and error of the yield prediction model. The calculation formulas of the statistical analysis indicators are as follows:</p>
<disp-formula>
<label>(5)</label>
<mml:math display="block" id="M5">
<mml:mrow>
<mml:mi>R</mml:mi>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:msubsup>
<mml:mo>&#x2211;</mml:mo>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mi>n</mml:mi>
</mml:msubsup>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>x</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>&#x2212;</mml:mo>
<mml:mover accent="true">
<mml:mi>x</mml:mi>
<mml:mo>&#xaf;</mml:mo>
</mml:mover>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>y</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>&#x2212;</mml:mo>
<mml:mover accent="true">
<mml:mi>y</mml:mi>
<mml:mo>&#xaf;</mml:mo>
</mml:mover>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:msqrt>
<mml:mrow>
<mml:msubsup>
<mml:mo>&#x2211;</mml:mo>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mi>n</mml:mi>
</mml:msubsup>
<mml:msup>
<mml:mrow>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>x</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>&#x2212;</mml:mo>
<mml:mover accent="true">
<mml:mi>x</mml:mi>
<mml:mo>&#xaf;</mml:mo>
</mml:mover>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
<mml:msubsup>
<mml:mo>&#x2211;</mml:mo>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mi>n</mml:mi>
</mml:msubsup>
<mml:msup>
<mml:mrow>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>y</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>&#x2212;</mml:mo>
<mml:mover accent="true">
<mml:mi>y</mml:mi>
<mml:mo>&#xaf;</mml:mo>
</mml:mover>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:msqrt>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
</mml:math>
</disp-formula>
<disp-formula>
<label>(6)</label>
<mml:math display="block" id="M6">
<mml:mrow>
<mml:mi>R</mml:mi>
<mml:mi>M</mml:mi>
<mml:mi>S</mml:mi>
<mml:mi>E</mml:mi>
<mml:mo>=</mml:mo>
<mml:msqrt>
<mml:mrow>
<mml:mfrac>
<mml:mn>1</mml:mn>
<mml:mtext>n</mml:mtext>
</mml:mfrac>
<mml:mstyle displaystyle="true">
<mml:munderover>
<mml:mo>&#x2211;</mml:mo>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mtext>n</mml:mtext>
</mml:munderover></mml:mstyle>
<mml:msup>
<mml:mrow>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>x</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>&#x2212;</mml:mo>
<mml:msub>
<mml:mi>y</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:msqrt>
</mml:mrow>
</mml:math>
</disp-formula>
<disp-formula>
<label>(7)</label>
<mml:math display="block" id="M7">
<mml:mrow>
<mml:mi>R</mml:mi>
<mml:mi>R</mml:mi>
<mml:mi>M</mml:mi>
<mml:mi>S</mml:mi>
<mml:mi>E</mml:mi>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi>R</mml:mi>
<mml:mi>M</mml:mi>
<mml:mi>S</mml:mi>
<mml:mi>E</mml:mi>
</mml:mrow>
<mml:mover accent="true">
<mml:mi>x</mml:mi>
<mml:mo>&#xaf;</mml:mo>
</mml:mover>
</mml:mfrac>
<mml:mo>&#xd7;</mml:mo>
<mml:mn>100</mml:mn>
<mml:mo>%</mml:mo>
</mml:mrow>
</mml:math>
</disp-formula>
<p>where x and y are the observed and predicted variables, respectively; <inline-formula>
<mml:math display="inline" id="im20">
<mml:mover accent="true">
<mml:mi>x</mml:mi>
<mml:mo>&#xaf;</mml:mo>
</mml:mover>
</mml:math>
</inline-formula> and <inline-formula>
<mml:math display="inline" id="im21">
<mml:mover accent="true">
<mml:mi>y</mml:mi>
<mml:mo>&#xaf;</mml:mo>
</mml:mover>
</mml:math>
</inline-formula> are the average values; and n is the number of observations.</p>
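Eqs. (5)-(7) can be sketched in a few lines of numpy. The function name is ours; R<sup>2</sup> is not computed here, since for the model evaluation it is obtained from the regression fit rather than from these formulas.

```python
import numpy as np

def r_rmse_rrmse(x, y):
    """Eqs. (5)-(7): Pearson's R between observed x and predicted y,
    the RMSE, and the RRMSE (RMSE relative to the observed mean, in %)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    r = ((x - x.mean()) * (y - y.mean())).sum() / np.sqrt(
        ((x - x.mean()) ** 2).sum() * ((y - y.mean()) ** 2).sum())
    rmse = np.sqrt(((x - y) ** 2).mean())
    rrmse = rmse / x.mean() * 100.0
    return r, rmse, rrmse
```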
</sec>
</sec>
<sec id="s3" sec-type="results">
<label>3</label>
<title>Results</title>
<sec id="s3_1">
<label>3.1</label>
<title>Correlations between model features and crop yield</title>
<p>Correlation analysis was conducted to investigate the relationships between the model feature parameters and the crop yield so as to better screen the optimal features for crop yield prediction. <xref ref-type="fig" rid="f3">
<bold>Figures&#xa0;3</bold>
</xref> and <xref ref-type="fig" rid="f4">
<bold>4</bold>
</xref> show the correlations between the four categories of features (reflectance, vegetation/color indices, agronomic trait parameters, and textures) and the crop yield, as well as the average correlation coefficients during the different growth periods. In general, the agronomic traits have the strongest correlations with the crop yield, followed by the vegetation/color indices and reflectance, while the texture features exhibit relatively weak correlations. The agronomic trait parameters (FVC, CH, and VOL) correlate well with the crop yield during each growth stage: they all pass the 0.01 significance level test, and their average correlation coefficients are 0.77, 0.85, and 0.82, respectively (<xref ref-type="fig" rid="f4">
<bold>Figure&#xa0;4</bold>
</xref>). Among the vegetation indices, the red-edge vegetation indices (REVIs) correlate better with the crop yield, with correlations of &gt; 0.9 in the jointing, booting, and heading stages. Among the color indices, the NDYI performs best, while the relationships between the other color indices and the crop yield are weaker. Among the texture features, most exhibit weak correlations, except for those of the red band.</p>
<fig id="f3" position="float">
<label>Figure&#xa0;3</label>
<caption>
<p>Correlations between the various features (i.e., reflectance, vegetation/color indices, agronomic trait parameters, and textures) and the crop yield. Red font indicates that the correlation is significant at the 0.01 level.</p>
</caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fpls-14-1217448-g003.tif"/>
</fig>
<fig id="f4" position="float">
<label>Figure&#xa0;4</label>
<caption>
<p>The average values of the correlation coefficients between the yield and remote sensing features in the different growth stages.</p>
</caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fpls-14-1217448-g004.tif"/>
</fig>
</sec>
<sec id="s3_2">
<label>3.2</label>
<title>Yield prediction using a single feature</title>
<p>An RF-based yield estimation model was constructed using a single feature, and the yield was predicted using the feature parameters in the different growth stages and during the entire growth period. <xref ref-type="fig" rid="f5">
<bold>Figure&#xa0;5</bold>
</xref> shows the error (<italic>RRMSE</italic>) of the yield prediction results. The yield prediction accuracy varies considerably with both the growth stage and the feature category (reflectance, vegetation indices, textures, and agronomic trait parameters). Specifically, using the features of the entire growth period leads to significantly smaller yield errors than using the features of a single growth stage: the former produces prediction errors of 10&#x2013;30.4%, with an average of 18.7%, whereas the latter produces errors of 11.6&#x2013;46.4%, with an average of 30.1%.</p>
<fig id="f5" position="float">
<label>Figure&#xa0;5</label>
<caption>
<p>The <italic>RRMSEs</italic> (%) of the yield predicted using the remote sensing features of the different growth stages.</p>
</caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fpls-14-1217448-g005.tif"/>
</fig>
<p>In addition, the performances of the different categories of feature variables in the yield prediction were compared. <xref ref-type="fig" rid="f6">
<bold>Figure&#xa0;6</bold>
</xref> presents box plots of the yield prediction errors for the feature variables of each category (reflectance, vegetation indices, textures, and agronomic trait parameters). Consistent with the correlation analysis results, the average yield prediction error is smallest for the agronomic trait parameters, followed by the vegetation indices and reflectance, and largest for the texture features. Overall, the agronomic trait parameters perform best in the yield prediction, and the smallest error is obtained using the canopy height (CH) for the entire growth period, with an <italic>RRMSE</italic> of 10%.</p>
<fig id="f6" position="float">
<label>Figure&#xa0;6</label>
<caption>
<p>Box plots of the errors of the predicted yield obtained using the different categories of feature parameters.</p>
</caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fpls-14-1217448-g006.tif"/>
</fig>
</sec>
<sec id="s3_3">
<label>3.3</label>
<title>Yield prediction using combinations of multiple features</title>
<p>Sections 3.1 and 3.2 showed that the different categories of feature parameters differ in their ability to predict the crop yield, with the agronomic trait parameters and vegetation/color indices performing better. Therefore, multiple agronomic trait parameters and vegetation/color indices were integrated to determine the best combination of features for yield prediction. To compare vegetation indices with different construction principles, they were subdivided into the commonly used vegetation indices based on the near-infrared and visible bands (ComVIs), the red-edge vegetation indices (REVIs), and the color indices (CIs). <xref ref-type="table" rid="T4">
<bold>Table&#xa0;4</bold>
</xref> shows the error statistics of the optimal yield prediction results for different feature combinations using all of the growth stage data.</p>
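The combination counts reported for each feature set follow from enumerating every non-empty subset of the candidate features (2<sup>k</sup> &#x2212; 1 combinations for k candidates, e.g., 63 for the six ComVIs). A small sketch of that enumeration:

```python
from itertools import combinations

def nonempty_subsets(features):
    """Yield every non-empty combination of candidate features; for k
    candidates there are 2**k - 1 such combinations."""
    for size in range(1, len(features) + 1):
        for combo in combinations(features, size):
            yield combo

com_vis = ["NDVI", "GNDVI", "EVI", "EVI2", "MTVI2", "SAVI"]
n_combos = sum(1 for _ in nonempty_subsets(com_vis))  # 63 combinations
```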
<table-wrap id="T4" position="float">
<label>Table&#xa0;4</label>
<caption>
<p>The error statistics of the yield prediction results based on various feature combinations.</p>
</caption>
<table frame="hsides">
<thead>
<tr>
<th valign="top" align="left">Types</th>
<th valign="top" align="left">Feature variables</th>
<th valign="top" align="left">Number of variables</th>
<th valign="top" align="left">Number of combinations</th>
<th valign="top" align="left">Best combination</th>
<th valign="top" align="left">
<italic>RRMSE</italic> (%)</th>
<th valign="top" align="left">
<italic>R<sup>2</sup>
</italic>
</th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">ComVIs</td>
<td valign="top" align="left">NDVI, GNDVI, EVI, EVI2, MTVI2, SAVI</td>
<td valign="top" align="left">6</td>
<td valign="top" align="left">63</td>
<td valign="top" align="left">GNDVI, SAVI</td>
<td valign="top" align="left">10.47</td>
<td valign="top" align="left">0.91</td>
</tr>
<tr>
<td valign="top" align="left">REVIs</td>
<td valign="top" align="left">NDVI_RE, MSR_RE, CI_RE</td>
<td valign="top" align="left">3</td>
<td valign="top" align="left">7</td>
<td valign="top" align="left">NDVI_RE</td>
<td valign="top" align="left">10.57</td>
<td valign="top" align="left">0.91</td>
</tr>
<tr>
<td valign="top" align="left">CIs</td>
<td valign="top" align="left">NDI, ExG, ExR, ExGR, VARI, GLI, NDYI</td>
<td valign="top" align="left">7</td>
<td valign="top" align="left">127</td>
<td valign="top" align="left">ExG, VARI</td>
<td valign="top" align="left">15.79</td>
<td valign="top" align="left">0.78</td>
</tr>
<tr>
<td valign="top" align="left">AgTP</td>
<td valign="top" align="left">CH, FVC, VOL</td>
<td valign="top" align="left">3</td>
<td valign="top" align="left">7</td>
<td valign="top" align="left">CH, FVC</td>
<td valign="top" align="left">8.93</td>
<td valign="top" align="left">0.94</td>
</tr>
<tr>
<td valign="top" align="left">ComVIs+ REVIs+CIs</td>
<td valign="top" align="left">NDVI, GNDVI, EVI, EVI2, MTVI2, SAVI, NDVI_RE, MSR_RE, CI_RE, NDI, ExG, ExR, ExGR, VARI, GLI, NDYI</td>
<td valign="top" align="left">16</td>
<td valign="top" align="left">65535</td>
<td valign="top" align="left">NDVI_RE, MSR_RE, EVI, SAVI</td>
<td valign="top" align="left">9.88</td>
<td valign="top" align="left">0.92</td>
</tr>
<tr>
<td valign="top" align="left">ComVIs+AgTP</td>
<td valign="top" align="left">NDVI, GNDVI, EVI, EVI2, MTVI2, SAVI, CH, FVC, VOL</td>
<td valign="top" align="left">9</td>
<td valign="top" align="left">511</td>
<td valign="top" align="left">CH, FVC, SAVI, GNDVI</td>
<td valign="top" align="left">8.85</td>
<td valign="top" align="left">0.94</td>
</tr>
<tr>
<td valign="top" align="left">REVIs+AgTP</td>
<td valign="top" align="left">NDVI_RE, MSR_RE, CI_RE, CH, FVC, VOL</td>
<td valign="top" align="left">6</td>
<td valign="top" align="left">63</td>
<td valign="top" align="left">CH, FVC, NDVI_RE</td>
<td valign="top" align="left">8.36</td>
<td valign="top" align="left">0.94</td>
</tr>
<tr>
<td valign="top" align="left">CIs+AgTP</td>
<td valign="top" align="left">NDI, ExG, ExR, ExGR, VARI, GLI, NDYI, CH, FVC, VOL</td>
<td valign="top" align="left">10</td>
<td valign="top" align="left">1023</td>
<td valign="top" align="left">CH, FVC, VARI</td>
<td valign="top" align="left">8.52</td>
<td valign="top" align="left">0.95</td>
</tr>
<tr>
<td valign="top" align="left">ComVIs+REVIs+ CIs+AgTP</td>
<td valign="top" align="left">NDVI, GNDVI, EVI, EVI2, MTVI2, SAVI, NDVI_RE, MSR_RE, CI_RE, NDI, ExG, ExR, ExGR, VARI, GLI, NDYI, CH, FVC, VOL</td>
<td valign="top" align="left">19</td>
<td valign="top" align="left">524287</td>
<td valign="top" align="left">CH, FVC, NDVI_RE, EVI</td>
<td valign="top" align="left">8.34</td>
<td valign="top" align="left">0.95</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<fn>
<p>ComVIs, commonly used vegetation indices with near-infrared and visible light bands; REVIs, red-edge vegetation indices; CIs, color indices; AgTP, agronomic trait parameters.</p>
</fn>
</table-wrap-foot>
</table-wrap>
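As a minimal sketch of the exhaustive feature-combination search summarized above (each set of n candidate features gives 2^n − 1 non-empty subsets, e.g., 63 for the 6 ComVIs), the following Python snippet evaluates every subset with a random forest regressor and reports the subset with the lowest RRMSE. The synthetic data, the hyperparameters, and the rrmse helper are illustrative assumptions, not the authors' implementation.

```python
from itertools import combinations

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for the plot-level feature matrix; the columns play
# the roles of CH, FVC, SAVI, GNDVI, etc. (hypothetical values).
features = ["CH", "FVC", "SAVI", "GNDVI"]
X = rng.random((120, len(features)))
y = 4.0 + X @ np.array([2.0, 1.5, 1.0, 0.5]) + rng.normal(0.0, 0.2, 120)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

def rrmse(y_true, y_pred):
    """Relative RMSE (%): RMSE divided by the mean of the observations."""
    return 100.0 * np.sqrt(np.mean((y_true - y_pred) ** 2)) / np.mean(y_true)

best, n_evaluated = None, 0
# Evaluate every non-empty feature subset: 2**n - 1 combinations in total.
for k in range(1, len(features) + 1):
    for idx in combinations(range(len(features)), k):
        cols = list(idx)
        model = RandomForestRegressor(n_estimators=100, random_state=0)
        model.fit(X_tr[:, cols], y_tr)
        pred = model.predict(X_te[:, cols])
        n_evaluated += 1
        score = (rrmse(y_te, pred), r2_score(y_te, pred))
        if best is None or score[0] < best[0]:
            best = (score[0], score[1], [features[i] for i in cols])

print(f"evaluated {n_evaluated} subsets; best: {best[2]}, "
      f"RRMSE = {best[0]:.2f}%, R2 = {best[1]:.2f}")
```

In the study, the same search would be repeated for each feature category in Table 4, up to the 524,287 subsets of all 19 features.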
<p>The results show that the minimum <italic>RRMSE</italic> of the yield prediction based on the vegetation indices decreased from 11.6% for a single feature (GNDVI) (<xref ref-type="fig" rid="f5">
<bold>Figure&#xa0;5</bold>
</xref>) to 9.88% for multivariate combinations (NDVI_RE, MSR_RE, EVI, and SAVI) (<xref ref-type="table" rid="T4">
<bold>Table&#xa0;4</bold>
</xref>). The yield prediction accuracy also varies among the vegetation index combinations: the ComVIs and REVIs yield slightly better estimates than the CIs. In addition, combining indices constructed on different principles (e.g., a red-edge vegetation index with a visible light vegetation index) can further improve the yield estimation accuracy to some extent.</p>
<p>Among the three agronomic trait parameters, the combination of the CH and FVC produces the best yield prediction (<italic>RRMSE</italic> = 8.93% and <italic>R<sup>2</sup></italic> = 0.94), which is better than the predictions obtained using a single feature and also better than those based on the combinations of vegetation indices. Combining the vegetation indices and agronomic trait parameters further improved the yield prediction accuracy: the <italic>RRMSE</italic> of the optimal combination decreased from 10.47&#x2013;12.65% to 8.34&#x2013;8.85%, and the <italic>R<sup>2</sup>
</italic> increased from 0.88&#x2013;0.91 to 0.94&#x2013;0.95. A scatter plot of the yield prediction versus the measured results is shown in <xref ref-type="fig" rid="f7">
<bold>Figure&#xa0;7</bold>
</xref>. Therefore, adding the agronomic trait parameters to the vegetation indices as feature parameters considerably enhances the yield prediction accuracy.</p>
<fig id="f7" position="float">
<label>Figure&#xa0;7</label>
<caption>
<p>Yield prediction results of the model using the feature combination of the canopy height (CH), fractional vegetation cover (FVC), normalized difference red-edge index (NDVI_RE), and enhanced vegetation index (EVI): <bold>(A)</bold> training set and <bold>(B)</bold> testing set.</p>
</caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fpls-14-1217448-g007.tif"/>
</fig>
</sec>
<sec id="s3_4">
<label>3.4</label>
<title>Yield prediction across different growth stages</title>
<p>The crop growth process includes multiple growth stages, and it is important to determine how the features from each stage affect the yield prediction. This section presents the yield prediction performance in the different growth stages using a single feature and using combinations of multiple features. Based on the single-feature yield prediction results presented in Section 3.2, <xref ref-type="fig" rid="f8">
<bold>Figure&#xa0;8</bold>
</xref> shows the average errors in the crop yield predicted using a single feature in the different growth stages. As can be seen from <xref ref-type="fig" rid="f8">
<bold>Figure&#xa0;8</bold>
</xref>, the growth stage from which a feature is derived has a large effect on the yield prediction results. The <italic>RRMSEs</italic> based on a single feature range from 14.6% to 37.7% across the growth stages. Among the feature categories, the yields predicted using the vegetation indices and agronomic trait parameters have relatively small errors, whereas the other feature categories have relatively large errors. <xref ref-type="fig" rid="f9">
<bold>Figure&#xa0;9</bold>
</xref> displays the yield prediction results for the different growth stages using combinations of multiple features (vegetation/color indices and agronomic trait parameters, a total of 19 features). The <italic>RRMSEs</italic> based on combinations of multiple features range from 8.5% to 44.6% across different growth stages. The results also indicate that there are still considerable variations in yield prediction at different growth stages. In general, the prediction accuracies were notably greater in the stages of jointing, booting, heading, and early grain-filling compared to later stages of growth, with the heading stage displaying the highest accuracy in yield prediction (<xref ref-type="fig" rid="f9">
<bold>Figure&#xa0;9</bold>
</xref>).</p>
<fig id="f8" position="float">
<label>Figure&#xa0;8</label>
<caption>
<p>The average <italic>RRMSEs</italic> of the crop yields predicted using a single feature in the different growth stages. Refls, Reflectance; VIs, vegetation indices; CIs, color indices; Tex, texture; AgTP, agronomic trait parameters.</p>
</caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fpls-14-1217448-g008.tif"/>
</fig>
<fig id="f9" position="float">
<label>Figure&#xa0;9</label>
<caption>
<p>The <italic>RRMSEs</italic> (%) of the crop yields predicted using multiple features in the different growth stages. Left: The colors indicate the <italic>RRMSE</italic> values. The horizontal axis indicates the different growth stages. The vertical axis indicates the different feature combinations of multiple features, and the number of features increases gradually from top to bottom, with a total of 524,287 feature combinations. Upper right: Histogram of the <italic>RRMSE</italic> values; lower right: box charts of the <italic>RRMSE</italic> values for the different growth stages.</p>
</caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fpls-14-1217448-g009.tif"/>
</fig>
</sec>
</sec>
<sec id="s4" sec-type="discussion">
<label>4</label>
<title>Discussion</title>
<sec id="s4_1">
<label>4.1</label>
<title>Impact of crop growth stage on yield prediction</title>
<p>In Section 3.4, the study showcased yield predictions across the different growth stages, revealing substantial variations in prediction accuracy. Notably, the accuracy of the yield predictions was superior during the mid-growth phase compared to the late-growth phase, with the highest accuracy obtained during the heading stage. These findings align with the outcomes of prior studies conducted on wheat (<xref ref-type="bibr" rid="B52">Tanabe et&#xa0;al., 2023</xref>) and rice (<xref ref-type="bibr" rid="B55">Wan et&#xa0;al., 2020</xref>; <xref ref-type="bibr" rid="B56">Wang et&#xa0;al., 2021</xref>). In the later stages of crop growth, the mean and variance of the yield prediction errors are large, and different feature combinations lead to significantly different yield predictions. During the mid-growth stage, the Leaf Area Index (LAI) typically reaches its maximum value, and leaf reflectance in the near-infrared spectrum is at its strongest (<xref ref-type="bibr" rid="B30">Li et&#xa0;al., 2020</xref>). Vegetation indices are primarily constructed from near-infrared reflectance, and in this stage they exhibit a strong correlation with biomass and yield. Nonetheless, as leaf senescence begins, the capacity of the leaves to reflect near-infrared radiation gradually wanes, reducing the ability of the vegetation indices to explain the LAI or biomass. Consequently, this progression adversely impacts the accuracy of the yield predictions, which is lowest during the maturity stage (<xref ref-type="bibr" rid="B68">Zhou et&#xa0;al., 2017</xref>; <xref ref-type="bibr" rid="B52">Tanabe et&#xa0;al., 2023</xref>). Similarly, <xref ref-type="bibr" rid="B36">Maimaitijiang et&#xa0;al. 
(2019)</xref> argued that, unlike airborne light detection and ranging (LiDAR), photogrammetric point clouds have insufficient penetration ability when the canopy closure is high, which may decrease the yield prediction accuracy in the later growth stages. Therefore, the features from the jointing, booting, heading, and early grain-filling stages should be preferentially selected for yield prediction.</p>
</sec>
<sec id="s4_2">
<label>4.2</label>
<title>Impact of cultivar on yield prediction accuracy</title>
<p>The robustness of the yield prediction models across different cultivars is critical for assessing their application potential (<xref ref-type="bibr" rid="B35">Maimaitijiang et&#xa0;al., 2020</xref>; <xref ref-type="bibr" rid="B8">Duan et&#xa0;al., 2021</xref>). To evaluate the transferability of the yield prediction models between cultivars, the data for one cultivar were used for training, the data for the other cultivar were used for testing, and the mean error of the two scenarios was calculated. Because the previous analysis showed that the yield prediction models with multi-feature fusion are more accurate than those with a single feature, we used multi-feature combinations to analyze the robustness of the models across cultivars. Comparing <xref ref-type="table" rid="T4">
<bold>Tables&#xa0;4</bold>
</xref> and <xref ref-type="table" rid="T5">
<bold>5</bold>
</xref>, it was found that the error of the model, which employed the data for one cultivar to predict the yield of another cultivar, was greater than that of the model trained using the data for both cultivars. The <italic>RRMSE</italic> of the optimal combination of various features increased from 8.34&#x2013;15.79% to 13.90&#x2013;19.23%, and the <italic>R<sup>2</sup>
</italic> decreased from 0.88&#x2013;0.95 to 0.81&#x2013;0.86. Different cultivars differ in characteristics such as phenology, plant height, leaf type, and pigment content, which lowers the accuracy of yield prediction models transferred across cultivars. Several recent studies have also reported a decrease in the quality of prediction models across cultivars (e.g., <xref ref-type="bibr" rid="B44">Rischbeck et&#xa0;al., 2016</xref>; <xref ref-type="bibr" rid="B8">Duan et&#xa0;al., 2021</xref>). <xref ref-type="bibr" rid="B44">Rischbeck et&#xa0;al. (2016)</xref> concluded that models trained using diverse cultivars can significantly improve the yield prediction performance compared to models trained using a single cultivar, and our results support this view.</p>
<table-wrap id="T5" position="float">
<label>Table&#xa0;5</label>
<caption>
<p>Yield prediction results based on various feature combinations and considering cultivar differences.</p>
</caption>
<table frame="hsides">
<thead>
<tr>
<th valign="top" align="left">Types</th>
<th valign="top" align="left">Feature variables</th>
<th valign="top" align="left">Number of variables</th>
<th valign="top" align="left">Number of combinations</th>
<th valign="top" align="left">Best combination</th>
<th valign="top" align="left">
<italic>RRMSE</italic> (%)</th>
<th valign="top" align="left">
<italic>R<sup>2</sup>
</italic>
</th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">ComVIs</td>
<td valign="top" align="left">NDVI, GNDVI, EVI, EVI2, MTVI2, SAVI</td>
<td valign="top" align="left">6</td>
<td valign="top" align="left">63</td>
<td valign="top" align="left">EVI</td>
<td valign="top" align="left">15.16</td>
<td valign="top" align="left">0.82</td>
</tr>
<tr>
<td valign="top" align="left">REVIs</td>
<td valign="top" align="left">NDVI_RE, MSR_RE, CI_RE</td>
<td valign="top" align="left">3</td>
<td valign="top" align="left">7</td>
<td valign="top" align="left">NDVI_RE, CI_RE</td>
<td valign="top" align="left">19.23</td>
<td valign="top" align="left">0.81</td>
</tr>
<tr>
<td valign="top" align="left">CIs</td>
<td valign="top" align="left">NDI, ExG, ExR, ExGR, VARI, GLI, NDYI</td>
<td valign="top" align="left">7</td>
<td valign="top" align="left">127</td>
<td valign="top" align="left">ExG, ExR, NDI, VARI</td>
<td valign="top" align="left">15.28</td>
<td valign="top" align="left">0.83</td>
</tr>
<tr>
<td valign="top" align="left">AgTP</td>
<td valign="top" align="left">CH, FVC, VOL</td>
<td valign="top" align="left">3</td>
<td valign="top" align="left">7</td>
<td valign="top" align="left">CH, FVC, VOL</td>
<td valign="top" align="left">15.50</td>
<td valign="top" align="left">0.84</td>
</tr>
<tr>
<td valign="top" align="left">ComVIs+ REVIs+CIs</td>
<td valign="top" align="left">NDVI, GNDVI, EVI, EVI2, MTVI2, SAVI, NDVI_RE, MSR_RE, CI_RE, NDI, ExG, ExR, ExGR, VARI, GLI, NDYI</td>
<td valign="top" align="left">16</td>
<td valign="top" align="left">65535</td>
<td valign="top" align="left">EVI, NDI</td>
<td valign="top" align="left">14.32</td>
<td valign="top" align="left">0.85</td>
</tr>
<tr>
<td valign="top" align="left">ComVIs+AgTP</td>
<td valign="top" align="left">NDVI, GNDVI, EVI, EVI2, MTVI2, SAVI, CH, FVC, VOL</td>
<td valign="top" align="left">9</td>
<td valign="top" align="left">511</td>
<td valign="top" align="left">CH, EVI, MTVI2</td>
<td valign="top" align="left">14.51</td>
<td valign="top" align="left">0.85</td>
</tr>
<tr>
<td valign="top" align="left">REVIs+AgTP</td>
<td valign="top" align="left">NDVI_RE, MSR_RE, CI_RE, CH, FVC, VOL</td>
<td valign="top" align="left">6</td>
<td valign="top" align="left">63</td>
<td valign="top" align="left">CH, FVC, VOL</td>
<td valign="top" align="left">15.50</td>
<td valign="top" align="left">0.84</td>
</tr>
<tr>
<td valign="top" align="left">CIs+AgTP</td>
<td valign="top" align="left">NDI, ExG, ExR, ExGR, VARI, GLI, NDYI, CH, FVC, VOL</td>
<td valign="top" align="left">10</td>
<td valign="top" align="left">1023</td>
<td valign="top" align="left">CH, ExR, NDI</td>
<td valign="top" align="left">14.28</td>
<td valign="top" align="left">0.85</td>
</tr>
<tr>
<td valign="top" align="left">ComVIs+REVIs+ CIs+AgTP</td>
<td valign="top" align="left">NDVI, GNDVI, EVI, EVI2, MTVI2, SAVI, NDVI_RE, MSR_RE, CI_RE, NDI, ExG, ExR, ExGR, VARI, GLI, NDYI, CH, FVC, VOL</td>
<td valign="top" align="left">19</td>
<td valign="top" align="left">524287</td>
<td valign="top" align="left">CH, EVI, NDI</td>
<td valign="top" align="left">13.90</td>
<td valign="top" align="left">0.86</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<fn>
<p>The features of one cultivar were used for training, and the data for another cultivar were used for testing. The values of the error statistics are the average of the two scenarios.</p>
</fn>
</table-wrap-foot>
</table-wrap>
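The cross-cultivar evaluation protocol above (train on one cultivar, test on the other, and average the two scenarios, as stated in the footnote of Table 5) can be sketched roughly as follows. The make_cultivar helper, the feature dimensions, and all numeric values are hypothetical assumptions standing in for the study's plot-level data.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)

def rrmse(y_true, y_pred):
    """Relative RMSE (%)."""
    return 100.0 * np.sqrt(np.mean((y_true - y_pred) ** 2)) / np.mean(y_true)

# Synthetic plot-level data for two cultivars; in the study these would be
# the selected features (e.g., CH, EVI, NDI) and the measured grain yields.
def make_cultivar(offset, n=60):
    X = rng.random((n, 3))
    y = offset + X @ np.array([2.0, 1.0, 0.8]) + rng.normal(0.0, 0.15, n)
    return X, y

cultivars = {"A": make_cultivar(4.0), "B": make_cultivar(4.5)}

scores = []
# Two scenarios: train on A / test on B, then train on B / test on A.
for train_cv, test_cv in [("A", "B"), ("B", "A")]:
    X_tr, y_tr = cultivars[train_cv]
    X_te, y_te = cultivars[test_cv]
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    scores.append((rrmse(y_te, pred), r2_score(y_te, pred)))

# Report the average of the two directions, as in Table 5.
mean_rrmse = np.mean([s[0] for s in scores])
mean_r2 = np.mean([s[1] for s in scores])
print(f"cross-cultivar mean RRMSE = {mean_rrmse:.2f}%, mean R2 = {mean_r2:.2f}")
```

Because the test cultivar's yield distribution differs from the training cultivar's, the transferred model typically scores worse than a model trained on pooled data, which is the pattern seen when comparing Tables 4 and 5.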
<p>The results of our study indicate that combining multi-temporal data and multiple features can enhance the yield prediction performance. It is therefore essential to identify better feature combinations to improve the robustness of the yield prediction models across cultivars. <xref ref-type="table" rid="T5">
<bold>Table&#xa0;5</bold>
</xref> presents the yield prediction error metrics for various feature combinations across the different categories. The results illustrate that the prediction abilities of the feature combinations differ among cultivars, and the yield prediction accuracy improves when the agronomic trait parameters are incorporated alongside the vegetation and color indices. This indicates that the CH, an important agronomic trait parameter that reflects the vertical growth characteristics of the crop, characterizes the crop structure well and strengthens the yield prediction model across cultivars. The combination of the CH, EVI, and NDI produced the highest prediction accuracy, with an <italic>RRMSE</italic> of 13.9% and an <italic>R<sup>2</sup>
</italic> of 0.86. In contrast, the REVIs, which performed well when cultivar differences were not considered, produced the largest prediction errors across cultivars.</p>
</sec>
<sec id="s4_3">
<label>4.3</label>
<title>Importance of using agronomic trait parameters in yield prediction</title>
<p>Through analysis of the previously presented results, we found that when using a single feature for yield prediction, the agronomic trait parameters performed the best overall. Three agronomic trait parameters were used in this study: the CH, FVC, and VOL. Among them, the CH performed best in the yield prediction, followed by the FVC, with the VOL performing weakest. The CH captures the vertical growth characteristics of the crop and thus provides structural information that helps improve the yield prediction. Because the canopy volume was calculated from the CH and vegetation coverage, it was autocorrelated with these parameters, so its performance was not as good as expected.</p>
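The autocorrelation noted above can be illustrated numerically. The snippet below uses a hypothetical volume-like product of CH and FVC (the study's actual VOL formula is given in its Methods and is not reproduced here); because such a quantity is a deterministic function of its inputs, it is necessarily correlated with them and adds little independent information.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical plot-level values; the study derives CH from RGB point clouds
# and FVC from MS imagery.
ch = rng.uniform(0.4, 1.1, 200)    # canopy height (m), illustrative range
fvc = rng.uniform(0.3, 0.95, 200)  # fractional vegetation cover

# A volume-like product of the two inputs (illustrative formula only,
# not the paper's VOL definition).
vol = ch * fvc

# VOL is a deterministic function of CH and FVC, so it is strongly
# correlated with both, even when CH and FVC are independent.
print("corr(VOL, CH)  =", round(np.corrcoef(vol, ch)[0, 1], 2))
print("corr(VOL, FVC) =", round(np.corrcoef(vol, fvc)[0, 1], 2))
```

This built-in correlation is one plausible reason the VOL contributed less to the models than the CH or FVC alone.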
<p>Furthermore, the yield prediction models that incorporated agronomic trait parameters along with spectral features also demonstrated enhanced accuracy. Existing studies on biomass and yield prediction for other crops (barley, soybean, and corn) have also found that the fusion of spectral and agronomic features can improve the performance (<xref ref-type="bibr" rid="B12">Geipel et&#xa0;al., 2014</xref>; <xref ref-type="bibr" rid="B2">Bendig et&#xa0;al., 2015</xref>; <xref ref-type="bibr" rid="B36">Maimaitijiang et&#xa0;al., 2019</xref>), and this study further supports those conclusions. The improvement in yield prediction accuracy achieved by fusing spectral features and agronomic trait parameters can be explained from several perspectives. Firstly, spectral features effectively capture the crop growth status, while multi-temporal spectral features can reflect the entire crop growth and development process (<xref ref-type="bibr" rid="B36">Maimaitijiang et&#xa0;al., 2019</xref>; <xref ref-type="bibr" rid="B35">Maimaitijiang et&#xa0;al., 2020</xref>; <xref ref-type="bibr" rid="B55">Wan et&#xa0;al., 2020</xref>; <xref ref-type="bibr" rid="B52">Tanabe et&#xa0;al., 2023</xref>). Secondly, as mentioned earlier, agronomic trait parameters provide valuable insights into crop structural information, particularly the vertical growth characteristics that are not easily obtained from spectral features alone. Thirdly, the three agronomic parameters were obtained using UAV-based RGB and MS sensors as independent data sources and were not calculated from spectral indices; having no autocorrelation with the spectral indices, they overcame the inherent asymptotic saturation problem of the spectral features to a certain extent (<xref ref-type="bibr" rid="B34">Maimaitijiang et&#xa0;al., 2017</xref>; <xref ref-type="bibr" rid="B35">Maimaitijiang et&#xa0;al., 2020</xref>). 
Therefore, considering the easy availability and cost-effectiveness of obtaining UAV-based agronomic trait parameters, the fusion of spectral indices and agronomic trait parameters has great potential for improving crop yield predictions.</p>
</sec>
<sec id="s4_4">
<label>4.4</label>
<title>Comparison of yield predictions using RGB and MS images</title>
<p>The features used in this study were all calculated from images acquired by the RGB and MS sensors. The VIs and FVC were derived from the MS data, the CIs and CH were derived from the RGB data, and the VOL was calculated from the CH and FVC, i.e., from a combination of the RGB and MS images. Our results confirm that multi-sensor data fusion improves the accuracy of the yield prediction models. While researchers hope to enhance the yield prediction capability, they also expect to achieve this goal at a lower cost (e.g., economic, time, and computational costs). That is, within an acceptable accuracy range, fewer data and lower costs are more feasible for large-scale applications. Therefore, in this section, we compare the performances of the RGB and MS images in yield prediction.</p>
<p>
<xref ref-type="table" rid="T6">
<bold>Table&#xa0;6</bold>
</xref> shows the yield prediction results obtained using the various features derived from the RGB and MS images. The best yield prediction was obtained using a combination of the VIs and FVC from the MS sensor, with <italic>RRMSE</italic> = 8.94% and <italic>R<sup>2</sup></italic> = 0.94, while the best prediction using the CIs and CH from the RGB sensor had <italic>RRMSE</italic> = 10.29% and <italic>R<sup>2</sup></italic> = 0.91. Thus, the MS-based VIs and FVC predicted the yield more accurately than the RGB-based features. Among the RGB-based features, the CH still outperformed the CIs in terms of yield prediction, while among the MS-based features, the combinations involving red-edge indices performed better. Red-edge light penetrates the canopy better than the visible light bands, does not saturate easily when the canopy density is high, and is more sensitive to chlorophyll (<xref ref-type="bibr" rid="B6">Dong et&#xa0;al., 2019</xref>; <xref ref-type="bibr" rid="B46">Sagan et&#xa0;al., 2021</xref>; <xref ref-type="bibr" rid="B66">Zeng et&#xa0;al., 2022</xref>).</p>
<table-wrap id="T6" position="float">
<label>Table&#xa0;6</label>
<caption>
<p>Comparison of yield prediction using the RGB and MS images.</p>
</caption>
<table frame="hsides">
<thead>
<tr>
<th valign="top" align="left">Sensors</th>
<th valign="top" align="left">Types</th>
<th valign="top" align="left">Feature variables</th>
<th valign="top" align="left">Number of variables</th>
<th valign="top" align="left">Number of combinations</th>
<th valign="top" align="left">Best combination</th>
<th valign="top" align="left">
<italic>RRMSE</italic> (%)</th>
<th valign="top" align="left">
<italic>R<sup>2</sup>
</italic>
</th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">MS</td>
<td valign="top" align="left">VIs</td>
<td valign="top" align="left">NDVI, GNDVI, EVI, EVI2, MTVI2, SAVI, NDVI_RE, MSR_RE, CI_RE</td>
<td valign="top" align="left">9</td>
<td valign="top" align="left">511</td>
<td valign="top" align="left">NDVI_RE, MSR_RE, SAVI</td>
<td valign="top" align="left">9.72</td>
<td valign="top" align="left">0.93</td>
</tr>
<tr>
<td valign="top" align="left"/>
<td valign="top" align="left">VIs+FVC</td>
<td valign="top" align="left">NDVI, GNDVI, EVI, EVI2, MTVI2, SAVI, NDVI_RE, MSR_RE, CI_RE, FVC</td>
<td valign="top" align="left">10</td>
<td valign="top" align="left">1023</td>
<td valign="top" align="left">FVC, CI_RE, SAVI</td>
<td valign="top" align="left">8.94</td>
<td valign="top" align="left">0.94</td>
</tr>
<tr>
<td valign="top" align="left">RGB</td>
<td valign="top" align="left">CIs</td>
<td valign="top" align="left">NDI, ExG, ExR, ExGR, VARI, GLI, NDYI</td>
<td valign="top" align="left">7</td>
<td valign="top" align="left">127</td>
<td valign="top" align="left">ExG, VARI</td>
<td valign="top" align="left">15.79</td>
<td valign="top" align="left">0.78</td>
</tr>
<tr>
<td valign="top" align="left"/>
<td valign="top" align="left">CIs+CH</td>
<td valign="top" align="left">NDI, ExG, ExR, ExGR, VARI, GLI, NDYI, CH</td>
<td valign="top" align="left">8</td>
<td valign="top" align="left">255</td>
<td valign="top" align="left">CH</td>
<td valign="top" align="left">10.29</td>
<td valign="top" align="left">0.91</td>
</tr>
<tr>
<td valign="top" align="left">MS+RGB</td>
<td valign="top" align="left">CIs+VIs+CH+FVC+VOL</td>
<td valign="top" align="left">NDVI, GNDVI, EVI, EVI2, MTVI2, SAVI, NDVI_RE, MSR_RE, CI_RE, NDI, ExG, ExR, ExGR, VARI, GLI, NDYI, CH, FVC, VOL</td>
<td valign="top" align="left">19</td>
<td valign="top" align="left">524287</td>
<td valign="top" align="left">CH, FVC, NDVI_RE, EVI</td>
<td valign="top" align="left">8.34</td>
<td valign="top" align="left">0.95</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>These research results demonstrate that the features fusing the MS and RGB image data had the best yield prediction performance, followed by the MS-based features, with the RGB-based features performing weakest. A UAV equipped with an RGB camera is the most common configuration for agricultural RS applications, and this configuration has the advantages of simplicity, convenience, and low cost. Our results show that if the purpose is to understand the crop yield status and trend from a macroscopic perspective, an RGB-based yield prediction model can fully meet the requirements within an acceptable accuracy range. If the goal is to determine the crop yield more accurately, the use of features obtained from multi-sensor fusion is recommended.</p>
</sec>
<sec id="s4_5">
<label>4.5</label>
<title>Strengths and limitations of this study and future work</title>
<p>The timeliness and operability of UAVs overcome the spatiotemporal resolution limitations of satellite RS data in precision agriculture applications, and UAV-based crop yield prediction has long been an active topic in precision agricultural RS. In this study, RGB and MS images were acquired using a UAV, and crop yield prediction models were constructed based on the RF algorithm and a combination of spectral features and agronomic trait parameters. The results revealed that the model integrating agronomic trait parameters and spectral features enhances the accuracy of the crop yield prediction (<xref ref-type="table" rid="T4">
<bold>Table&#xa0;4</bold>
</xref>; <xref ref-type="fig" rid="f7">
<bold>Figure&#xa0;7</bold>
</xref>), and the addition of agronomic trait parameters addressed the issue of reduced prediction capacity across different cultivars to some extent (<xref ref-type="table" rid="T5">
<bold>Table&#xa0;5</bold>
</xref>). In addition, these agronomic trait parameters are easy to obtain at a low cost, so they represent a great potential solution for crop yield prediction at medium and small scales.</p>
<p>This study still had some limitations. The experiment was limited to a single year, and the sample size was relatively small; multi-year experiments and larger sample sizes would enable more comprehensive and systematic testing of the crop yield prediction models and feature parameters. Much work remains to be done on UAV-based crop yield prediction. First, experiments need to be conducted in different climatic regions to verify the robustness of the yield prediction models across regions, and experiments involving different crops and different cultivars of the same crop are needed to examine the reliability and suitability of the models across crops and cultivars. Second, our results confirm that multi-data fusion can effectively improve the performance of the yield prediction model. This study fused the structural and spectral parameters of the crops; exploring the fusion of additional data, such as thermal infrared, LiDAR, or environmental data, remains a future research focus (<xref ref-type="bibr" rid="B35">Maimaitijiang et&#xa0;al., 2020</xref>; <xref ref-type="bibr" rid="B28">Li et&#xa0;al., 2022</xref>; <xref ref-type="bibr" rid="B43">Qader et&#xa0;al., 2023</xref>). In addition, previous studies have used deep learning algorithms for yield prediction and have achieved good results (<xref ref-type="bibr" rid="B25">Khaki and Wang, 2019</xref>; <xref ref-type="bibr" rid="B26">Khaki et&#xa0;al., 2020</xref>; <xref ref-type="bibr" rid="B46">Sagan et&#xa0;al., 2021</xref>; <xref ref-type="bibr" rid="B23">Jeong et&#xa0;al., 2022</xref>); we also plan to explore the performance of deep learning algorithms in UAV-based yield prediction models in the future.</p>
</sec>
</sec>
<sec id="s5" sec-type="conclusions">
<label>5</label>
<title>Conclusions</title>
<p>Agronomic trait parameters are closely related to crop growth, development, and yield formation. In this study, crop canopy spectral parameters (VIs) and agronomic trait parameters (plant height and coverage) obtained using low-cost UAVs were combined to predict the crop yield. The potential of agronomic trait parameters was also investigated. The main conclusions of this study are as follows:</p>
<list list-type="simple">
<list-item>
<p>(1) The agronomic trait parameters and spectral features had strong relationships with the crop yield, while the texture features had relatively weak relationships with the crop yield. Compared with the yield prediction using spectral features, the addition of agronomic trait parameters effectively improved the yield prediction accuracy.</p>
</list-item>
<list-item>
<p>(2) The yield prediction results based on the features in the different growth stages were quite different. In general, the prediction accuracies were noticeably greater in the jointing, booting, heading, and early grain-filling stages than in the later growth stages, with the heading stage providing the most accurate predictions. Multiple growth stages provided a better yield prediction performance than a single stage.</p>
</list-item>
<list-item>
<p>(3) Yield prediction across different cultivars was weaker than that within the same cultivar. However, combining crop trait parameters with spectral indices improved cross-cultivar yield prediction to some extent.</p>
</list-item>
<list-item>
<p>(4) Features based on the fusion of MS and RGB imagery performed best for yield prediction, followed by the MS-based features, with the RGB-based features performing weakest. Notably, the accuracy of the RGB-based yield prediction models still fell within the acceptable range, and these models therefore meet the requirements for assessing crop yield status and trends from a macroscopic perspective.</p>
</list-item>
</list>
</sec>
<sec id="s6" sec-type="data-availability">
<title>Data availability statement</title>
<p>The original contributions presented in the study are included in the article/supplementary material. Further inquiries can be directed to the corresponding authors.</p>
</sec>
<sec id="s7" sec-type="author-contributions">
<title>Author contributions</title>
<p>HZ, LS, and HH contributed to the conception and design of the study. HZ performed the statistical analysis and wrote the first draft of the manuscript. JY developed the software and performed the programming. WL and DL collected and organized the data. All authors contributed to the article and approved the submitted version.</p>
</sec>
</body>
<back>
<sec id="s8" sec-type="funding-information">
<title>Funding</title>
<p>This work was funded by the Natural Science Foundation of Zhejiang Province (Grant No. LY21D010004), the National Natural Science Foundation of China (Grant Nos. 41907394 and 41901360), and the Innovation Platform of Smart Breeding in Modern Biology.</p>
</sec>
<sec id="s9" sec-type="COI-statement">
<title>Conflict of interest</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
<sec id="s10" sec-type="disclaimer">
<title>Publisher&#x2019;s note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p>
</sec>
<ref-list>
<title>References</title>
<ref id="B1">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Asseng</surname> <given-names>S.</given-names>
</name>
<name>
<surname>Ewert</surname> <given-names>F.</given-names>
</name>
<name>
<surname>Rosenzweig</surname> <given-names>C.</given-names>
</name>
<name>
<surname>Jones</surname> <given-names>J. W.</given-names>
</name>
<name>
<surname>Hatfield</surname> <given-names>J. L.</given-names>
</name>
<name>
<surname>Ruane</surname> <given-names>A. C.</given-names>
</name>
<etal/>
</person-group>. (<year>2013</year>). <article-title>Uncertainty in simulating wheat yields under climate change</article-title>. <source>Nat. Clim. Chang.</source> <volume>3</volume> (<issue>9</issue>), <fpage>827</fpage>&#x2013;<lpage>832</lpage>. doi: <pub-id pub-id-type="doi">10.1038/nclimate1916</pub-id>
</citation>
</ref>
<ref id="B2">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bendig</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Yu</surname> <given-names>K.</given-names>
</name>
<name>
<surname>Aasen</surname> <given-names>H.</given-names>
</name>
<name>
<surname>Bolten</surname> <given-names>A.</given-names>
</name>
<name>
<surname>Bennertz</surname> <given-names>S.</given-names>
</name>
<name>
<surname>Broscheit</surname> <given-names>J.</given-names>
</name>
<etal/>
</person-group>. (<year>2015</year>). <article-title>Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley</article-title>. <source>Int. J. Appl. Earth Obs. Geoinf.</source> <volume>39</volume>, <fpage>79</fpage>&#x2013;<lpage>87</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.jag.2015.02.012</pub-id>
</citation>
</ref>
<ref id="B3">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Breiman</surname> <given-names>L.</given-names>
</name>
</person-group> (<year>2001</year>). <article-title>Random forests</article-title>. <source>Mach. Learn.</source> <volume>45</volume>, <fpage>5</fpage>&#x2013;<lpage>32</lpage>. doi: <pub-id pub-id-type="doi">10.1023/A:1010933404324</pub-id>
</citation>
</ref>
<ref id="B4">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cao</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Zhang</surname> <given-names>Z.</given-names>
</name>
<name>
<surname>Luo</surname> <given-names>Y.</given-names>
</name>
<name>
<surname>Zhang</surname> <given-names>L.</given-names>
</name>
<name>
<surname>Zhang</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Li</surname> <given-names>Z.</given-names>
</name>
<etal/>
</person-group>. (<year>2021</year>). <article-title>Wheat yield predictions at a county and field scale with deep learning, machine learning, and google earth engine</article-title>. <source>Eur. J. Agron.</source> <volume>123</volume>, <elocation-id>126204</elocation-id>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.eja.2020.126204</pub-id>
</citation>
</ref>
<ref id="B5">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Colomina</surname> <given-names>I.</given-names>
</name>
<name>
<surname>Molina</surname> <given-names>P.</given-names>
</name>
</person-group> (<year>2014</year>). <article-title>Unmanned aerial systems for photogrammetry and remote sensing: a review</article-title>. <source>ISPRS-J. Photogramm. Remote Sens.</source> <volume>92</volume>, <fpage>79</fpage>&#x2013;<lpage>97</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.isprsjprs.2014.02.013</pub-id>
</citation>
</ref>
<ref id="B6">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Dong</surname> <given-names>T.</given-names>
</name>
<name>
<surname>Liu</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Shang</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Qian</surname> <given-names>B.</given-names>
</name>
<name>
<surname>Ma</surname> <given-names>B.</given-names>
</name>
<name>
<surname>Kovacs</surname> <given-names>J. M.</given-names>
</name>
<etal/>
</person-group>. (<year>2019</year>). <article-title>Assessment of red-edge vegetation indices for crop leaf area index estimation</article-title>. <source>Remote Sens. Environ.</source> <volume>222</volume>, <fpage>133</fpage>&#x2013;<lpage>143</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.rse.2018.12.032</pub-id>
</citation>
</ref>
<ref id="B7">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Dong</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Lu</surname> <given-names>H.</given-names>
</name>
<name>
<surname>Wang</surname> <given-names>Y.</given-names>
</name>
<name>
<surname>Ye</surname> <given-names>T.</given-names>
</name>
<name>
<surname>Yuan</surname> <given-names>W.</given-names>
</name>
</person-group> (<year>2020</year>). <article-title>Estimating winter wheat yield based on a light use efficiency model and wheat variety data</article-title>. <source>ISPRS-J. Photogramm. Remote Sens.</source> <volume>160</volume>, <fpage>18</fpage>&#x2013;<lpage>32</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.isprsjprs.2019.12.005</pub-id>
</citation>
</ref>
<ref id="B8">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Duan</surname> <given-names>B.</given-names>
</name>
<name>
<surname>Fang</surname> <given-names>S.</given-names>
</name>
<name>
<surname>Gong</surname> <given-names>Y.</given-names>
</name>
<name>
<surname>Peng</surname> <given-names>Y.</given-names>
</name>
<name>
<surname>Wu</surname> <given-names>X.</given-names>
</name>
<name>
<surname>Zhu</surname> <given-names>R.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>Remote estimation of grain yield based on UAV data in different rice cultivars under contrasting climatic zone</article-title>. <source>Field Crop Res.</source> <volume>267</volume>, <elocation-id>108148</elocation-id>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.fcr.2021.108148</pub-id>
</citation>
</ref>
<ref id="B9">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Fang</surname> <given-names>H.</given-names>
</name>
<name>
<surname>Liang</surname> <given-names>S.</given-names>
</name>
<name>
<surname>Hoogenboom</surname> <given-names>G.</given-names>
</name>
</person-group> (<year>2011</year>). <article-title>Integration of MODIS LAI and vegetation index products with the CSM&#x2013;CERES&#x2013;Maize model for corn yield estimation</article-title>. <source>Int. J. Remote Sens.</source> <volume>32</volume> (<issue>4</issue>), <fpage>1039</fpage>&#x2013;<lpage>1065</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1080/01431160903505310</pub-id>
</citation>
</ref>
<ref id="B10">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Filippi</surname> <given-names>P.</given-names>
</name>
<name>
<surname>Jones</surname> <given-names>E. J.</given-names>
</name>
<name>
<surname>Wimalathunge</surname> <given-names>N. S.</given-names>
</name>
<name>
<surname>Somarathna</surname> <given-names>P. D. S. N.</given-names>
</name>
<name>
<surname>Pozza</surname> <given-names>L. E.</given-names>
</name>
<name>
<surname>Ugbaje</surname> <given-names>S. U.</given-names>
</name>
<etal/>
</person-group>. (<year>2019</year>). <article-title>An approach to forecast grain crop yield using multi-layered, multi-farm data sets and machine learning</article-title>. <source>Precis. Agric.</source> <volume>20</volume> (<issue>5</issue>), <fpage>1015</fpage>&#x2013;<lpage>1029</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1007/s11119-018-09628-4</pub-id>
</citation>
</ref>
<ref id="B11">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gao</surname> <given-names>L.</given-names>
</name>
<name>
<surname>Wang</surname> <given-names>X.</given-names>
</name>
<name>
<surname>Johnson</surname> <given-names>B. A.</given-names>
</name>
<name>
<surname>Tian</surname> <given-names>Q.</given-names>
</name>
<name>
<surname>Wang</surname> <given-names>Y.</given-names>
</name>
<name>
<surname>Verrelst</surname> <given-names>J.</given-names>
</name>
<etal/>
</person-group>. (<year>2020</year>). <article-title>Remote sensing algorithms for estimation of fractional vegetation cover using pure vegetation index values: a review</article-title>. <source>ISPRS-J. Photogramm. Remote Sens.</source> <volume>159</volume>, <fpage>364</fpage>&#x2013;<lpage>377</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.isprsjprs.2019.11.018</pub-id>
</citation>
</ref>
<ref id="B12">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Geipel</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Link</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Claupein</surname> <given-names>W.</given-names>
</name>
</person-group> (<year>2014</year>). <article-title>Combined spectral and spatial modeling of corn yield based on aerial images and crop surface models acquired with an unmanned aircraft system</article-title>. <source>Remote Sens.</source> <volume>6</volume> (<issue>11</issue>), <fpage>10335</fpage>&#x2013;<lpage>10355</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.3390/rs61110335</pub-id>
</citation>
</ref>
<ref id="B13">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gitelson</surname> <given-names>A. A.</given-names>
</name>
<name>
<surname>Gritz</surname> <given-names>Y.</given-names>
</name>
<name>
<surname>Merzlyak</surname> <given-names>M. N.</given-names>
</name>
</person-group> (<year>2003</year>). <article-title>Relationships between leaf chlorophyll content and spectral reflectance and algorithms for non-destructive chlorophyll assessment in higher plant leaves</article-title>. <source>J. Plant Physiol.</source> <volume>160</volume> (<issue>3</issue>), <fpage>271</fpage>&#x2013;<lpage>282</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1078/0176-1617-00887</pub-id>
</citation>
</ref>
<ref id="B14">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gitelson</surname> <given-names>A. A.</given-names>
</name>
<name>
<surname>Kaufman</surname> <given-names>Y. J.</given-names>
</name>
<name>
<surname>Stark</surname> <given-names>R.</given-names>
</name>
<name>
<surname>Rundquist</surname> <given-names>D.</given-names>
</name>
</person-group> (<year>2002</year>). <article-title>Novel algorithms for remote estimation of vegetation fraction</article-title>. <source>Remote Sens. Environ.</source> <volume>80</volume> (<issue>1</issue>), <fpage>76</fpage>&#x2013;<lpage>87</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/S0034-4257(01)00289-9</pub-id>
</citation>
</ref>
<ref id="B15">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gitelson</surname> <given-names>A. A.</given-names>
</name>
<name>
<surname>Merzlyak</surname> <given-names>M. N.</given-names>
</name>
</person-group> (<year>1994</year>). <article-title>Spectral reflectance changes associated with autumn senescence of Aesculus hippocastanum L. and Acer platanoides L. leaves. Spectral features and relation to chlorophyll estimation</article-title>. <source>J. Plant Physiol.</source> <volume>143</volume> (<issue>3</issue>), <fpage>286</fpage>&#x2013;<lpage>292</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/S0176-1617(11)81633-0</pub-id>
</citation>
</ref>
<ref id="B16">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Guo</surname> <given-names>Y.</given-names>
</name>
<name>
<surname>Chen</surname> <given-names>S.</given-names>
</name>
<name>
<surname>Li</surname> <given-names>X.</given-names>
</name>
<name>
<surname>Cunha</surname> <given-names>M.</given-names>
</name>
<name>
<surname>Jayavelu</surname> <given-names>S.</given-names>
</name>
<name>
<surname>Cammarano</surname> <given-names>D.</given-names>
</name>
<etal/>
</person-group>. (<year>2022</year>). <article-title>Machine learning-based approaches for predicting SPAD values of maize using multi-spectral images</article-title>. <source>Remote Sens.</source> <volume>14</volume> (<issue>6</issue>), <elocation-id>1337</elocation-id>. doi:&#xa0;<pub-id pub-id-type="doi">10.3390/rs14061337</pub-id>
</citation>
</ref>
<ref id="B17">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Haboudane</surname> <given-names>D.</given-names>
</name>
</person-group> (<year>2004</year>). <article-title>Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: modeling and validation in the context of precision agriculture</article-title>. <source>Remote Sens. Environ.</source> <volume>90</volume> (<issue>3</issue>), <fpage>337</fpage>&#x2013;<lpage>352</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.rse.2003.12.013</pub-id>
</citation>
</ref>
<ref id="B18">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Haralick</surname> <given-names>R. M.</given-names>
</name>
<name>
<surname>Shanmugam</surname> <given-names>K.</given-names>
</name>
<name>
<surname>Dinstein</surname> <given-names>I.</given-names>
</name>
</person-group> (<year>1973</year>). <article-title>Textural features for image classification</article-title>. <source>IEEE Trans. Syst. Man Cybern.</source> <volume>SMC-3</volume> (<issue>6</issue>), <fpage>610</fpage>&#x2013;<lpage>621</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1109/TSMC.1973.4309314</pub-id>
</citation>
</ref>
<ref id="B19">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>He</surname> <given-names>Y.</given-names>
</name>
<name>
<surname>Dong</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Liao</surname> <given-names>X.</given-names>
</name>
<name>
<surname>Sun</surname> <given-names>L.</given-names>
</name>
<name>
<surname>Wang</surname> <given-names>Z.</given-names>
</name>
<name>
<surname>You</surname> <given-names>N.</given-names>
</name>
<etal/>
</person-group>. (<year>2021</year>). <article-title>Examining rice distribution and cropping intensity in a mixed single- and double-cropping region in south China using all available Sentinel 1/2 images</article-title>. <source>Int. J. Appl. Earth Obs. Geoinf.</source> <volume>101</volume>, <elocation-id>102351</elocation-id>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.jag.2021.102351</pub-id>
</citation>
</ref>
<ref id="B20">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Huang</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Tian</surname> <given-names>L.</given-names>
</name>
<name>
<surname>Liang</surname> <given-names>S.</given-names>
</name>
<name>
<surname>Ma</surname> <given-names>H.</given-names>
</name>
<name>
<surname>Becker-Reshef</surname> <given-names>I.</given-names>
</name>
<name>
<surname>Huang</surname> <given-names>Y.</given-names>
</name>
<etal/>
</person-group>. (<year>2015</year>). <article-title>Improving winter wheat yield estimation by assimilation of the leaf area index from Landsat TM and MODIS data into the WOFOST model</article-title>. <source>Agric. For. Meteorol.</source> <volume>204</volume>, <fpage>106</fpage>&#x2013;<lpage>121</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.agrformet.2015.02.001</pub-id>
</citation>
</ref>
<ref id="B21">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Huete</surname> <given-names>A.</given-names>
</name>
</person-group> (<year>1988</year>). <article-title>A soil-adjusted vegetation index (SAVI)</article-title>. <source>Remote Sens. Environ.</source> <volume>25</volume> (<issue>3</issue>), <fpage>295</fpage>&#x2013;<lpage>309</lpage>. doi: <pub-id pub-id-type="doi">10.1016/0034-4257(88)90106-X</pub-id>
</citation>
</ref>
<ref id="B22">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Huete</surname> <given-names>A.</given-names>
</name>
<name>
<surname>Didan</surname> <given-names>K.</given-names>
</name>
<name>
<surname>Miura</surname> <given-names>T.</given-names>
</name>
<name>
<surname>Rodriguez</surname> <given-names>E. P.</given-names>
</name>
<name>
<surname>Gao</surname> <given-names>X.</given-names>
</name>
<name>
<surname>Ferreira</surname> <given-names>L. G.</given-names>
</name>
</person-group> (<year>2002</year>). <article-title>Overview of the radiometric and biophysical performance of the MODIS vegetation indices</article-title>. <source>Remote Sens. Environ.</source> <volume>83</volume> (<issue>1-2</issue>), <fpage>195</fpage>&#x2013;<lpage>213</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/S0034-4257(02)00096-2</pub-id>
</citation>
</ref>
<ref id="B23">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Jeong</surname> <given-names>S.</given-names>
</name>
<name>
<surname>Ko</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Yeom</surname> <given-names>J.</given-names>
</name>
</person-group> (<year>2022</year>). <article-title>Predicting rice yield at pixel scale through synthetic use of crop and deep learning models with satellite data in South and north Korea</article-title>. <source>Sci. Total Environ.</source> <volume>802</volume>, <elocation-id>149726</elocation-id>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.scitotenv.2021.149726</pub-id>
</citation>
</ref>
<ref id="B24">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Jiang</surname> <given-names>Z.</given-names>
</name>
<name>
<surname>Huete</surname> <given-names>A.</given-names>
</name>
<name>
<surname>Didan</surname> <given-names>K.</given-names>
</name>
<name>
<surname>Miura</surname> <given-names>T.</given-names>
</name>
</person-group> (<year>2008</year>). <article-title>Development of a two-band enhanced vegetation index without a blue band</article-title>. <source>Remote Sens. Environ.</source> <volume>112</volume> (<issue>10</issue>), <fpage>3833</fpage>&#x2013;<lpage>3845</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.rse.2008.06.006</pub-id>
</citation>
</ref>
<ref id="B25">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Khaki</surname> <given-names>S.</given-names>
</name>
<name>
<surname>Wang</surname> <given-names>L.</given-names>
</name>
</person-group> (<year>2019</year>). <article-title>Crop yield prediction using deep neural networks</article-title>. <source>Front. Plant Sci.</source> <volume>10</volume>, <elocation-id>621</elocation-id>. doi: <pub-id pub-id-type="doi">10.3389/fpls.2019.00621</pub-id>
</citation>
</ref>
<ref id="B26">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Khaki</surname> <given-names>S.</given-names>
</name>
<name>
<surname>Wang</surname> <given-names>L.</given-names>
</name>
<name>
<surname>Archontoulis</surname> <given-names>S. V.</given-names>
</name>
</person-group> (<year>2020</year>). <article-title>A CNN-RNN framework for crop yield prediction</article-title>. <source>Front. Plant Sci.</source> <volume>10</volume>. doi:&#xa0;<pub-id pub-id-type="doi">10.3389/fpls.2019.01750</pub-id>
</citation>
</ref>
<ref id="B27">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lambert</surname> <given-names>M.</given-names>
</name>
<name>
<surname>Traor&#xe9;</surname> <given-names>P. C. S.</given-names>
</name>
<name>
<surname>Blaes</surname> <given-names>X.</given-names>
</name>
<name>
<surname>Baret</surname> <given-names>P.</given-names>
</name>
<name>
<surname>Defourny</surname> <given-names>P.</given-names>
</name>
</person-group> (<year>2018</year>). <article-title>Estimating smallholder crops production at village level from Sentinel-2 time series in Mali&#x2019;s cotton belt</article-title>. <source>Remote Sens. Environ.</source> <volume>216</volume>, <fpage>647</fpage>&#x2013;<lpage>657</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.rse.2018.06.036</pub-id>
</citation>
</ref>
<ref id="B28">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Li</surname> <given-names>Z.</given-names>
</name>
<name>
<surname>Ding</surname> <given-names>L.</given-names>
</name>
<name>
<surname>Xu</surname> <given-names>D.</given-names>
</name>
</person-group> (<year>2022</year>). <article-title>Exploring the potential role of environmental and multi-source satellite data in crop yield prediction across northeast China</article-title>. <source>Sci. Total Environ.</source> <volume>815</volume>, <elocation-id>152880</elocation-id>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.scitotenv.2021.152880</pub-id>
</citation>
</ref>
<ref id="B29">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Li</surname> <given-names>D.</given-names>
</name>
<name>
<surname>Miao</surname> <given-names>Y.</given-names>
</name>
<name>
<surname>Gupta</surname> <given-names>S. K.</given-names>
</name>
<name>
<surname>Rosen</surname> <given-names>C. J.</given-names>
</name>
<name>
<surname>Yuan</surname> <given-names>F.</given-names>
</name>
<name>
<surname>Wang</surname> <given-names>C.</given-names>
</name>
<etal/>
</person-group>. (<year>2021</year>). <article-title>Improving potato yield prediction by combining cultivar information and UAV remote sensing data using machine learning</article-title>. <source>Remote Sens.</source> <volume>13</volume> (<issue>16</issue>), <elocation-id>3322</elocation-id>. doi:&#xa0;<pub-id pub-id-type="doi">10.3390/rs13163322</pub-id>
</citation>
</ref>
<ref id="B30">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Li</surname> <given-names>B.</given-names>
</name>
<name>
<surname>Xu</surname> <given-names>X.</given-names>
</name>
<name>
<surname>Zhang</surname> <given-names>L.</given-names>
</name>
<name>
<surname>Han</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Bian</surname> <given-names>C.</given-names>
</name>
<name>
<surname>Li</surname> <given-names>G.</given-names>
</name>
<etal/>
</person-group>. (<year>2020</year>). <article-title>Above-ground biomass estimation and yield prediction in potato by using UAV-based RGB and hyperspectral imaging</article-title>. <source>ISPRS-J. Photogramm. Remote Sens.</source> <volume>162</volume>, <fpage>161</fpage>&#x2013;<lpage>172</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.isprsjprs.2020.02.013</pub-id>
</citation>
</ref>
<ref id="B31">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lobell</surname> <given-names>D. B.</given-names>
</name>
<name>
<surname>Thau</surname> <given-names>D.</given-names>
</name>
<name>
<surname>Seifert</surname> <given-names>C.</given-names>
</name>
<name>
<surname>Engle</surname> <given-names>E.</given-names>
</name>
<name>
<surname>Little</surname> <given-names>B.</given-names>
</name>
</person-group> (<year>2015</year>). <article-title>A scalable satellite-based crop yield mapper</article-title>. <source>Remote Sens. Environ.</source> <volume>164</volume>, <fpage>324</fpage>&#x2013;<lpage>333</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.rse.2015.04.021</pub-id>
</citation>
</ref>
<ref id="B32">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Louhaichi</surname> <given-names>M.</given-names>
</name>
<name>
<surname>Borman</surname> <given-names>M. M.</given-names>
</name>
<name>
<surname>Johnson</surname> <given-names>D. E.</given-names>
</name>
</person-group> (<year>2001</year>). <article-title>Spatially located platform and aerial photography for documentation of grazing impacts on wheat</article-title>. <source>Geocarto Int.</source> <volume>16</volume> (<issue>1</issue>), <fpage>65</fpage>&#x2013;<lpage>70</lpage>. doi: <pub-id pub-id-type="doi">10.1080/10106040108542184</pub-id>
</citation>
</ref>
<ref id="B33">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Maes</surname> <given-names>W. H.</given-names>
</name>
<name>
<surname>Steppe</surname> <given-names>K.</given-names>
</name>
</person-group> (<year>2019</year>). <article-title>Perspectives for remote sensing with unmanned aerial vehicles in precision agriculture</article-title>. <source>Trends Plant Sci.</source> <volume>24</volume> (<issue>2</issue>), <fpage>152</fpage>&#x2013;<lpage>164</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.tplants.2018.11.007</pub-id>
</citation>
</ref>
<ref id="B34">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Maimaitijiang</surname> <given-names>M.</given-names>
</name>
<name>
<surname>Ghulam</surname> <given-names>A.</given-names>
</name>
<name>
<surname>Sidike</surname> <given-names>P.</given-names>
</name>
<name>
<surname>Hartling</surname> <given-names>S.</given-names>
</name>
<name>
<surname>Maimaitiyiming</surname> <given-names>M.</given-names>
</name>
<name>
<surname>Peterson</surname> <given-names>K.</given-names>
</name>
<etal/>
</person-group>. (<year>2017</year>). <article-title>Unmanned aerial system (UAS)-based phenotyping of soybean using multi-sensor data fusion and extreme learning machine</article-title>. <source>ISPRS-J. Photogramm. Remote Sens.</source> <volume>134</volume>, <fpage>43</fpage>&#x2013;<lpage>58</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.isprsjprs.2017.10.011</pub-id>
</citation>
</ref>
<ref id="B35">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Maimaitijiang</surname> <given-names>M.</given-names>
</name>
<name>
<surname>Sagan</surname> <given-names>V.</given-names>
</name>
<name>
<surname>Sidike</surname> <given-names>P.</given-names>
</name>
<name>
<surname>Hartling</surname> <given-names>S.</given-names>
</name>
<name>
<surname>Esposito</surname> <given-names>F.</given-names>
</name>
<name>
<surname>Fritschi</surname> <given-names>F. B.</given-names>
</name>
</person-group> (<year>2020</year>). <article-title>Soybean yield prediction from UAV using multimodal data fusion and deep learning</article-title>. <source>Remote Sens. Environ.</source> <volume>237</volume>, <elocation-id>111599</elocation-id>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.rse.2019.111599</pub-id>
</citation>
</ref>
<ref id="B36">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Maimaitijiang</surname> <given-names>M.</given-names>
</name>
<name>
<surname>Sagan</surname> <given-names>V.</given-names>
</name>
<name>
<surname>Sidike</surname> <given-names>P.</given-names>
</name>
<name>
<surname>Maimaitiyiming</surname> <given-names>M.</given-names>
</name>
<name>
<surname>Hartling</surname> <given-names>S.</given-names>
</name>
<name>
<surname>Peterson</surname> <given-names>K. T.</given-names>
</name>
<etal/>
</person-group>. (<year>2019</year>). <article-title>Vegetation index weighted canopy volume model (CVMVI) for soybean biomass estimation from unmanned aerial system-based RGB imagery</article-title>. <source>ISPRS-J. Photogramm. Remote Sens.</source> <volume>151</volume>, <fpage>27</fpage>&#x2013;<lpage>41</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.isprsjprs.2019.03.003</pub-id>
</citation>
</ref>
<ref id="B37">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Maresma</surname> <given-names>&#xc1;.</given-names>
</name>
<name>
<surname>Ariza</surname> <given-names>M.</given-names>
</name>
<name>
<surname>Mart&#xed;nez</surname> <given-names>E.</given-names>
</name>
<name>
<surname>Lloveras</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Mart&#xed;nez-Casasnovas</surname> <given-names>J.</given-names>
</name>
</person-group> (<year>2016</year>). <article-title>Analysis of vegetation indices to determine nitrogen application and yield prediction in maize (Zea mays L.) from a standard UAV service</article-title>. <source>Remote Sens.</source> <volume>8</volume> (<issue>12</issue>), <elocation-id>973</elocation-id>. doi:&#xa0;<pub-id pub-id-type="doi">10.3390/rs8120973</pub-id>
</citation>
</ref>
<ref id="B38">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Marques Ramos</surname> <given-names>A. P.</given-names>
</name>
<name>
<surname>Prado Osco</surname> <given-names>L.</given-names>
</name>
<name>
<surname>Elis Garcia Furuya</surname> <given-names>D.</given-names>
</name>
<name>
<surname>Nunes Gon&#xe7;alves</surname> <given-names>W.</given-names>
</name>
<name>
<surname>Cordeiro Santana</surname> <given-names>D.</given-names>
</name>
<name>
<surname>Pereira Ribeiro Teodoro</surname> <given-names>L.</given-names>
</name>
<etal/>
</person-group>. (<year>2020</year>). <article-title>A random forest ranking approach to predict yield in maize with UAV-based vegetation spectral indices</article-title>. <source>Comput. Electron. Agric.</source> <volume>178</volume>, <elocation-id>105791</elocation-id>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.compag.2020.105791</pub-id>
</citation>
</ref>
<ref id="B39">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Meyer</surname> <given-names>G. E.</given-names>
</name>
<name>
<surname>Neto</surname> <given-names>J. C.</given-names>
</name>
</person-group> (<year>2008</year>). <article-title>Verification of color vegetation indices for automated crop imaging applications</article-title>. <source>Comput. Electron. Agric.</source> <volume>63</volume> (<issue>2</issue>), <fpage>282</fpage>&#x2013;<lpage>293</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.compag.2008.03.009</pub-id>
</citation>
</ref>
<ref id="B40">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Misiou</surname> <given-names>O.</given-names>
</name>
<name>
<surname>Koutsoumanis</surname> <given-names>K.</given-names>
</name>
</person-group> (<year>2022</year>). <article-title>Climate change and its implications for food safety and spoilage</article-title>. <source>Trends Food Sci. Technol.</source> <volume>126</volume>, <fpage>142</fpage>&#x2013;<lpage>152</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.tifs.2021.03.031</pub-id>
</citation>
</ref>
<ref id="B41">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Mora</surname> <given-names>C.</given-names>
</name>
<name>
<surname>Spirandelli</surname> <given-names>D.</given-names>
</name>
<name>
<surname>Franklin</surname> <given-names>E. C.</given-names>
</name>
<name>
<surname>Lynham</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Kantar</surname> <given-names>M. B.</given-names>
</name>
<name>
<surname>Miles</surname> <given-names>W.</given-names>
</name>
<etal/>
</person-group>. (<year>2018</year>). <article-title>Broad threat to humanity from cumulative climate hazards intensified by greenhouse gas emissions</article-title>. <source>Nat. Clim. Change</source> <volume>8</volume> (<issue>12</issue>), <fpage>1062</fpage>&#x2013;<lpage>1071</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1038/s41558-018-0315-6</pub-id>
</citation>
</ref>
<ref id="B42">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Mountrakis</surname> <given-names>G.</given-names>
</name>
<name>
<surname>Im</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Ogole</surname> <given-names>C.</given-names>
</name>
</person-group> (<year>2011</year>). <article-title>Support vector machines in remote sensing: a review</article-title>. <source>ISPRS-J. Photogramm. Remote Sens.</source> <volume>66</volume> (<issue>3</issue>), <fpage>247</fpage>&#x2013;<lpage>259</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.isprsjprs.2010.11.001</pub-id>
</citation>
</ref>
<ref id="B43">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Qader</surname> <given-names>S. H.</given-names>
</name>
<name>
<surname>Utazi</surname> <given-names>C. E.</given-names>
</name>
<name>
<surname>Priyatikanto</surname> <given-names>R.</given-names>
</name>
<name>
<surname>Najmaddin</surname> <given-names>P.</given-names>
</name>
<name>
<surname>Hama-Ali</surname> <given-names>E. O.</given-names>
</name>
<name>
<surname>Khwarahm</surname> <given-names>N. R.</given-names>
</name>
<etal/>
</person-group>. (<year>2023</year>). <article-title>Exploring the use of Sentinel-2 datasets and environmental variables to model wheat crop yield in smallholder arid and semi-arid farming systems</article-title>. <source>Sci. Total Environ.</source> <volume>869</volume>, <elocation-id>161716</elocation-id>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.scitotenv.2023.161716</pub-id>
</citation>
</ref>
<ref id="B44">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Rischbeck</surname> <given-names>P.</given-names>
</name>
<name>
<surname>Elsayed</surname> <given-names>S.</given-names>
</name>
<name>
<surname>Mistele</surname> <given-names>B.</given-names>
</name>
<name>
<surname>Barmeier</surname> <given-names>G.</given-names>
</name>
<name>
<surname>Heil</surname> <given-names>K.</given-names>
</name>
<name>
<surname>Schmidhalter</surname> <given-names>U.</given-names>
</name>
</person-group> (<year>2016</year>). <article-title>Data fusion of spectral, thermal and canopy height parameters for improved yield prediction of drought stressed spring barley</article-title>. <source>Eur. J. Agron.</source> <volume>78</volume>, <fpage>44</fpage>&#x2013;<lpage>59</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.eja.2016.04.013</pub-id>
</citation>
</ref>
<ref id="B45">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Rouse</surname> <given-names>J. W.</given-names>
<suffix>Jr.</suffix>
</name>
<name>
<surname>Haas</surname> <given-names>R. H.</given-names>
</name>
<name>
<surname>Deering</surname> <given-names>D. W.</given-names>
</name>
<name>
<surname>Schell</surname> <given-names>J. A.</given-names>
</name>
<name>
<surname>Harlan</surname> <given-names>J. C.</given-names>
</name>
</person-group> (<year>1974</year>). <article-title>Monitoring the vernal advancement and retrogradation (green wave effect) of natural vegetation</article-title>. <source>NASA/GSFC Type III Final Report</source> (<publisher-loc>Greenbelt, MD</publisher-loc>: <publisher-name>NASA Goddard Space Flight Center</publisher-name>).</citation>
</ref>
<ref id="B46">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Sagan</surname> <given-names>V.</given-names>
</name>
<name>
<surname>Maimaitijiang</surname> <given-names>M.</given-names>
</name>
<name>
<surname>Bhadra</surname> <given-names>S.</given-names>
</name>
<name>
<surname>Maimaitiyiming</surname> <given-names>M.</given-names>
</name>
<name>
<surname>Brown</surname> <given-names>D. R.</given-names>
</name>
<name>
<surname>Sidike</surname> <given-names>P.</given-names>
</name>
<etal/>
</person-group>. (<year>2021</year>). <article-title>Field-scale crop yield prediction using multi-temporal Worldview-3 and Planetscope satellite data and deep learning</article-title>. <source>ISPRS-J. Photogramm. Remote Sens.</source> <volume>174</volume>, <fpage>265</fpage>&#x2013;<lpage>281</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.isprsjprs.2021.02.008</pub-id>
</citation>
</ref>
<ref id="B47">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Sakamoto</surname> <given-names>T.</given-names>
</name>
</person-group> (<year>2020</year>). <article-title>Incorporating environmental variables into a MODIS-based crop yield estimation method for United States corn and soybeans through the use of a random forest regression algorithm</article-title>. <source>ISPRS-J. Photogramm. Remote Sens.</source> <volume>160</volume>, <fpage>208</fpage>&#x2013;<lpage>228</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.isprsjprs.2019.12.012</pub-id>
</citation>
</ref>
<ref id="B48">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Shahhosseini</surname> <given-names>M.</given-names>
</name>
<name>
<surname>Hu</surname> <given-names>G.</given-names>
</name>
<name>
<surname>Archontoulis</surname> <given-names>S. V.</given-names>
</name>
</person-group> (<year>2020</year>). <article-title>Forecasting corn yield with machine learning ensembles</article-title>. <source>Front. Plant Sci.</source> <volume>11</volume>. doi:&#xa0;<pub-id pub-id-type="doi">10.3389/fpls.2020.01120</pub-id>
</citation>
</ref>
<ref id="B49">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Shu</surname> <given-names>M.</given-names>
</name>
<name>
<surname>Shen</surname> <given-names>M.</given-names>
</name>
<name>
<surname>Dong</surname> <given-names>Q.</given-names>
</name>
<name>
<surname>Yang</surname> <given-names>X.</given-names>
</name>
<name>
<surname>Li</surname> <given-names>B.</given-names>
</name>
<name>
<surname>Ma</surname> <given-names>Y.</given-names>
</name>
</person-group> (<year>2022</year>). <article-title>Estimating maize above-ground biomass by constructing the tridimensional concept model based on UAV-based digital and multi-spectral images</article-title>. <source>Field Crop Res.</source> <volume>282</volume>, <elocation-id>108491</elocation-id>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.fcr.2022.108491</pub-id>
</citation>
</ref>
<ref id="B50">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Su</surname> <given-names>B.</given-names>
</name>
<name>
<surname>Huang</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Fischer</surname> <given-names>T.</given-names>
</name>
<name>
<surname>Wang</surname> <given-names>Y.</given-names>
</name>
<name>
<surname>Kundzewicz</surname> <given-names>Z. W.</given-names>
</name>
<name>
<surname>Zhai</surname> <given-names>J.</given-names>
</name>
<etal/>
</person-group>. (<year>2018</year>). <article-title>Drought losses in China might double between the 1.5&#xb0;C and 2.0&#xb0;C warming</article-title>. <source>Proc. Natl. Acad. Sci.</source> <volume>115</volume> (<issue>42</issue>), <fpage>10600</fpage>&#x2013;<lpage>10605</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1073/pnas.1802129115</pub-id>
</citation>
</ref>
<ref id="B51">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Sulik</surname> <given-names>J. J.</given-names>
</name>
<name>
<surname>Long</surname> <given-names>D. S.</given-names>
</name>
</person-group> (<year>2016</year>). <article-title>Spectral considerations for modeling yield of canola</article-title>. <source>Remote Sens. Environ.</source> <volume>184</volume>, <fpage>161</fpage>&#x2013;<lpage>174</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.rse.2016.06.016</pub-id>
</citation>
</ref>
<ref id="B52">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Tanabe</surname> <given-names>R.</given-names>
</name>
<name>
<surname>Matsui</surname> <given-names>T.</given-names>
</name>
<name>
<surname>Tanaka</surname> <given-names>T. S. T.</given-names>
</name>
</person-group> (<year>2023</year>). <article-title>Winter wheat yield prediction using convolutional neural networks and UAV-based multispectral imagery</article-title>. <source>Field Crop Res.</source> <volume>291</volume>, <elocation-id>108786</elocation-id>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.fcr.2022.108786</pub-id>
</citation>
</ref>
<ref id="B53">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>van Klompenburg</surname> <given-names>T.</given-names>
</name>
<name>
<surname>Kassahun</surname> <given-names>A.</given-names>
</name>
<name>
<surname>Catal</surname> <given-names>C.</given-names>
</name>
</person-group> (<year>2020</year>). <article-title>Crop yield prediction using machine learning: a systematic literature review</article-title>. <source>Comput. Electron. Agric.</source> <volume>177</volume>, <elocation-id>105709</elocation-id>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.compag.2020.105709</pub-id>
</citation>
</ref>
<ref id="B54">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Walter</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Edwards</surname> <given-names>J.</given-names>
</name>
<name>
<surname>McDonald</surname> <given-names>G.</given-names>
</name>
<name>
<surname>Kuchel</surname> <given-names>H.</given-names>
</name>
</person-group> (<year>2018</year>). <article-title>Photogrammetry for the estimation of wheat biomass and harvest index</article-title>. <source>Field Crop Res.</source> <volume>216</volume>, <fpage>165</fpage>&#x2013;<lpage>174</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.fcr.2017.11.024</pub-id>
</citation>
</ref>
<ref id="B55">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Wan</surname> <given-names>L.</given-names>
</name>
<name>
<surname>Cen</surname> <given-names>H.</given-names>
</name>
<name>
<surname>Zhu</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Zhang</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Zhu</surname> <given-names>Y.</given-names>
</name>
<name>
<surname>Sun</surname> <given-names>D.</given-names>
</name>
<etal/>
</person-group>. (<year>2020</year>). <article-title>Grain yield prediction of rice using multi-temporal UAV-based RGB and multispectral images and model transfer &#x2013; a case study of small farmlands in the South of China</article-title>. <source>Agric. For. Meteorol.</source> <volume>291</volume>, <elocation-id>108096</elocation-id>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.agrformet.2020.108096</pub-id>
</citation>
</ref>
<ref id="B56">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Wang</surname> <given-names>F.</given-names>
</name>
<name>
<surname>Yi</surname> <given-names>Q.</given-names>
</name>
<name>
<surname>Hu</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Xie</surname> <given-names>L.</given-names>
</name>
<name>
<surname>Yao</surname> <given-names>X.</given-names>
</name>
<name>
<surname>Xu</surname> <given-names>T.</given-names>
</name>
<etal/>
</person-group>. (<year>2021</year>). <article-title>Combining spectral and textural information in UAV hyperspectral images to estimate rice grain yield</article-title>. <source>Int. J. Appl. Earth Obs. Geoinf.</source> <volume>102</volume>, <elocation-id>102397</elocation-id>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.jag.2021.102397</pub-id>
</citation>
</ref>
<ref id="B57">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Weiss</surname> <given-names>M.</given-names>
</name>
<name>
<surname>Jacob</surname> <given-names>F.</given-names>
</name>
<name>
<surname>Duveiller</surname> <given-names>G.</given-names>
</name>
</person-group> (<year>2020</year>). <article-title>Remote sensing for agricultural applications: a meta-review</article-title>. <source>Remote Sens. Environ.</source> <volume>236</volume>, <elocation-id>111402</elocation-id>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.rse.2019.111402</pub-id>
</citation>
</ref>
<ref id="B58">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Woebbecke</surname> <given-names>D. M.</given-names>
</name>
<name>
<surname>Meyer</surname> <given-names>G. E.</given-names>
</name>
<name>
<surname>Von Bargen</surname> <given-names>K.</given-names>
</name>
<name>
<surname>Mortensen</surname> <given-names>D. A.</given-names>
</name>
</person-group> (<year>1995</year>). <article-title>Color indices for weed identification under various soil, residue, and lighting conditions</article-title>. <source>Trans. ASAE.</source> <volume>38</volume> (<issue>1</issue>), <fpage>259</fpage>&#x2013;<lpage>269</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.13031/2013.27838</pub-id>
</citation>
</ref>
<ref id="B59">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Wu</surname> <given-names>C.</given-names>
</name>
<name>
<surname>Niu</surname> <given-names>Z.</given-names>
</name>
<name>
<surname>Tang</surname> <given-names>Q.</given-names>
</name>
<name>
<surname>Huang</surname> <given-names>W.</given-names>
</name>
</person-group> (<year>2008</year>). <article-title>Estimating chlorophyll content from hyperspectral vegetation indices: modeling and validation</article-title>. <source>Agric. For. Meteorol.</source> <volume>148</volume> (<issue>8-9</issue>), <fpage>1230</fpage>&#x2013;<lpage>1241</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.agrformet.2008.03.005</pub-id>
</citation>
</ref>
<ref id="B60">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Xia</surname> <given-names>F.</given-names>
</name>
<name>
<surname>Quan</surname> <given-names>L.</given-names>
</name>
<name>
<surname>Lou</surname> <given-names>Z.</given-names>
</name>
<name>
<surname>Sun</surname> <given-names>D.</given-names>
</name>
<name>
<surname>Li</surname> <given-names>H.</given-names>
</name>
<name>
<surname>Lv</surname> <given-names>X.</given-names>
</name>
</person-group> (<year>2022</year>). <article-title>Identification and comprehensive evaluation of resistant weeds using unmanned aerial vehicle-based multispectral imagery</article-title>. <source>Front. Plant Sci.</source> <volume>13</volume>. doi:&#xa0;<pub-id pub-id-type="doi">10.3389/fpls.2022.938604</pub-id>
</citation>
</ref>
<ref id="B61">
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Xin</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Bo</surname> <given-names>C.</given-names>
</name>
<name>
<surname>Zhao</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Wu</surname> <given-names>Y.</given-names>
</name>
<name>
<surname>Li</surname> <given-names>H.</given-names>
</name>
<name>
<surname>Su</surname> <given-names>J.</given-names>
</name>
<etal/>
</person-group>. (<year>2008</year>). <source>GB 4404.1-2008 Seed of food crops-Part 1: Cereals</source> (<publisher-loc>Beijing</publisher-loc>: <publisher-name>Standards Press of China</publisher-name>).</citation>
</ref>
<ref id="B62">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Xu</surname> <given-names>W.</given-names>
</name>
<name>
<surname>Chen</surname> <given-names>P.</given-names>
</name>
<name>
<surname>Zhan</surname> <given-names>Y.</given-names>
</name>
<name>
<surname>Chen</surname> <given-names>S.</given-names>
</name>
<name>
<surname>Zhang</surname> <given-names>L.</given-names>
</name>
<name>
<surname>Lan</surname> <given-names>Y.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>Cotton yield estimation model based on machine learning using time series UAV remote sensing data</article-title>. <source>Int. J. Appl. Earth Obs. Geoinf.</source> <volume>104</volume>, <elocation-id>102511</elocation-id>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.jag.2021.102511</pub-id>
</citation>
</ref>
<ref id="B63">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Yan</surname> <given-names>G.</given-names>
</name>
<name>
<surname>Li</surname> <given-names>L.</given-names>
</name>
<name>
<surname>Coy</surname> <given-names>A.</given-names>
</name>
<name>
<surname>Mu</surname> <given-names>X.</given-names>
</name>
<name>
<surname>Chen</surname> <given-names>S.</given-names>
</name>
<name>
<surname>Xie</surname> <given-names>D.</given-names>
</name>
<etal/>
</person-group>. (<year>2019</year>). <article-title>Improving the estimation of fractional vegetation cover from UAV RGB imagery by colour unmixing</article-title>. <source>ISPRS-J. Photogramm. Remote Sens.</source> <volume>158</volume>, <fpage>23</fpage>&#x2013;<lpage>34</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.isprsjprs.2019.09.017</pub-id>
</citation>
</ref>
<ref id="B64">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Yang</surname> <given-names>Q.</given-names>
</name>
<name>
<surname>Shi</surname> <given-names>L.</given-names>
</name>
<name>
<surname>Han</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Zha</surname> <given-names>Y.</given-names>
</name>
<name>
<surname>Zhu</surname> <given-names>P.</given-names>
</name>
</person-group> (<year>2019</year>). <article-title>Deep convolutional neural networks for rice grain yield estimation at the ripening stage using UAV-based remotely sensed images</article-title>. <source>Field Crop Res.</source> <volume>235</volume>, <fpage>142</fpage>&#x2013;<lpage>153</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.fcr.2019.02.022</pub-id>
</citation>
</ref>
<ref id="B65">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Yue</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Tian</surname> <given-names>Q.</given-names>
</name>
</person-group> (<year>2020</year>). <article-title>Estimating fractional cover of crop, crop residue, and soil in cropland using broadband remote sensing data and machine learning</article-title>. <source>Int. J. Appl. Earth Obs. Geoinf.</source> <volume>89</volume>, <elocation-id>102089</elocation-id>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.jag.2020.102089</pub-id>
</citation>
</ref>
<ref id="B66">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Zeng</surname> <given-names>Y.</given-names>
</name>
<name>
<surname>Hao</surname> <given-names>D.</given-names>
</name>
<name>
<surname>Huete</surname> <given-names>A.</given-names>
</name>
<name>
<surname>Dechant</surname> <given-names>B.</given-names>
</name>
<name>
<surname>Berry</surname> <given-names>J.</given-names>
</name>
<name>
<surname>Chen</surname> <given-names>J. M.</given-names>
</name>
<etal/>
</person-group>. (<year>2022</year>). <article-title>Optical vegetation indices for monitoring terrestrial ecosystems globally</article-title>. <source>Nat. Rev. Earth Environ.</source> <volume>3</volume>, <fpage>477</fpage>&#x2013;<lpage>493</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1038/s43017-022-00298-5</pub-id>
</citation>
</ref>
<ref id="B67">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Zhang</surname> <given-names>Y.</given-names>
</name>
<name>
<surname>Xia</surname> <given-names>C.</given-names>
</name>
<name>
<surname>Zhang</surname> <given-names>X.</given-names>
</name>
<name>
<surname>Cheng</surname> <given-names>X.</given-names>
</name>
<name>
<surname>Feng</surname> <given-names>G.</given-names>
</name>
<name>
<surname>Wang</surname> <given-names>Y.</given-names>
</name>
<etal/>
</person-group>. (<year>2021</year>). <article-title>Estimating the maize biomass by crop height and narrowband vegetation indices derived from UAV-based hyperspectral images</article-title>. <source>Ecol. Indic.</source> <volume>129</volume>, <elocation-id>107985</elocation-id>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.ecolind.2021.107985</pub-id>
</citation>
</ref>
<ref id="B68">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Zhou</surname> <given-names>X.</given-names>
</name>
<name>
<surname>Zheng</surname> <given-names>H. B.</given-names>
</name>
<name>
<surname>Xu</surname> <given-names>X. Q.</given-names>
</name>
<name>
<surname>He</surname> <given-names>J. Y.</given-names>
</name>
<name>
<surname>Ge</surname> <given-names>X. K.</given-names>
</name>
<name>
<surname>Yao</surname> <given-names>X.</given-names>
</name>
<etal/>
</person-group>. (<year>2017</year>). <article-title>Predicting grain yield in rice using multi-temporal vegetation indices from UAV-based multispectral and digital imagery</article-title>. <source>ISPRS-J. Photogramm. Remote Sens.</source> <volume>130</volume>, <fpage>246</fpage>&#x2013;<lpage>255</lpage>. doi:&#xa0;<pub-id pub-id-type="doi">10.1016/j.isprsjprs.2017.05.003</pub-id>
</citation>
</ref>
</ref-list>
</back>
</article>