REVIEW article

Front. Plant Sci., 09 April 2021

Sec. Technical Advances in Plant Science

Volume 12 - 2021 | https://doi.org/10.3389/fpls.2021.616689

Applications of UAS in Crop Biomass Monitoring: A Review

  • 1. College of Mechanical Engineering, Guangxi University, Nanning, China

  • 2. College of Civil Engineering and Architecture, Guangxi University, Nanning, China

  • 3. Guangdong Laboratory of Lingnan Modern Agriculture, Shenzhen, Genome Analysis Laboratory of the Ministry of Agriculture and Rural Area, Agricultural Genomics Institute at Shenzhen, Chinese Academy of Agricultural Sciences, Shenzhen, China

  • 4. Guangzhou Key Laboratory of Agricultural Products Quality & Safety Traceability Information Technology, Zhongkai University of Agriculture and Engineering, Guangzhou, China


Abstract

Biomass is an important indicator for evaluating crops. The rapid, accurate and nondestructive monitoring of biomass is the key to smart agriculture and precision agriculture. Traditional detection methods are based on destructive measurements. Although satellite remote sensing, manned airborne equipment, and vehicle-mounted equipment can nondestructively collect measurements, they are limited by low accuracy, poor flexibility, and high cost. As nondestructive remote sensing equipment with high precision, high flexibility, and low cost, unmanned aerial systems (UAS) have been widely used to monitor crop biomass. In this review, UAS platforms and sensors, biomass indices, and data analysis methods are presented. The improvements of UAS in monitoring crop biomass in recent years are introduced, and multisensor fusion, multi-index fusion, the consideration of features not directly related to monitoring biomass, the adoption of advanced algorithms and the use of low-cost sensors are reviewed to highlight the potential for monitoring crop biomass with UAS. Considering the progress made to solve this type of problem, we also suggest some directions for future research. Furthermore, it is expected that the challenges to the wider adoption of UAS will be overcome in the future, which is conducive to the realization of smart agriculture and precision agriculture.

Introduction

Agriculture plays an important role in maintaining all human activities. By 2050, population and socioeconomic growth are expected to double the current food demand (Niu et al., 2019). To solve the increasingly complex problems in the agricultural production system, the development of smart agriculture and precision agriculture provides important tools for meeting the challenges of sustainable agricultural development (Sharma et al., 2020). Biomass is a basic agronomic parameter in field investigations and is often used to indicate crop growth status, the effectiveness of agricultural management measures and the carbon sequestration ability of crops (Bendig et al., 2015; Li W. et al., 2015). Fast, accurate and nondestructive monitoring of biomass is the key to smart agriculture and precision agriculture (Lu et al., 2019; Yuan et al., 2019).

Traditional biomass measurement methods are based on destructive measurements that require the manual harvesting (Gnyp et al., 2014), weighing and recording of crops; these measurements are not only time-consuming and laborious but also difficult to apply over large areas and long periods (Boschetti et al., 2007; Yang et al., 2017). In other research areas, many studies have used satellite remote sensing to monitor biomass. Navarro et al. (2019) used Sentinel-1 and Sentinel-2 data to monitor the aboveground biomass (AGB) of a mangrove plantation. However, meteorological conditions have a great influence on satellite images, such as cloud and aerosol interference, surface glare and poor synchrony with tides (Tait et al., 2019). In addition, satellite data cannot provide sufficient data resolution for precision agricultural applications (Jiang et al., 2019; Song et al., 2020), and it is difficult to obtain timely and reliable data (Prey and Schmidhalter, 2019). Similar to satellite remote sensing, manned airborne equipment can cover a wide range, but the data are not detailed enough (Sofonia et al., 2019; ten Harkel et al., 2020). Meanwhile, although vehicle-mounted equipment can guarantee high accuracy, it has poor flexibility and slow speed (Selbeck et al., 2010; Tian et al., 2020). Unmanned aerial systems (UAS) represent a noncontact and nondestructive measurement method that can obtain the spectral, structural, and texture features of the target at different spatiotemporal scales (Jiang et al., 2019). These systems can obtain data with high spatial and temporal resolution and have great application potential (Niu et al., 2019; Ramon Saura et al., 2019).

To date, most reviews of UAS in the field of agriculture are general reviews involving multiple fields in agriculture, and the description of biomass monitoring is not detailed enough (Hassler and Baysal-Gurel, 2019; Kim et al., 2019; Maes and Steppe, 2019). Reviews of remote sensing for crop biomass monitoring are rare and mainly introduce satellite remote sensing, while the application of UAS in crop biomass monitoring is rarely introduced (Chao et al., 2019). Therefore, the motivation of our study was to conduct a comprehensive review of almost all UAS-related studies in the field of crop biomass monitoring, including information on the equipment used in the field of crop biomass monitoring, biomass indices, and data processing and analysis methods. Finally, the relevant applications are reviewed according to different development directions.

The Composition of UAS

Unmanned aerial systems consist of unmanned aerial vehicle (UAV) platforms, autopilot systems, navigation sensors, mechanical steering components, data acquisition sensors, and other components (Jeziorska, 2019), among which the most important are the data acquisition sensors (Toth and Jozkow, 2016). Meanwhile, the type of UAV platform and the flight conditions have a great impact on the sensors' data acquisition and need to be considered (Domingo et al., 2019; ten Harkel et al., 2020).

UAV Platforms

The most commonly used platforms in crop biomass monitoring are fixed-wing drones and rotor drones (Hassler and Baysal-Gurel, 2019). Hogan et al. (2017) summarized the characteristics of fixed-wing aircraft and rotorcraft. Fixed-wing aircraft usually have a larger payload capacity, faster flight speed, longer flight time, and longer range than rotorcraft. For these reasons, fixed-wing systems are particularly useful for collecting data over large areas. However, fixed-wing aircraft have poor mobility, need more space to land, and are more expensive than rotor UAVs. Rotor UAVs are very maneuverable and can hover, rotate and take pictures at almost any angle. Although there are also expensive models, many low-cost models are widely available on the market. Compared with fixed-wing aircraft, the main disadvantage of rotor UAVs is their short range and flight time. Figure 1 shows the DJI Inspire 2 rotor drone and the eBee X fixed-wing drone.

FIGURE 1

The flight planning of a fixed-wing UAV is very similar to that of a manned aircraft, while a rotor UAV can meet almost any trajectory requirement, including hover, slow motion and attitude control (Toth and Jozkow, 2016). These features enable rotor UAVs to perform extremely accurate tasks (Kim et al., 2019). Therefore, rotor UAVs are more commonly used in biomass monitoring than fixed-wing aircraft.

Data Acquisition Sensors

Unmanned aerial systems usually obtain data through spectral sensors and depth sensors (Toth and Jozkow, 2016). Spectral sensors mainly include RGB sensors, multispectral sensors, and hyperspectral sensors, which can obtain color and texture information from the crop surface (Li et al., 2019). The difference between these three types of sensors is their ability to sense the spectrum (Shentu et al., 2018; Zhong et al., 2018; Kelly et al., 2019). Light detection and ranging (LiDAR) is a typical example of a depth sensor and can clearly obtain the three-dimensional structure and height information of crops (Wijesingha et al., 2019). Figure 2 shows several UAV-mounted sensor types.

FIGURE 2

Spectral Sensors

Based on the same imaging principle, RGB, multispectral, and hyperspectral sensors all capture images by sensing spectral bands but differ in which spectral bands they can sense.

RGB sensors are a type of visible light camera that can detect three color bands: red (R), green (G), and blue (B) (Shentu et al., 2018). The data from the three bands represent the intensity of R, G, and B in each pixel (Tait et al., 2019). Although RGB sensors have lower accuracy than other sensors because they collect spectral data from only three bands, no other sensor type matches their low cost. Because the large-scale use of UAS for crop biomass monitoring must remain inexpensive, RGB sensors have received increasing attention (Calou et al., 2019; Lu et al., 2019; Yue et al., 2019). Combining RGB data with well-chosen biomass indices and advanced algorithms can achieve high accuracy at low cost (Acorsi et al., 2019; Lu et al., 2019; Yue et al., 2019). Therefore, in the field of crop biomass monitoring by UAS, RGB sensors play an irreplaceable role.

Since spectral information is lost during the process of color image recording, the use of RGB input obviously limits the amount of information that can be extracted from the highlighted area (Kelly et al., 2019). Compared with three-channel RGB imaging, multispectral images contain more imaging bands (Shentu et al., 2018).

Multispectral image data containing several near-infrared (NIR) spectral regions are superior to RGB data (Cen et al., 2019), but the disadvantage is that the cost of multispectral sensors is higher than that of RGB sensors (Costa et al., 2020). Different spectral bands can reflect the characteristics of different plants and can be used to effectively distinguish different crops (Xu et al., 2019). Song and Park (2020) used the RedEdge multispectral camera from MicaSense to analyze the spectral characteristics of aquatic plants and found that waterside plants exhibited the highest reflectivity in the NIR band, while floating plants had high reflectivity in the red-edge band.

Hyperspectral sensors can obtain more abundant spectral information than multispectral sensors (Zhong et al., 2018). Yue et al. (2018) used the UHD 185 Firefly (UHD 185 Firefly, Cubert GmbH, Ulm, Baden-Württemberg, Germany) hyperspectral sensor to collect panchromatic images with radiation records of 1000 × 1000 (1 band) and hyperspectral cubes of 50 × 50 (125 bands), with rich texture and spectral information. The disadvantage is that the cost of hyperspectral sensors is higher than that of multispectral sensors. In addition, the spatial resolution of hyperspectral images is lower than that of ordinary images, which may cause the loss of detail information for small targets. Meanwhile, more spectral information may not be useful in some cases. Tao et al. (2020) used hyperspectral sensors to study the correlation between different vegetation indices (VIs) and red-edge parameters and crop biomass. It was found that using too many spectral features as independent variables will lead to overfitting of the model, so it is necessary to use an appropriate number of spectral features that are highly related to biomass.

How to improve the reliability of spectral data is an unavoidable problem when using spectral sensors to collect data. First, image resolution affects the results of AGB monitoring: the higher the image resolution, the higher the prediction accuracy (Domingo et al., 2019). Yue et al. (2019) found that the best image resolution for estimating AGB from image texture information depends on the size and row spacing of the crop canopy. Second, fisheye lenses may have an advantage over flat lenses. Calou et al. (2019) mounted a 16-megapixel flat lens and a 12-megapixel fisheye lens on a UAV for data collection, and the results showed that the fisheye lens estimates were the most accurate at an altitude of 30 m. Finally, some current applications process spectral data with only rough calibration or none at all. Guo et al. (2019) proposed a general calibration equation suitable for images under clear sky conditions and even under a small amount of cloud, although the method needs further verification.

RGB sensors can collect spectral data from only the R, G, and B bands, but the equipment is inexpensive; hyperspectral sensors can collect spectral information from many bands, but the equipment is expensive. Unless an application specifically requires detailed hyperspectral images, a multispectral sensor, which balances the richness of spectral bands against equipment cost, offers the best cost performance and should be the default imaging choice (Hassler and Baysal-Gurel, 2019; Niu et al., 2019).

Light Detection and Ranging

Spectral data have poor robustness in the case of target overlap, occlusion, large illumination changes, shadows, and complex scenes. Depth data that do not change with brightness and color can provide additional useful information for complex scenes (Shuqin et al., 2016). At present, the depth sensors on UAV platforms are mainly LiDAR (Wallace et al., 2012; Qiu et al., 2019; Tian et al., 2019; Wang D.Z. et al., 2019; Yan et al., 2020). LiDAR has become an important information source for the evaluation of the vegetation canopy structure, which is especially suitable for species that limit artificial and destructive sampling (Brede et al., 2017). Figure 3 shows a schematic illustration of the difference between LiDAR and spectral data (Zhu Y.H. et al., 2019).

FIGURE 3

Light detection and ranging is an active remote sensing technology that accurately measures distance by emitting laser pulses and analyzing the returned energy (Calders et al., 2015). With developments in global positioning systems (GPS), inertial measurement units (IMUs), lasers, and computing technology, LiDAR has become cheaper and more accurate, making LiDAR systems based on UAV platforms feasible (Lohani and Ghosh, 2017). Compared with spectral sensors, LiDAR tends to provide accurate biomass predictions (Acorsi et al., 2019), because spectral data tend to saturate in medium and high canopies (Féret et al., 2017), which LiDAR can mitigate through depth information (Hassler and Baysal-Gurel, 2019).
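The ranging principle above reduces to halving the round-trip travel time of a laser pulse. A minimal sketch (the 200 ns return time is illustrative):

```python
# Minimal illustration of LiDAR ranging: distance from pulse time-of-flight.
# The round-trip travel time is halved because the pulse travels to the
# target and back.

C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_return_time(t_seconds: float) -> float:
    """Distance to target from the round-trip time of a laser pulse."""
    return C * t_seconds / 2.0

# A return received 200 ns after emission corresponds to roughly 30 m.
print(round(range_from_return_time(200e-9), 2))  # 29.98
```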

However, when it is difficult to estimate plant height, it is difficult to accurately monitor biomass with LiDAR. ten Harkel et al. (2020) used a VUX-SYS laser scanner to monitor the biomass of potato, sugar beet, and winter wheat. They achieved good results for sugar beet (R2 = 0.68, RMSE = 17.47 g/m2) and winter wheat (R2 = 0.82, RMSE = 13.94 g/m2), but the reliability for potato biomass was low. The reason is that potatoes have complex canopy structures and grow on ridges, whereas the other two crops have vertical structures and uniform heights; for potatoes, it is therefore difficult to visually determine the highest point at a specific position.

Finally, how to improve data reliability by adjusting LiDAR parameters still lacks sufficient research. For instance, the sampling intensity of LiDAR affects the accuracy of biomass monitoring (Wang D.Z. et al., 2020), but the current research in this area needs further verification.

Multisensor Fusion

The combination of data obtained from multiple sensors is an effective method to improve the accuracy of biomass estimation. On the one hand, LiDAR point clouds, whose density has improved with increased data resolution and penetrability (Wallace et al., 2012; Yan et al., 2020), can compensate for the tendency of spectral data collected by RGB, multispectral, and hyperspectral sensors to saturate in medium and high canopies (Gitelson, 2004; Féret et al., 2017). On the other hand, the texture and spectral features that RGB, multispectral, and hyperspectral sensors can collect are beyond the reach of LiDAR (Liu et al., 2019; Yue et al., 2019; Zheng et al., 2019). Using a variety of sensors with different characteristics to collect data, and combining data that reflect different characteristics of the target crops, provides the effective, non-cross-correlated features needed for regression analysis (Niu et al., 2019) and thus improves the accuracy of biomass estimation.

Wang et al. (2017) first evaluated the application of the fusion of hyperspectral and LiDAR data in maize biomass estimation. The results show that the fusion of hyperspectral and LiDAR data can provide better estimates of maize biomass than using LiDAR or hyperspectral data alone. Different from the previous methods of using LiDAR and optical remote sensing data to predict AGB separately or in combination, Zhu Y.H. et al. (2019) divided the estimation of maize AGB into two parts: aboveground leaf biomass (AGLB) and aboveground stem biomass (AGSB). AGLB was measured with multispectral data, which are sensitive to the vegetation canopy, and AGSB was measured with LiDAR point cloud data, which are sensitive to the vegetation structure. Compared with using LiDAR data alone or multispectral data alone, the combination of LiDAR and multispectral data estimated AGB more accurately: R2 increased by 0.13 and 0.30, RMSE decreased by 22.89 and 54.92 g/m2, and NRMSE decreased by 4.46 and 7.65%, respectively.

Other researchers have also carried out many studies in the field of multisensor data fusion and obtained the same results in studies on the monitoring of crop biomass, such as rice (Cen et al., 2019) and soybean (Maimaitijiang et al., 2020). In addition, different crops have different characteristics, and the same crop will show different characteristics under different growth conditions (Johansen et al., 2019; ten Harkel et al., 2020), which requires the use of different sensors to collect crop information comprehensively and screen out some information most related to biomass. The combination of data from multiple sensors is an effective method to improve the accuracy of biomass estimation.
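As a loose sketch of what such fusion looks like in practice (all feature names and values below are hypothetical, not from the cited studies), per-plot spectral and structural features can simply be stacked column-wise into one design matrix for a downstream regression model:

```python
import numpy as np

# Hypothetical per-plot features from two sensors: spectral VIs from a
# multispectral camera, and structural metrics from a LiDAR point cloud.
n_plots = 6
rng = np.random.default_rng(0)

ndvi   = rng.uniform(0.2, 0.9, n_plots)           # spectral feature
gndvi  = rng.uniform(0.2, 0.9, n_plots)           # spectral feature
h_mean = rng.uniform(0.3, 1.5, n_plots)           # LiDAR mean canopy height (m)
h_p90  = h_mean + rng.uniform(0.1, 0.5, n_plots)  # LiDAR 90th-percentile height (m)

# Fusion here is column-wise concatenation into one design matrix, which a
# regression algorithm can then relate to measured biomass.
X_spectral = np.column_stack([ndvi, gndvi])
X_lidar    = np.column_stack([h_mean, h_p90])
X_fused    = np.hstack([X_spectral, X_lidar])

print(X_fused.shape)  # (6, 4)
```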

Flight Parameters

To ensure the most accurate results for biomass monitoring, further tests should be carried out before experiments to determine the optimal flight parameters, such as altitude, speed, location of flight lines, and overlap (Domingo et al., 2019). Increasing the UAV flight height reduces image resolution (Lu et al., 2019), and the sensitivity of biomass monitoring accuracy to image spatial resolution is an important reference when configuring the flight height. Other flight parameters had little effect on overall biomass monitoring results compared with standard settings, but different flight parameters can lead to different point densities and distributions, which affect biomass monitoring more than altitude and velocity; a better crossover pattern and a closer flight path may improve biomass monitoring overall (ten Harkel et al., 2020). The images need to overlap sufficiently to improve the accuracy of biomass monitoring (Borra-Serrano et al., 2019), so the UAV flight plan should provide enough overlap. Domingo et al. (2019) found that reducing side overlap from 80 to 70% while maintaining a fixed forward overlap of 90% may be an option to reduce flight time and procurement costs. For specific species, such as rice, the physiological characteristics of the crop make the analysis of the solar elevation angle during flight planning very important to avoid the influence of sun glint and hotspot effects (Jiang et al., 2019).
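The trade-off between flight height, image resolution, and overlap can be made concrete with a back-of-the-envelope calculation, assuming a simple pinhole camera model (all parameter values below are illustrative, not from the cited studies):

```python
# Back-of-the-envelope flight planning under a pinhole camera model.

def ground_sample_distance(pixel_size_m, altitude_m, focal_length_m):
    """Ground footprint of one pixel (m/pixel): GSD = pixel * H / f."""
    return pixel_size_m * altitude_m / focal_length_m

def shot_spacing(image_footprint_m, forward_overlap):
    """Distance between exposures for a given forward overlap fraction."""
    return image_footprint_m * (1.0 - forward_overlap)

# Illustrative values: 3.4 um pixels, 50 m altitude, 8.8 mm focal length.
gsd = ground_sample_distance(3.4e-6, 50.0, 8.8e-3)
footprint = gsd * 4000  # ground footprint of a 4000-pixel image side
print(round(gsd * 100, 2), "cm/px;",
      round(shot_spacing(footprint, 0.90), 1), "m between shots")
```

Raising the altitude increases the GSD (coarser resolution) but also enlarges the footprint, so fewer flight lines and exposures are needed for the same overlap.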

Biomass Indices

It is a common method in biomass monitoring to use biomass indices to obtain data directly related to biomass. Common biomass indices include VIs and crop height (CH), which are extracted from images or three-dimensional point clouds. There are also relevant studies that do not use biomass indices but directly use images or three-dimensional point clouds to conduct correlation analysis with biomass (Nevavuori et al., 2019).

Vegetation Indices

A variety of VIs derived from remote sensing images can be used to monitor the state of vegetation on the ground and to quantitatively evaluate the richness, greenness and vitality of vegetation. After years of development, many VIs with different calculation methods have been proposed, among which the most commonly used is the normalized difference vegetation index (NDVI) proposed by Rouse et al. (1974). The NDVI is usually used to reflect information such as vegetation cover and growth, and it is calculated as follows:

NDVI = (NIR − R) / (NIR + R)

where NIR is the reflectance in the near-infrared band and R is the reflectance in the red band. The value range of NDVI is (−1, 1). It is generally believed that an NDVI value less than 0 represents no vegetation coverage, while a value greater than 0.1 represents vegetation coverage (Li Z. et al., 2015). Since the index is positively correlated with vegetation density, the higher the NDVI value, the higher the vegetation coverage.
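As a minimal illustration (with made-up reflectance values), the per-pixel NDVI calculation and the coverage threshold quoted above can be written as:

```python
import numpy as np

# NDVI computed per pixel from NIR and red reflectance bands, then
# thresholded with the coverage rule cited above (NDVI > 0.1 => vegetated).
nir = np.array([[0.60, 0.55],
                [0.10, 0.30]])
red = np.array([[0.10, 0.12],
                [0.09, 0.28]])

ndvi = (nir - red) / (nir + red)
vegetated = ndvi > 0.1

print(np.round(ndvi, 2))   # high values in the top row, low in the bottom
print(vegetated)           # [[ True  True] [False False]]
```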

Different VIs have unique characteristics, and more spectral features can be identified by using multiple VIs to obtain high monitoring accuracy. Marino and Alvino (2020) used the soil adjusted vegetation index (SAVI), NDVI and OSAVI to characterize 10 winter wheat varieties in a field at different growth stages and obtained optimal biomass monitoring results. Villoslada et al. (2020) combined 13 VIs to obtain the highest accuracy.

Vegetation indices can be built not only on the basis of spectral information but also on the basis of texture information. Texture is an important characteristic for identifying objects or image regions of interest. Among texture algorithms, the gray-level co-occurrence matrix (GLCM) introduced by Haralick et al. (1973), with statistics including variance (VAR), entropy (EN), data range (DR), homogeneity (HOM), second moment (SE), dissimilarity (DIS), contrast (CON), and correlation (COR), is often used to test the potential of texture analysis of UAS data for biomass estimation (Zheng et al., 2019).
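For concreteness, here is a tiny hand-rolled GLCM for a single pixel offset, together with two of the statistics listed above (contrast and homogeneity). This is a didactic sketch, not the full Haralick feature set:

```python
import numpy as np

def glcm(image, levels):
    """Normalized co-occurrence counts of gray-level pairs at offset (0, 1),
    i.e., each pixel paired with its right-hand neighbor."""
    m = np.zeros((levels, levels))
    for i, j in zip(image[:, :-1].ravel(), image[:, 1:].ravel()):
        m[i, j] += 1
    return m / m.sum()

img = np.array([[0, 0, 1],
                [1, 2, 2],
                [2, 2, 2]])
p = glcm(img, levels=3)

i, j = np.indices(p.shape)
contrast = np.sum(p * (i - j) ** 2)             # CON: local intensity variation
homogeneity = np.sum(p / (1.0 + (i - j) ** 2))  # HOM: closeness to the diagonal
print(round(contrast, 3), round(homogeneity, 3))  # 0.333 0.833
```

Libraries such as scikit-image provide full GLCM implementations with multiple offsets, angles, and properties; the loop above only shows the underlying counting.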

Vegetation indices based on image texture are usually combined with VIs based on spectral information to monitor crop biomass, and this combination can improve the accuracy of monitoring biomass significantly (Liu et al., 2019; Yue et al., 2019; Zheng et al., 2019). Zheng et al. (2019) predicted rice AGB using stepwise multiple linear regression (SMLR) in combination with VIs and image texture, and the results showed that the combination of texture information and spectral information significantly improved the accuracy of rice biomass estimations compared with the use of spectral information alone (R2 = 0.78, RMSE = 1.84 t/ha).

Previously, as the required data were obtained by satellites, the spectral data collected would be affected by clouds. When there was cloud cover in the observation area, the information received by the satellite-borne sensor would be all cloud information, instead of reflecting the local vegetation cover (Feng et al., 2009). The low altitude and flexibility of UAS solve this problem, making VIs more widely used. At present, VIs have become indispensable biomass indices for monitoring crop biomass. Common VIs are shown in Table 1.

TABLE 1

VIs | Formulation | Features | References
Ratio vegetation index | RVI = NIR / R | Monitors the photosynthetically active biomass of plant canopies. | Tucker, 1979
Green chlorophyll index | GCI = (NIR / G) − 1 | Estimation of spatially distributed chlorophyll content in crops. | Gitelson et al., 2005
Red-edge chlorophyll index | RECI = (NIR / RE) − 1 | Estimation of spatially distributed chlorophyll content in crops. | Gitelson et al., 2005
Normalized difference vegetation index | NDVI = (NIR − R) / (NIR + R) | Quantitative measurement of vegetation conditions over broad regions. | Rouse et al., 1974
Green normalized difference vegetation index | GNDVI = (NIR − G) / (NIR + G) | Nondestructive chlorophyll estimation in leaves. | Gitelson et al., 2003
Green-red vegetation index | GRVI = (G − R) / (G + R) | Monitors the photosynthetically active biomass of plant canopies. | Tucker, 1979
Normalized difference red-edge | NDRE = (NIR − RE) / (NIR + RE) | Increases the sensitivity of NDVI to chlorophyll content by approximately fivefold. | Gitelson and Merzlyak, 1997
Normalized difference red-edge index | NDREI = (RE − G) / (RE + G) | Estimation of senescence rate at maturation stages. | Hassan et al., 2018
Simplified canopy chlorophyll content index | SCCCI = NDRE / NDVI | Real-time detection of nutrient status. | Raper and Varco, 2015
Enhanced vegetation index | EVI = 2.5 × (NIR − R) / (1 + NIR − 2.4 × R) | Remains sensitive to canopy variations where the NDVI saturates asymptotically in high-biomass regions. | Huete et al., 2002
Two-band enhanced vegetation index | EVI2 = 2.5 × (NIR − R) / (NIR + 2.4 × R + 1) | A 2-band EVI without a blue band that best matches the 3-band EVI. | Jiang et al., 2008
Wide dynamic range vegetation index | WDRVI = (a × NIR − R) / (a × NIR + R), a = 0.12 | At least three times more sensitive than the NDVI to moderate-to-high LAI (between 2 and 6). | Gitelson, 2004
Soil adjusted vegetation index | SAVI = (1 + L) × (NIR − R) / (NIR + R + L) | Almost eliminates soil-induced changes in the vegetation index. | Huete, 1988
Optimized soil adjusted vegetation index | OSAVI = (NIR − R) / (NIR + R + 0.16) | Less sensitive to soil background and atmospheric effects. | Rondeaux et al., 1996
Modified chlorophyll absorption in reflectance index | MCARI = [(RE − R) − 0.2 × (RE − G)] × (RE / R) | Evaluates nutrient variability over large fields quickly. | Daughtry et al., 2000
MCARI/OSAVI | MCARI / OSAVI | Evaluates nutrient variability over large fields quickly. | Daughtry et al., 2000
Transformed chlorophyll absorption in reflectance index | TCARI = 3 × [(RE − R) − 0.2 × (RE − G) × (RE / R)] | Minimizes the influence of LAI (vegetation parameter) and underlying soil (background) effects. | Haboudane et al., 2002
TCARI/OSAVI | TCARI / OSAVI | Minimizes the influence of LAI (vegetation parameter) and underlying soil (background) effects. | Haboudane et al., 2002

The table introduces the formulation and features of common VIs. Generally, one or several VIs should be selected according to the situation. In the formulations, the green, red, red-edge, and near-infrared reflectances are abbreviated as G, R, RE, and NIR, respectively.
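As a small worked example, several of the tabulated indices can be computed directly from per-pixel band reflectances (the reflectance values here are illustrative; OSAVI is written in its standard Rondeaux et al. form):

```python
# Illustrative band reflectances for one vegetated pixel.
g, r, re, nir = 0.08, 0.05, 0.20, 0.45

rvi   = nir / r                        # ratio vegetation index
ndvi  = (nir - r) / (nir + r)          # normalized difference vegetation index
gndvi = (nir - g) / (nir + g)          # green NDVI
ndre  = (nir - re) / (nir + re)        # normalized difference red-edge
osavi = (nir - r) / (nir + r + 0.16)   # optimized soil adjusted VI

print([round(v, 3) for v in (rvi, ndvi, gndvi, ndre, osavi)])
# [9.0, 0.8, 0.698, 0.385, 0.606]
```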

Crop Height

Crop height is an important indicator to characterize the vertical structure, and CH is usually strongly correlated with biomass (Scotford and Miller, 2004; Prost and Jeuffroy, 2007; Salas Fernandez et al., 2009; Montes et al., 2011; Hakl et al., 2012; Alheit et al., 2014). The crop surface model (CSM) is an effective CH information extraction technique and has been widely used for different crops (Han et al., 2019).

Crop height data can be obtained using RGB sensors and multispectral sensors. Cen et al. (2019) established a CSM to determine the CH (Tilly et al., 2014) based on stitched RGB images. First, structure from motion (SfM) was used to generate a point cloud; the specific steps can be found in the study of Tomasi and Kanade (1992). The point cloud consists of points matched between overlapping images of features such as crop canopies and terrain surfaces. A digital elevation model (DEM) and digital terrain model (DTM) were obtained by classifying the point cloud: the DEM was based on the complete dense point cloud, representing the height of the crop canopy, while the DTM was developed from only the ground-surface points. The CSM was obtained by subtracting the DTM from the DEM after importing the two models into ArcGIS software (ArcGIS, Esri Inc., Redlands, CA, United States). Hassan et al. (2019) used a Sequoia 4.0 multispectral camera with the same method to measure the CH of wheat, and the results showed a very high correlation between the CH data from UAS and the actual height (R2 = 0.96).
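The CSM step described above reduces to a per-pixel raster subtraction. A minimal sketch with illustrative elevation values:

```python
import numpy as np

# Crop surface model (CSM) as DEM minus DTM, then a plot-level mean height.
dem = np.array([[101.2, 101.5],
                [101.1, 101.8]])   # canopy surface elevation (m)
dtm = np.array([[100.4, 100.5],
                [100.3, 100.6]])   # bare-terrain elevation (m)

csm = dem - dtm                    # per-pixel crop height (m)
mean_ch = csm.mean()               # plot-level crop height

print(np.round(csm, 2))
print(round(float(mean_ch), 2))    # 0.95
```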

Crop height data can also be obtained using LiDAR. Zhu W.X. et al. (2019) used CloudCompare open-source software to construct CH raster data from the LiDAR point cloud and studied the effects of CH on monitoring the AGB of crops. The results showed that CH is a robust indicator that can be used to estimate biomass, and the high spatial resolution of the CH data set was helpful to improve the effect of maize AGB estimation.
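A greatly simplified sketch of turning a LiDAR point cloud into a crop-height grid: within each grid cell, height is taken as the highest return minus the lowest return, with the lowest return standing in for the ground. This is a simplification of what CloudCompare and similar tools do, and the point coordinates below are invented:

```python
import numpy as np

# Rasterize a tiny point cloud (x, y, z) to a crop-height grid.
points = np.array([
    [0.2, 0.3, 100.1], [0.7, 0.4, 100.9],   # falls in cell (0, 0)
    [1.5, 0.2, 100.2], [1.8, 0.8, 101.4],   # falls in cell (0, 1)
])
cell_size = 1.0

cols = (points[:, 0] // cell_size).astype(int)
rows = (points[:, 1] // cell_size).astype(int)

height = np.zeros((rows.max() + 1, cols.max() + 1))
for r in range(height.shape[0]):
    for c in range(height.shape[1]):
        z = points[(rows == r) & (cols == c), 2]
        if z.size:
            # Highest return minus lowest return as the cell's crop height.
            height[r, c] = z.max() - z.min()

print(np.round(height, 2))  # per-cell crop heights (m)
```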

The monitoring of crop biomass by a single biomass index is sometimes unreliable. On the one hand, it is difficult to obtain reliable CH data from LiDAR in some cases. Johansen et al. (2019) found that dust storms can cause tomato plants to flatten and that once the tomato fruits become large and heavy, their weight may bend the branches downward, thereby reducing the height of the plants. ten Harkel et al. (2020) found that potatoes have complex canopy structures and grow on ridges, making it difficult to visually determine the highest point at a specific position. In such cases, VIs can achieve better results than CH. On the other hand, CH data better capture the three-dimensional structure of crops and can more accurately reflect crop biomass in scenes with target overlap, occlusion, large illumination changes, shadows, and complex backgrounds. In addition, the information collected by UAS includes not only target crops but also other interfering information; if this interference cannot be effectively eliminated, it will negatively affect the monitoring of crop biomass, which can be improved by combining multiple biomass indices (Niu et al., 2019).

Multi-index Fusion

The combination of multiple models for biomass estimation is an effective method to improve the accuracy of biomass estimation. The combination of spectral and textural features to construct VIs or the combination of VIs and CH has been shown to improve the results of biomass estimation.

Based on the idea of combining VIs with CH, Cen et al. (2019) used a biomass model that combined VIs and CH to monitor rice biomass under different nitrogen treatments. The results showed that the CH extracted by the CSM exhibited a high correlation with the actual CH. The monitoring model that incorporated RGB and multispectral image data with random forest regression (RFR) significantly improved the prediction results of AGB, in which the RMSEP decreased by 8.33–16.00%, R2 = 0.90, RMSEP = 0.21 kg/m2, and RRMSE = 14.05%.
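The VI + CH idea can be sketched with synthetic data (this is a toy illustration, not the cited study's pipeline): fit random forest regression on a VI alone and on the VI plus crop height, then compare held-out errors.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error

# Toy dataset: biomass driven by both a VI and crop height (CH).
rng = np.random.default_rng(42)
n = 200
ndvi = rng.uniform(0.2, 0.9, n)
ch = rng.uniform(0.2, 1.2, n)
agb = 0.8 * ndvi + 1.5 * ch + rng.normal(0, 0.05, n)  # synthetic biomass, kg/m^2

train, test = slice(0, 150), slice(150, n)

def rmse(features):
    model = RandomForestRegressor(n_estimators=100, random_state=0)
    model.fit(features[train], agb[train])
    return mean_squared_error(agb[test], model.predict(features[test])) ** 0.5

rmse_vi = rmse(ndvi.reshape(-1, 1))                # VI only
rmse_vi_ch = rmse(np.column_stack([ndvi, ch]))     # VI + CH
print(rmse_vi_ch < rmse_vi)  # adding CH reduces the held-out error here
```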

Relevant studies have proven that a biomass model combined with VIs and CH can also improve the biomass estimation accuracy for corn (Niu et al., 2019), wheat (Lu et al., 2019), ryegrass (Borra-Serrano et al., 2019), and other crops. These cases prove that the combination of VIs and CH is an effective way to build a biomass model. However, Niu et al. (2019) pointed out that the fusion of CH data derived from RGB images in the VIs model, which was based on MLR, did not significantly improve the estimation of the VI model, which may be caused by the clear correlation between VIs and CH in this crop (Schirrmann et al., 2016). Therefore, it is necessary to combine biomass indices reasonably for different crops.

According to the idea of combining spectral information with image texture to build VIs, Liu et al. (2019) used a linear regression model to convert the digital number (DN) of the original image into surface reflectance. The reflectivity obtained from the gain and offset values of each band was used to calculate the VIs and image texture. The results showed that the introduction of image texture into the partial least squares regression (PLSR) and RFR models could estimate winter rape AGB more accurately than a model based on VIs alone. The accuracy of the prediction of AGB by the RFR model using VIs and texture measurements (RMSE = 274.18 kg/ha) was slightly higher than that of the PLSR model (RMSE = 284.09 kg/ha). The same idea has also obtained good results in applications to winter wheat (Yue et al., 2019), rice (Zheng et al., 2019), soybean (Maimaitijiang et al., 2020), and other crops. Biomass models combined with VIs and image texture have great potential in the estimation of crop biomass.
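The DN-to-reflectance conversion mentioned above is essentially an empirical-line fit: a per-band gain and offset estimated from targets of known reflectance. A sketch with illustrative calibration-panel values (not from the cited study):

```python
import numpy as np

# Fit a per-band linear model (gain, offset) from calibration targets with
# known reflectance, then convert raw digital numbers (DN) to reflectance.
dn_targets  = np.array([520.0, 1480.0, 2900.0])   # DN over calibration panels
ref_targets = np.array([0.05, 0.22, 0.48])        # known panel reflectance

gain, offset = np.polyfit(dn_targets, ref_targets, 1)

dn_image = np.array([800.0, 2000.0])              # raw DN from the imagery
reflectance = gain * dn_image + offset
print(np.round(reflectance, 3))
```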

Data Processing and Analysis Methods

Data analysis is the key link between the data acquired by UAS and the actual crop biomass, and it is an essential component of any UAS workflow. The data obtained from UAS often contain various types of noise, and the variables are highly correlated. Effective data analysis methods are therefore needed to interpret the data and establish a robust prediction model (Cen et al., 2019), and scientific, systematic data analysis often plays a decisive role.

Data Preprocessing Methods

Since the data collected by UAS cannot be directly used to monitor biomass, a series of preprocessing steps is needed for the data. When spectral sensors are used, an indispensable step is geometric correction and mosaicking of the image. Common software includes Pix4DMapper and Agisoft Photoscan.

Pix4DMapper software (Pix4D S.A., Lausanne, Switzerland) performs geometric correction and mosaicking of UAS imagery based on feature matching and SfM photogrammetry. Images are first processed in an arbitrary model space to create a three-dimensional point cloud. The point cloud can be transformed into a real-world coordinate system either through direct geolocation of the estimated camera positions or through GCPs automatically identified within the point cloud. The point cloud is then used to generate the DTM required for image correction, and the georeferenced images are finally linked together to form a mosaic of the study area (Turner et al., 2012).

Agisoft PhotoScan software (Agisoft LLC, St. Petersburg, Russia) is another common UAS data preprocessing package, and its processing procedure is similar to that of Pix4DMapper. Finally, the UAS image is exported in TIFF format for subsequent analysis (Acorsi et al., 2019; Lu et al., 2019). Figure 4 shows an RGB imagery dataset processed with Agisoft PhotoScan (Sun et al., 2019).

FIGURE 4

  • (a)

    High-resolution proof images of the acquisition area

  • (b)

    Overall map of research area processed by Agisoft PhotoScan.

Data Analysis Methods

Machine learning algorithms are widely used to process biomass information. Depending on whether the input dataset is labeled, they can be divided into supervised and unsupervised learning algorithms (Dike et al., 2018). Supervised learning algorithms depend on a labeled dataset and produce either classifications or regressions. In crop biomass monitoring, regression algorithms are used more often than classification algorithms because the expected biomass values are continuous rather than discrete. Unsupervised learning does not rely on a labeled dataset; it is often used when the cost of labeling is unacceptable (Sapkal et al., 2007) and is also a common means of reducing the dimensionality of the data. Most unsupervised learning algorithms take the form of cluster analysis. Figure 5 shows the types of machine learning algorithms.

FIGURE 5

Biomass monitoring is a typical regression problem, which can be solved by supervised learning algorithms. Sufficiently large labeled datasets are the basis of supervised learning, but actual biomass data are typically obtained through destructive sampling (Jiang et al., 2019; Yue et al., 2019), and field trials are limited by the area of cropland and the crop growing season. Sufficiently large datasets are therefore often unavailable, and properly dividing the available data into training and validation sets is a challenge when training supervised learning algorithms. To address this problem, Jiang et al. (2019) used fivefold cross validation, Han et al. (2019) used repeated 10-fold cross validation, and Zhu W.X. et al. (2019) used leave-one-out cross validation (LOOCV) to reduce generalization error. Fivefold and repeated 10-fold cross validation are forms of k-fold cross validation; LOOCV is the special case in which the number of folds equals the number of instances (Wong, 2015). k-fold cross validation divides the dataset into k folds, treating each fold in turn as the validation set and the remaining k−1 folds as the training set (Wong and Yang, 2017). When k-fold cross validation is applied to classification algorithms, the number of folds can be large but the number of replications should be small (Wong and Yeh, 2020).
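The k-fold scheme described above can be sketched in a few lines; setting k equal to the number of samples reproduces LOOCV (names and sizes here are illustrative):

```python
import numpy as np

def kfold_indices(n_samples, k, seed=0):
    """Yield (train, validation) index arrays for k-fold cross validation."""
    idx = np.random.default_rng(seed).permutation(n_samples)
    folds = np.array_split(idx, k)
    for i in range(k):
        train = np.concatenate([f for j, f in enumerate(folds) if j != i])
        yield train, folds[i]  # each fold serves exactly once as the validation set

splits = list(kfold_indices(30, 5))
print(len(splits))        # 5 folds
print(len(splits[0][1]))  # 6 validation samples per fold
# LOOCV is the special case k = n_samples:
print(len(list(kfold_indices(30, 30))))  # 30 single-sample folds
```

Every sample is validated exactly once, which makes the most of the small, destructively sampled biomass datasets typical of field trials.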

Support Vector Regression

Support vector regression (SVR) is a boundary detection algorithm for identifying/defining multidimensional boundaries (Sharma et al., 2020). The basis of the method is to map the training data into a new feature hyperspace with an appropriate kernel function, transforming the multidimensional regression problem into a linear one (Navarro et al., 2019). Analyzing VIs and image texture with SVR, Duan et al. (2019) found that SVR itself can find a suitable combination of different reflectance bands, which shows that SVR adapts well to complex data and is suitable for data analysis in biomass monitoring. Yang et al. (2019) compared PLSR and SVR and found SVR more accurate; moreover, SVR optimized by particle swarm optimization (PSO) obtained more appropriate parameters and further improved model accuracy.
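As a sketch of the approach (synthetic data, not the exact setup of any cited study), an RBF-kernel SVR fitted with scikit-learn on plot-level features such as VIs:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Synthetic stand-in for plot features (e.g., three VIs) and measured AGB
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, (80, 3))
y = 2.0 * X[:, 0] + X[:, 1] ** 2 + rng.normal(0.0, 0.05, 80)

# Scaling matters for kernel methods; the RBF kernel maps the features into a
# high-dimensional space where the regression becomes (approximately) linear
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X[:60], y[:60])
print(model.score(X[60:], y[60:]))  # R2 on held-out plots
```

The kernel choice and the regularization parameter C are exactly the kind of parameters that PSO was used to tune in Yang et al. (2019).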

Random Forest Regression

Random forest regression is a data analysis and statistical method that is widely used in machine learning and remote sensing research (Viljanen et al., 2018). Compared with artificial neural networks (ANNs), RFR does not suffer from overfitting, a consequence of the law of large numbers, and the injection of suitable randomness makes it a precise regressor (Breiman, 2001). The random forest algorithm makes full use of all input data and tolerates outliers and noise (Jiang et al., 2019). It offers high prediction accuracy, requires no feature selection, and is insensitive to overfitting (Tewes and Schellberg, 2018; Viljanen et al., 2018).
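A minimal scikit-learn sketch on synthetic data (hypothetical features, not a cited study's setup) illustrates both the regression and the built-in feature ranking that removes the need for explicit feature selection:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Synthetic stand-in for fused UAS features (e.g., VIs, CH, texture) and AGB
rng = np.random.default_rng(2)
X = rng.uniform(0.0, 1.0, (100, 4))
y = 3.0 * X[:, 0] + 2.0 * X[:, 3] + rng.normal(0.0, 0.1, 100)

# An ensemble of decorrelated trees; averaging their outputs keeps variance low
rf = RandomForestRegressor(n_estimators=300, random_state=0)
rf.fit(X[:70], y[:70])
print(rf.score(X[70:], y[70:]))          # held-out R2
print(rf.feature_importances_)           # ranks the fused features by influence
```

The importance scores indicate which indices drive the prediction, which is useful when deciding which biomass indices to retain for a given crop.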

Artificial Neural Network

An ANN is an information processing paradigm inspired by the way biological nervous systems, such as the brain, process information (Awodele and Jegede, 2013). It is a nonparametric nonlinear model that transmits information between layers of a neural network, simulating the human brain's reception and processing of information (Zha et al., 2020). In individual cases, ANN results are no better than those of the multiple linear regression (MLR) method; the reason may be that a small sample set cannot meet the needs of an ANN (Han et al., 2019; Zhu W.X. et al., 2019; Zha et al., 2020). Compared with RFR, an ANN needs large datasets and many repetitions to generate an appropriate nonlinear mapping and obtain the optimal network (Devia et al., 2019; Han et al., 2019), whereas RFR can still be applied to small sample sets (Han et al., 2019; Liu et al., 2019), which makes RFR the more frequent choice for biomass monitoring. Therefore, before the development of the deep neural network (DNN), the remote sensing field, including UAS studies, shifted the focus of data analysis from ANN to SVR and RFR (Ma et al., 2019).

The emergence of DNNs and a series of methods to mitigate overfitting improved the performance of ANNs (Maimaitijiang et al., 2020). Nevavuori et al. (2019) used a convolutional neural network (CNN) to predict the biomass of wheat and barley, testing how the choice of training algorithm, network depth, regularization strategy, hyperparameter tuning, and other aspects of the CNN affected prediction performance. This study demonstrated that if enough information can be collected to enlarge the sample and overcome overfitting, an ANN performs no worse than RFR (Zhang et al., 2019).

Multiple Regression Techniques

Multiple linear regression (Borra-Serrano et al., 2019; Devia et al., 2019; Han et al., 2019; Zhu W.X. et al., 2019), SMLR (Lu et al., 2019; Zheng et al., 2019) and PLSR (Borra-Serrano et al., 2019; Liu et al., 2019; Yue et al., 2019) are also commonly used multiple regression algorithms. However, with the gradual progress of SVR, RFR, and ANN, these algorithms have gradually become references for SVR, RFR, and ANN and are no longer the main focus of data analysis.

Devia et al. (2019) described an MLR equation for monitoring rice biomass with VIs. In general, there was a linear relationship between biomass accumulation and VIs; however, the relationship between biomass and VIs in other crops at maturity can be nonlinear, so MLR is not applicable to such nonlinear relationships. Zheng et al. (2019) used SMLR to relate rice biomass to remote sensing variables (VIs, image texture, and their combination); although the estimation accuracy was high, the model was complex and difficult to generalize. Moeckel et al. (2018) tested the ability of PLSR, RFR, and SVR to predict the CH of eggplant, tomato, and cabbage, and because PLSR did not exceed the performance of RFR and SVR, it was excluded first.
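The linearity assumption is visible in the model form itself: MLR reduces to ordinary least squares on a design matrix. A sketch with hypothetical VI features:

```python
import numpy as np

# Hypothetical plot data: two VI features and measured biomass
rng = np.random.default_rng(3)
X = rng.uniform(0.0, 1.0, (25, 2))
y = 0.2 + 1.5 * X[:, 0] + 0.8 * X[:, 1] + rng.normal(0.0, 0.02, 25)

# biomass ≈ b0 + b1*VI1 + b2*VI2, solved by ordinary least squares
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef)  # close to the true [0.2, 1.5, 0.8]
```

When the true biomass–VI relationship is nonlinear, no choice of these coefficients fits well, which is why MLR underperforms SVR, RFR, and ANN at crop maturity.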

The monitoring of biomass is a typical nonlinear problem (Zha et al., 2020), whereas these regression techniques are better suited to data showing linear or exponential relationships between remote sensing variables and crop parameters (Atzberger et al., 2010; Jibo et al., 2018; Lu et al., 2019). They are therefore often inferior to SVR, RFR, and ANN for biomass monitoring.

Constructing a high-performance monitoring model based on advanced algorithms (such as machine learning) is a good way to improve crop biomass monitoring (Niu et al., 2019). The monitoring of biomass is a typical multi-feature nonlinear problem (Zha et al., 2020), and machine learning algorithms such as SVR, RFR, and ANN excel at this type of problem (Breiman, 2001; Navarro et al., 2019; Maimaitijiang et al., 2020). Comparative studies have found that ANN is often superior to RFR when dealing with large, complex, nonlinear, and redundant datasets (LeCun et al., 2015; Schmidhuber, 2015; Kang and Kang, 2017; Zhang et al., 2018; Maimaitijiang et al., 2020). With small samples, however, the lack of data often leads to overfitting, and RFR achieves better results than ANN owing to its stronger robustness and generalization ability (Zhang and Li, 2014; Yao et al., 2015; Yuan et al., 2017; Yue et al., 2017; Zheng et al., 2018; Zhu W.X. et al., 2019; Zha et al., 2020).

The Promotion of Large-Scale UAS Applications

Accuracy is an important indicator for evaluating UAS in crop biomass estimation; reducing cost to enable large-scale application is another difficult problem. To improve accuracy, multisensor data fusion, multi-index fusion, the consideration of features not directly related to biomass monitoring, and the use of advanced algorithms are feasible directions (Maimaitijiang et al., 2020). To promote large-scale application, using low-cost sensors together with suitable models and algorithms to improve their estimation accuracy, rather than resorting to more expensive sensors, is an effective research path (Acorsi et al., 2019; Lu et al., 2019; Niu et al., 2019; Yue et al., 2019). RGB sensors are currently the most widely used low-cost sensors (Lussem et al., 2019), and studies based on them are expected to promote the large-scale application of UAS in crop biomass monitoring.

RGB sensors cannot provide NIR band data. Therefore, VIs associated with NIR bands cannot be used, which weakens the contrast of vegetation vitality (Lu et al., 2019) and may affect the accuracy of biomass estimation. Lu et al. (2019) used a combination of advanced algorithms and multi-index fusion to compensate for this deficiency, and Yue et al. (2019) fused image texture and VIs to obtain their most accurate estimate of AGB (R2 = 0.89, MAE = 0.67 t/ha, RMSE = 0.82 t/ha). These studies prove that low-cost sensors can maintain the accuracy of biomass estimation and are expected to promote large-scale applications.

Solar elevation angle (Jiang et al., 2019), meteorological conditions (Devia et al., 2019; Wang F. et al., 2019), rainfall (Liu et al., 2019; Rose and Kage, 2019), soil characteristics (Acorsi et al., 2019; Vogel et al., 2019), the spatial distribution of multiple plants in a block (Han et al., 2019), and other characteristics not directly related to biomass monitoring also affect the accuracy of biomass estimations. The monitoring accuracy of low-cost sensors can be improved by considering the characteristics that are not directly related to biomass monitoring.

Jiang et al. (2019) calculated the solar elevation angle to avoid sun glint and hotspot effects. In addition, growing degree days (GDD) was incorporated into the model to estimate rice AGB as a meteorological feature. Models incorporating meteorological features achieved better estimation accuracy (R2 = 0.86, RMSE = 178.37 g/m2, MAE = 127.34 g/m2) than models that did not use these features (R2 = 0.64, RMSE = 286.79 g/m2, MAE = 236.49 g/m2).

Other studies have also demonstrated the importance of considering features that are not directly related to monitoring biomass. Borra-Serrano et al. (2019) also took GDD as a meteorological feature and obtained their best estimate of ryegrass dry-matter biomass by combining CH, VIs, and meteorological variables in an MLR model (R2 = 0.81). Devia et al. (2019) also noted the influence of the solar elevation angle and indicated that weather conditions (sunny or cloudy) can affect data quality, especially in lowland crops, where moisture reflection changes the appearance of the image. These studies showed that the accuracy of biomass estimation can be improved by considering meteorological characteristics and the solar elevation angle. More sample points must be obtained from multiple research sites and under different environmental conditions in future studies to train more robust multivariate models (Liu et al., 2019). A summary of the relevant studies is given in Table 2.

TABLE 2

| Crop | Platforms | Sensors | Biomass indices | Data analysis methods | Results | References |
| --- | --- | --- | --- | --- | --- | --- |
| Wheat | DJI Phantom series | Digital camera | VIs, CH | RFR | R2 = 0.78, RMSE = 1.34 t/ha, RRMSE = 28.98% | Lu et al., 2019 |
| Rice | DJI S1000; DJI Phantom 4 Pro | Mini-MCA 12 multispectral camera; DJI FC6310 digital camera | VIs, CH, meteorological feature | SER | R2 = 0.86, RMSE = 178.37 g/m2, MAE = 127.34 g/m2 | Jiang et al., 2019 |
| Potato; sugar beet; winter wheat | RIEGL RiCOPTER | VUX-SYS laser scanner | CH | MLR | Potato: R2 = 0.24, RMSE = 22.09 g/m2; sugar beet: R2 = 0.68, RMSE = 17.47 g/m2; winter wheat: R2 = 0.82, RMSE = 13.94 g/m2 | ten Harkel et al., 2020 |
| Maize | DJI Phantom 2 | Ricoh GR digital camera | CH | Statistical analysis | Estimates were most accurate with a fisheye lens at 30 m altitude | Calou et al., 2019 |
| Winter wheat | DJI S1000 | DSC-QX100 digital camera | VIs | SMLR | R2 = 0.89, MAE = 0.67 t/ha, RMSE = 0.82 t/ha | Yue et al., 2019 |
| Rice | Lightweight octorotor UAV | RGB camera; multispectral camera | VIs, CH | RFR | R2 = 0.90, RMSEP = 0.21 kg/m2, RRMSE = 14.05% | Cen et al., 2019 |
| Winter wheat | DJI S1000 | DSC-QX100 digital camera; UHD 185 Firefly snapshot hyperspectral sensor | VIs | Exponential regression | R2 = 0.67, MAE = 1.19, RMSE = 1.71 | Yue et al., 2018 |
| Winter wheat | DJI S1000 | UHD 185-Firefly | VIs | PLSR | AGB monitoring can be improved by combining red-edge parameters with VIs | Tao et al., 2020 |
| Corn; wheat | DJI M600 Pro | Mini-MCA 6 multispectral camera | VIs | Linear regression | A systematic radiometric calibration method was proposed | Guo et al., 2019 |
| Rice | Mikrokopter OktoXL | Tetracam mini-MCA6 multispectral camera | VIs | SMLR | R2 = 0.78, RMSE = 1.84 t/ha | Zheng et al., 2019 |
| Winter oilseed rape | DJI S1000 | Mini-MCA multispectral camera | VIs | PLSR; RFR | RFR: RMSE = 274.18 kg/ha; PLSR: RMSE = 284.09 kg/ha | Liu et al., 2019 |
| Maize | DJI Phantom 4 Pro; DJI M600 Pro | Parrot Sequoia multispectral camera; DJI FC6310 digital camera; RIEGL VUX-1UAV laser scanner | VIs, CH | MLR; PLSR | MLR: R2 = 0.82, RMSE = 79.80 g/m2, NRMSE = 11.12%; PLSR: R2 = 0.86, RMSE = 72.28 g/m2, NRMSE = 10.07% | Zhu Y.H. et al., 2019 |
| Soybean | DJI S1000 | Mapir Survey2 RGB camera; Parrot Sequoia multispectral camera; FLIR Vue Pro R 640 thermal imager | VIs, CH | DNN-F2 | R2 = 0.720, RMSE = 478.9 kg/ha, RRMSE = 15.9% | Maimaitijiang et al., 2020 |
| Tomato | DJI Matrice 100 | RGB Zenmuse X3 sensor | VIs | RFR | R2 = 0.85, RMSE = 0.052 m | Johansen et al., 2019 |
| Ryegrass | Onyxstar HYDRA-12 | RGB camera | VIs, CH, meteorological feature | MLR; RFR | MLR: R2 = 0.81, RMSE = 679 kg/ha, NRMSE = 21.3%; RFR: R2 = 0.70, RMSE = 769 kg/ha, NRMSE = 24.2% | Borra-Serrano et al., 2019 |
| Wheat; barley | Airinov Solo 3DR UAV | Parrot NIR-capable Sequoia sensor | None | CNN | MAE = 484.3 kg/ha, MAPE = 8.8% | Nevavuori et al., 2019 |
| Ten winter wheat cultivars | eBee fixed-wing UAV | Canon PowerShot S110 RGB camera; Canon PowerShot S110 NIR camera | VIs | Cluster analysis | Combining multiple VIs can be a valid strategy | Marino and Alvino, 2020 |
| Coastal meadows | eBee fixed-wing UAV | Parrot Sequoia multispectral camera | VIs | RFR | Combining multiple VIs can be a valid strategy | Villoslada et al., 2020 |
| Maize | DJI S1000 | DSC-QX100 digital camera; Parrot Sequoia multispectral camera | BIOVP (VIs, CH) | RFR | R2 = 0.944, RMSE = 0.495, MAE = 0.355 | Han et al., 2019 |
| Bread wheat | DJI Inspire 1 model T600 | Sequoia 4.0 multispectral camera | CH | Linear regression | R2 = 0.96 | Hassan et al., 2019 |
| Maize | EWZ-D6 six-rotor UAV; DJI M100 four-rotor UAV; eBee fixed-wing UAV | MultiSPEC-4C multispectral camera; MicaSense RedEdge-M multispectral camera; Alpha Series AL3-32 LiDAR sensor | CH | RFR | R2 = 0.90, RMSE = 2.29, MRE = 0.22 | Zhu W.X. et al., 2019 |
| Rice | UAV equipped with a Mini-MCA system | Array of 12 individual miniature digital cameras | VIs | SVR | SVR itself can find a suitable combination of different reflectance bands | Duan et al., 2019 |
| Winter wheat | Four-axis aerial vehicle UAV 3P | Sony EXMOR HD camera | VIs | SVR | R2 = 0.9025, RMSE = 0.3287 | Yang et al., 2019 |
| Rice | UAV | Tetracam ADC-lite multispectral camera | VIs | MLR | R2 = 0.76 | Devia et al., 2019 |
| Eggplant; tomato; cabbage | DJI 3 Pro | DJI FC300X RGB camera | CH | SVR; RFR | R2 from 0.87 to 0.97; bias from −0.66 to 0.45 cm | Moeckel et al., 2018 |
| Sorghum | Custom-designed UAV platforms | Sony Alpha ILCE-7R; Velodyne VLP-16; two Headwall Photonics push-broom scanners | Four hyperspectral-based and four LiDAR-based features | PLSR; SVR; RFR | The data source was more important than the regression method | Masjedi et al., 2020 |
| Rice | UAV platform | Tetracam ADC-lite multispectral camera | VIs | Multivariable regression | An average correlation of 0.76 | Devia et al., 2019 |

Summary of the equipment, methods, and important results of the studies cited in the body.

The table covers numerous case studies of different crops from different regions.

Conclusion and Future Perspectives

As high-precision, highly flexible, and nondestructive remote sensing systems, UAS have been widely used to monitor crop biomass. This article reviewed the application of UAS to crop biomass monitoring in recent years, introducing four kinds of data acquisition equipment (LiDAR, RGB sensors, multispectral sensors, and hyperspectral sensors), two biomass indices (VIs and CH), and three data analysis methods (SVR, RFR, and ANN).

Despite the rapid progress in this area, difficulties remain. First, the speed of data acquisition and processing must improve. Although multisensor data fusion improves evaluation accuracy, it makes data collection more complex and data organization more difficult, and it slows monitoring; likewise, advanced algorithms improve accuracy but require long training times. Second, there is no universal method that applies to all crops in all cases. Different crops, and even the same crop in different environments, have different characteristics; these differences require us to carefully distinguish crop traits, use appropriate sensors to capture them, and test multiple indices to determine the best biomass indices. Third, the high cost of equipment hinders the large-scale use of UAS in crop biomass monitoring. Although research on low-cost sensors has appeared, methods for improving their estimation accuracy still need further study. Adopting multi-index fusion, considering features not directly related to biomass, and using advanced algorithms are predicted to effectively improve the monitoring performance of low-cost sensors, which is the future development direction.

Because of their high precision, flexibility, and nondestructive nature, UAS have the potential to become an important tool for monitoring crop biomass. Crop biomass monitoring systems based on multisensor and multi-index fusion, the consideration of features not directly related to biomass monitoring, and the adoption of advanced algorithms are effective ways to improve the accuracy of UAS-based crop biomass estimation. Because of their low cost, RGB sensors have become an effective means of promoting the large-scale application of UAS in this field. UAS remain highly attractive for biomass monitoring, and the number of studies on UAS-based crop biomass monitoring continues to grow. Furthermore, it is expected that the challenges of UAS promotion will be overcome in the future, which is conducive to the realization of smart agriculture and precision agriculture.

Statements

Author contributions

All authors listed have made a substantial, direct and intellectual contribution to the work, and approved it for publication.

Funding

The work in this article was supported by the Key Research and Development Program of Nanning (20192065), the National Natural Science Foundation for Young Scientists of China (31801804), the projects subsidized by special funds for Science Technology Innovation and Industrial Development of Shenzhen Dapeng New District (Grant No. PT202001-06), and the National Innovation and Entrepreneurship Training Plan for College Students (201910593010).

Acknowledgments

The authors thank the native English-speaking experts from the editing team of American Journal Experts for polishing this article.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

  • 1

    Acorsi, M. G., Miranda, F. D. A., Martello, M., Smaniotto, D. A., and Sartor, L. R. (2019). Estimating biomass of black oat using UAV-based RGB imaging. Agronomy Basel 9:14. doi: 10.3390/agronomy9070344

  • 2

    Alheit, K. V., Busemeyer, L., Liu, W., Maurer, H. P., Gowda, M., Hahn, V., et al. (2014). Multiple-line cross QTL mapping for biomass yield and plant height in triticale (× Triticosecale Wittmack). Theor. Appl. Genet. 127, 251–260. doi: 10.1007/s00122-013-2214-6

  • 3

    Atzberger, C., Guerif, M., Frederic, B., and Werner, W. (2010). Comparative analysis of three chemometric techniques for the spectroradiometric assessment of canopy chlorophyll content in winter wheat. Comput. Electron. Agric. 73, 165–173. doi: 10.1016/j.compag.2010.05.006

  • 4

    Awodele, O., and Jegede, O. (2013). "Neural networks and its application in engineering," in Proceedings of the InSITE: Informing Science + IT Education Conference (Macon, USA).

  • 5

    Bendig, J., Yu, K., Aasen, H., Bolten, A., Bennertz, S., Broscheit, J., et al. (2015). Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Observ. Geoinform. 39, 79–87. doi: 10.1016/j.jag.2015.02.012

  • 6

    Borra-Serrano, I., De Swaef, T., Muylle, H., Nuyttens, D., Vangeyte, J., Mertens, K., et al. (2019). Canopy height measurements and non-destructive biomass estimation of Lolium perenne swards using UAV imagery. Grass Forage Sci. 74, 356–369. doi: 10.1111/gfs.12439

  • 7

    Boschetti, M., Bocchi, S., and Brivio, P. A. (2007). Assessment of pasture production in the Italian Alps using spectrometric and remote sensing information. Agric. Ecosyst. Environ. 118, 267–272. doi: 10.1016/j.agee.2006.05.024

  • 8

    Brede, B., Lau, A., Bartholomeus, H. M., and Kooistra, L. (2017). Comparing RIEGL RiCOPTER UAV LiDAR derived canopy height and DBH with terrestrial LiDAR. Sensors 17:16. doi: 10.3390/s17102371

  • 9

    Breiman, L. (2001). Random forests. Mach. Learn. 45, 5–32. doi: 10.1023/A:1010933404324

  • 10

    Calders, K., Newnham, G., Burt, A., Murphy, S., Raumonen, P., Herold, M., et al. (2015). Nondestructive estimates of above-ground biomass using terrestrial laser scanning. Methods Ecol. Evol. 6, 198–208. doi: 10.1111/2041-210x.12301

  • 11

    Calou, V. B. C., Teixeira, A. D., Moreira, L. C. J., Neto, O. C. D., and da Silva, J. A. (2019). Estimation of maize biomass using unmanned aerial vehicles. Engenharia Agricola 39, 744–752. doi: 10.1590/1809-4430-Eng.Agric.v39n6p744-752/2019

  • 12

    Cen, H. Y., Wan, L., Zhu, J. P., Li, Y. J., Li, X. R., Zhu, Y. M., et al. (2019). Dynamic monitoring of biomass of rice under different nitrogen treatments using a lightweight UAV with dual image-frame snapshot cameras. Plant Methods 15:16. doi: 10.1186/s13007-019-0418-8

  • 13

    Chao, Z., Liu, N., Zhang, P., Ying, T., and Song, K. (2019). Estimation methods developing with remote sensing information for energy crop biomass: a comparative review. Biomass Bioenergy 122, 414–425. doi: 10.1016/j.biombioe.2019.02.002

  • 14

    Costa, L., Nunes, L., and Ampatzidis, Y. (2020). A new visible band index (vNDVI) for estimating NDVI values on RGB images utilizing genetic algorithms. Comput. Electron. Agric. 172:13. doi: 10.1016/j.compag.2020.105334

  • 15

    Daughtry, C. S. T., Walthall, C. L., Kim, M. S., de Colstoun, E. B., and McMurtrey, J. E. (2000). Estimating corn leaf chlorophyll concentration from leaf and canopy reflectance. Remote Sens. Environ. 74, 229–239. doi: 10.1016/S0034-4257(00)00113-9

  • 16

    Devia, C. A., Rojas, J. P., Petro, E., Martinez, C., Mondragon, I. F., Patino, D., et al. (2019). High-throughput biomass estimation in rice crops using UAV multispectral imagery. J. Intell. Robot. Syst. 96, 573–589. doi: 10.1007/s10846-019-01001-5

  • 17

    Dike, H. U., Zhou, Y., Deveerasetty, K. K., and Wu, Q. (2018). "Unsupervised learning based on artificial neural network: a review," in Proceedings of the 2018 IEEE International Conference on Cyborg and Bionic Systems (CBS) (Shenzhen: IEEE), 322–327.

  • 18

    Domingo, D., Orka, H. O., Naesset, E., Kachamba, D., and Gobakken, T. (2019). Effects of UAV image resolution, camera type, and image overlap on accuracy of biomass predictions in a tropical woodland. Remote Sens. 11:17. doi: 10.3390/rs11080948

  • 19

    Duan, B., Liu, Y. T., Gong, Y., Peng, Y., Wu, X. T., Zhu, R. S., et al. (2019). Remote estimation of rice LAI based on Fourier spectrum texture from UAV image. Plant Methods 15:12. doi: 10.1186/s13007-019-0507-8

  • 20

    Feng, L., Yue, D., and Guo, X. (2009). A review on application of normal different vegetation index. For. Inventory Plan. 34, 48–52. doi: 10.3969/j.issn.1671-3168.2009.02.013

  • 21

    Féret, J. B., Gitelson, A. A., Noble, S. D., and Jacquemoud, S. (2017). PROSPECT-D: towards modeling leaf optical properties through a complete lifecycle. Remote Sens. Environ. 193, 204–215. doi: 10.1016/j.rse.2017.03.004

  • 22

    Gitelson, A., and Merzlyak, M. N. (1997). Remote estimation of chlorophyll content in higher plant leaves. Int. J. Remote Sens. 18, 2691–2697. doi: 10.1080/014311697217558

  • 23

    Gitelson, A., Viña, A., Ciganda, V., Rundquist, D., and Arkebauer, T. (2005). Remote estimation of canopy chlorophyll in crops. Geophys. Res. Lett. 32:L08403. doi: 10.1029/2005GL022688

  • 24

    Gitelson, A. A. (2004). Wide dynamic range vegetation index for remote quantification of biophysical characteristics of vegetation. J. Plant Physiol. 161, 165–173. doi: 10.1078/0176-1617-01176

  • 25

    Gitelson, A. A., Gritz, Y., and Merzlyak, M. N. (2003). Relationships between leaf chlorophyll content and spectral reflectance and algorithms for non-destructive chlorophyll assessment in higher plant leaves. J. Plant Physiol. 160, 271–282. doi: 10.1078/0176-1617-00887

  • 26

    Gnyp, M. L., Bareth, G., Li, F., Lenz-Wiedemann, V. I. S., Koppe, W., Miao, Y., et al. (2014). Development and implementation of a multiscale biomass model using hyperspectral vegetation indices for winter wheat in the North China Plain. Int. J. Appl. Earth Observ. Geoinform. 33, 232–242. doi: 10.1016/j.jag.2014.05.006

  • 27

    Guo, Y., Senthilnath, J., Wu, W., Zhang, X., Zeng, Z., and Huang, H. (2019). Radiometric calibration for multispectral camera of different imaging conditions mounted on a UAV platform. Sustainability 11, 1–24. doi: 10.3390/su11040978

  • 28

    Haboudane, D., Miller, J. R., Tremblay, N., Zarco-Tejada, P. J., and Dextraze, L. (2002). Integrated narrow-band vegetation indices for prediction of crop chlorophyll content for application to precision agriculture. Remote Sens. Environ. 81, 416–426. doi: 10.1016/S0034-4257(02)00018-4

  • 29

    Hakl, J., Hrevušová, Z., Hejcman, M., and Fuksa, P. (2012). The use of a rising plate meter to evaluate lucerne (Medicago sativa L.) height as an important agronomic trait enabling yield estimation. Grass Forage Sci. 67, 589–596. doi: 10.1111/j.1365-2494.2012.00886.x

  • 30

    Han, L., Yang, G. J., Dai, H. Y., Xu, B., Yang, H., Feng, H. K., et al. (2019). Modeling maize above-ground biomass based on machine learning approaches using UAV remote-sensing data. Plant Methods 15:19. doi: 10.1186/s13007-019-0394-z

  • 31

    Haralick, R. M., Shanmugam, K., and Dinstein, I. (1973). Textural features for image classification. IEEE Trans. Syst. Man Cybern. SMC-3, 610–621. doi: 10.1109/TSMC.1973.4309314

  • 32

    Hassan, M. A., Yang, M., Fu, L., Rasheed, A., Zheng, B., Xia, X., et al. (2019). Accuracy assessment of plant height using an unmanned aerial vehicle for quantitative genomic analysis in bread wheat. Plant Methods 15:37. doi: 10.1186/s13007-019-0419-7

  • 33

    Hassan, M. A., Yang, M., Rasheed, A., Jin, X., Xia, X., Xiao, Y., et al. (2018). Time-series multispectral indices from unmanned aerial vehicle imagery reveal senescence rate in bread wheat. Remote Sens. 10:809. doi: 10.3390/rs10060809

  • 34

    Hassler, S. C., and Baysal-Gurel, F. (2019). Unmanned aircraft system (UAS) technology and applications in agriculture. Agronomy Basel 9:21. doi: 10.3390/agronomy9100618

  • 35

    Hogan, S., Kelly, M., Stark, B., and Chen, Y. (2017). Unmanned aerial systems for agriculture and natural resources. Calif. Agric. 71, 5–14. doi: 10.3733/ca.2017a0002

  • 36

    Huete, A., Didan, K., Miura, T., Rodriguez, E. P., Gao, X., and Ferreira, L. G. (2002). Overview of the radiometric and biophysical performance of the MODIS vegetation indices. Remote Sens. Environ. 83, 195–213. doi: 10.1016/S0034-4257(02)00096-2

  • 37

    Huete, A. R. (1988). A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 25, 295–309. doi: 10.1016/0034-4257(88)90106-X

  • 38

    Jeziorska, J. (2019). UAS for wetland mapping and hydrological modeling. Remote Sens. 11:39. doi: 10.3390/rs11171997

  • 39

    Jiang, Q., Fang, S. H., Peng, Y., Gong, Y., Zhu, R. S., Wu, X. T., et al. (2019). UAV-based biomass estimation for rice—combining spectral, TIN-based structural and meteorological features. Remote Sens. 11:19. doi: 10.3390/rs11070890

  • 40

    Jiang, Z., Huete, A. R., Didan, K., and Miura, T. (2008). Development of a two-band enhanced vegetation index without a blue band. Remote Sens. Environ. 112, 3833–3845. doi: 10.1016/j.rse.2008.06.006

  • 41

    JiboY.FengH.GuijunY.LiZ. (2018). A comparison of regression techniques for estimation of above-ground winter wheat biomass using near-surface spectroscopy.Remote Sens.10:66. 10.3390/rs10010066

  • 42

    JohansenK.MortonM. J. L.MalbeteauY.AragonB.Al-MashharawiS.ZilianiM.et al (2019). “Predicting biomass and yield at harvest of salt-stressed tomato plants using uav imagery,” inProceedings of the 4th ISPRS Geospatial Week 2019, June 10, 2019 – June 14, 2019, Vol. 42 (Enschede: International Society for Photogrammetry and Remote Sensing), 407411.

  • 43

    Kang, H. W., and Kang, H. B. (2017). Prediction of crime occurrence from multi-modal data using deep learning. PLoS One 12:19. doi: 10.1371/journal.pone.0176244

  • 44

    Kelly, J., Kljun, N., Olsson, P.-O., Mihai, L., Liljeblad, B., Weslien, P., et al. (2019). Challenges and best practices for deriving temperature data from an uncalibrated UAV thermal infrared camera. Remote Sens. 11:7098. doi: 10.3390/rs11050567

  • 45

    Kim, J., Kim, S., Ju, C., and Son, H. I. (2019). Unmanned aerial vehicles in agriculture: a review of perspective of platform, control, and applications. IEEE Access 7, 105100–105115. doi: 10.1109/access.2019.2932119

  • 46

    LeCun, Y., Bengio, Y., and Hinton, G. (2015). Deep learning. Nature 521, 436–444. doi: 10.1038/nature14539

  • 47

    Li, S. Y., Yuan, F., Ata-Ul-Karim, S. T., Zheng, H. B., Cheng, T., Liu, X. J., et al. (2019). Combining color indices and textures of UAV-based digital imagery for rice LAI estimation. Remote Sens. 11:21. doi: 10.3390/rs11151763

  • 48

    Li, W., Niu, Z., Huang, N., Wang, C., Gao, S., and Wu, C. Y. (2015). Airborne LiDAR technique for estimating biomass components of maize: a case study in Zhangye city, Northwest China. Ecol. Indic. 57, 486–496. doi: 10.1016/j.ecolind.2015.04.016

  • 49

    Li, Z., Hu, D., Zhao, D., and Xiang, D. (2015). Research advance of broadband vegetation index using remotely sensed images. J. Yangtze River Sci. Res. Inst. 32, 125–130. doi: 10.3969/j.issn.1001-5485.2015.01.026

  • 50

    Liu, Y. N., Liu, S. S., Li, J., Guo, X. Y., Wang, S. Q., and Lu, J. W. (2019). Estimating biomass of winter oilseed rape using vegetation indices and texture metrics derived from UAV multispectral images. Comput. Electron. Agric. 166:10. doi: 10.1016/j.compag.2019.105026

  • 51

    Lohani, B., and Ghosh, S. (2017). Airborne LiDAR technology: a review of data collection and processing systems. Proc. Natl. Acad. Sci. India Sec. A Phys. Sci. 87, 567–579. doi: 10.1007/s40010-017-0435-9

  • 52

    Lu, N., Zhou, J., Han, Z. X., Li, D., Cao, Q., Yao, X., et al. (2019). Improved estimation of aboveground biomass in wheat from RGB imagery and point cloud data acquired with a low-cost unmanned aerial vehicle system. Plant Methods 15:16. doi: 10.1186/s13007-019-0402-3

  • 53

    Lussem, U., Bolten, A., Menne, J., Gnyp, M. L., Schellberg, J., and Bareth, G. (2019). Estimating biomass in temperate grassland with high resolution canopy surface models from UAV-based RGB images and vegetation indices. J. Appl. Remote Sens. 13:034525. doi: 10.1117/1.Jrs.13.034525

  • 54

    Ma, L., Liu, Y., Zhang, X. L., Ye, Y. X., Yin, G. F., and Johnson, B. A. (2019). Deep learning in remote sensing applications: a meta-analysis and review. ISPRS J. Photogramm. Remote Sens. 152, 166–177. doi: 10.1016/j.isprsjprs.2019.04.015

  • 55

    Maes, W. H., and Steppe, K. (2019). Perspectives for remote sensing with unmanned aerial vehicles in precision agriculture. Trends Plant Sci. 24, 152–164. doi: 10.1016/j.tplants.2018.11.007

  • 56

    Maimaitijiang, M., Sagan, V., Sidike, P., Hartling, S., Esposito, F., and Fritschi, F. B. (2020). Soybean yield prediction from UAV using multimodal data fusion and deep learning. Remote Sens. Environ. 237:20. doi: 10.1016/j.rse.2019.111599

  • 57

    Marino, S., and Alvino, A. (2020). Agronomic traits analysis of ten winter wheat cultivars clustered by UAV-derived vegetation indices. Remote Sens. 12:16. doi: 10.3390/rs12020249

  • 58

    Masjedi, A., Crawford, M. M., Carpenter, N. R., and Tuinstra, M. R. (2020). Multi-temporal predictive modelling of sorghum biomass using UAV-based hyperspectral and LiDAR data. Remote Sens. 12:3587. doi: 10.3390/rs12213587

  • 59

    Moeckel, T., Dayananda, S., Nidamanuri, R. R., Nautiyal, S., Hanumaiah, N., Buerkert, A., et al. (2018). Estimation of vegetable crop parameter by multi-temporal UAV-borne images. Remote Sens. 10:18. doi: 10.3390/rs10050805

  • 60

    Montes, J. M., Technow, F., Dhillon, B. S., Mauch, F., and Melchinger, A. E. (2011). High-throughput non-destructive biomass determination during early plant development in maize under field conditions. Field Crops Res. 121, 268–273. doi: 10.1016/j.fcr.2010.12.017

  • 61

    Navarro, J. A., Algeet, N., Fernandez-Landa, A., Esteban, J., Rodriguez-Noriega, P., and Guillen-Climent, M. L. (2019). Integration of UAV, Sentinel-1, and Sentinel-2 data for mangrove plantation aboveground biomass monitoring in Senegal. Remote Sens. 11:23. doi: 10.3390/rs11010077

  • 62

    Nevavuori, P., Narra, N., and Lipping, T. (2019). Crop yield prediction with deep convolutional neural networks. Comput. Electron. Agric. 163:9. doi: 10.1016/j.compag.2019.104859

  • 63

    Niu, Y. X., Zhang, L. Y., Zhang, H. H., Han, W. T., and Peng, X. S. (2019). Estimating above-ground biomass of maize using features derived from UAV-based RGB imagery. Remote Sens. 11:21. doi: 10.3390/rs11111261

  • 64

    Prey, L., and Schmidhalter, U. (2019). Simulation of satellite reflectance data using high-frequency ground based hyperspectral canopy measurements for in-season estimation of grain yield and grain nitrogen status in winter wheat. ISPRS J. Photogramm. Remote Sens. 149, 176–187. doi: 10.1016/j.isprsjprs.2019.01.023

  • 65

    Prost, L., and Jeuffroy, M.-H. (2007). Replacing the nitrogen nutrition index by the chlorophyll meter to assess wheat N status. Agron. Sustain. Dev. 27, 321–330. doi: 10.1051/agro:2007032

  • 66

    Qiu, P. H., Wang, D. Z., Zou, X. Q., Yang, X., Xie, G. Z., Xu, S. J., et al. (2019). Finer resolution estimation and mapping of mangrove biomass using UAV LiDAR and WorldView-2 data. Forests 10:21. doi: 10.3390/f10100871

  • 67

    Ramon Saura, J., Reyes-Menendez, A., and Palos-Sanchez, P. (2019). Mapping multispectral digital images using a cloud computing software: applications from UAV images. Heliyon 5:e01277. doi: 10.1016/j.heliyon.2019.e01277

  • 68

    Raper, T. B., and Varco, J. J. (2015). Canopy-scale wavelength and vegetative index sensitivities to cotton growth parameters and nitrogen status. Precis. Agric. 16, 62–76. doi: 10.1007/s11119-014-9383-4

  • 69

    Rondeaux, G., Steven, M., and Baret, F. (1996). Optimization of soil-adjusted vegetation indices. Remote Sens. Environ. 55, 95–107. doi: 10.1016/0034-4257(95)00186-7

  • 70

    Rose, T., and Kage, H. (2019). The contribution of functional traits to the breeding progress of Central European winter wheat under differing crop management intensities. Front. Plant Sci. 10:1521. doi: 10.3389/fpls.2019.01521

  • 71

    Rouse, J., Haas, R., Schell, J., and Deering, D. (1974). Monitoring vegetation systems in the Great Plains with ERTS. NASA Special Publication 351:309.

  • 72

    Salas Fernandez, M. G., Becraft, P. W., Yin, Y., and Lubberstedt, T. (2009). From dwarves to giants? Plant height manipulation for biomass yield. Trends Plant Sci. 14, 454–461. doi: 10.1016/j.tplants.2009.06.005

  • 73

    Sapkal, S. D., Kakarwal, S. N., and Revankar, P. S. (2007). "Analysis of classification by supervised and unsupervised learning," in Proceedings of the International Conference on Computational Intelligence and Multimedia Applications (ICCIMA 2007), Vol. 1 (Sivakasi: IEEE), 280–284.

  • 74

    Schirrmann, M., Giebel, A., Gleiniger, F., Pflanz, M., Lentschke, J., and Dammer, K.-H. (2016). Monitoring agronomic parameters of winter wheat crops with low-cost UAV imagery. Remote Sens. 8:706. doi: 10.3390/rs8090706

  • 75

    Schmidhuber, J. (2015). Deep learning in neural networks: an overview. Neural Networks 61, 85–117. doi: 10.1016/j.neunet.2014.09.003

  • 76

    Scotford, I. M., and Miller, P. C. H. (2004). Combination of spectral reflectance and ultrasonic sensing to monitor the growth of winter wheat. Biosyst. Eng. 87, 27–38. doi: 10.1016/j.biosystemseng.2003.09.009

  • 77

    Selbeck, J., Dworak, V., and Ehlert, D. (2010). Testing a vehicle-based scanning lidar sensor for crop detection. Can. J. Remote Sens. 36, 24–35. doi: 10.5589/m10-022

  • 78

    Sharma, R., Kamble, S. S., Gunasekaran, A., Kumar, V., and Kumar, A. (2020). A systematic literature review on machine learning applications for sustainable agriculture supply chain performance. Comput. Oper. Res. 119:17. doi: 10.1016/j.cor.2020.104926

  • 79

    Shentu, Y. C., Wu, C. F., Wu, C. P., Guo, Y. L., Zhang, F., Yang, P., et al. (2018). "Improvement of underwater color discriminative ability by multispectral imaging," in Proceedings of the OCEANS 2018 MTS (Charleston, SC: IEEE).

  • 80

    Shuqin, T., Yueju, X., Yun, L., Ning, H., and Xiao, Z. (2016). Review on RGB-D image classification. Laser Optoelectron. Progr. 53, 29–42. doi: 10.3788/lop53.060003

  • 81

    Sofonia, J., Shendryk, Y., Phinn, S., Roelfsema, C., Kendoul, F., and Skocaj, D. (2019). Monitoring sugarcane growth response to varying nitrogen application rates: a comparison of UAV SLAM LiDAR and photogrammetry. Int. J. Appl. Earth Observ. Geoinform. 82:15. doi: 10.1016/j.jag.2019.05.011

  • 82

    Song, B., and Park, K. (2020). Detection of aquatic plants using multispectral UAV imagery and vegetation index. Remote Sens. 12:16. doi: 10.3390/rs12030387

  • 83

    Song, P., Zheng, X., Li, Y., Zhang, K., Huang, J., Li, H., et al. (2020). Estimating reed loss caused by Locusta migratoria manilensis using UAV-based hyperspectral data. Sci. Total Environ. 719:137519. doi: 10.1016/j.scitotenv.2020.137519

  • 84

    Sun, Z., Jing, W., Qiao, X., and Yang, L. (2019). Identification and monitoring of blooming Mikania micrantha outbreak points based on UAV remote sensing. Trop. Geogr. 39, 482–491.

  • 85

    Tait, L., Bind, J., Charan-Dixon, H., Hawes, I., Pirker, J., and Schiel, D. (2019). Unmanned aerial vehicles (UAVs) for monitoring macroalgal biodiversity: comparison of RGB and multispectral imaging sensors for biodiversity assessments. Remote Sens. 11:18. doi: 10.3390/rs11192332

  • 86

    Tao, H., Feng, H., Xu, L., Miao, M., Long, H., Yue, J., et al. (2020). Estimation of crop growth parameters using UAV-based hyperspectral remote sensing data. Sensors 20:1296. doi: 10.3390/s20051296

  • 87

    ten Harkel, J., Bartholomeus, H., and Kooistra, L. (2020). Biomass and crop height estimation of different crops using UAV-based LiDAR. Remote Sens. 12:18. doi: 10.3390/rs12010017

  • 88

    Tewes, A., and Schellberg, J. (2018). Towards remote estimation of radiation use efficiency in maize using UAV-based low-cost camera imagery. Agronomy Basel 8:15. doi: 10.3390/agronomy8020016

  • 89

    Tian, H., Wang, T., Liu, Y., Qiao, X., and Li, Y. (2020). Computer vision technology in agricultural automation: a review. Inform. Process. Agric. 7, 1–19. doi: 10.1016/j.inpa.2019.09.006

  • 90

    Tian, J. Y., Wang, L., Li, X. J., Yin, D. M., Gong, H. L., Nie, S., et al. (2019). Canopy height layering biomass estimation model (CHL-BEM) with full-waveform LiDAR. Remote Sens. 11:13. doi: 10.3390/rs11121446

  • 91

    Tilly, N., Hoffmeister, D., Cao, Q., Huang, S., Lenz-Wiedemann, V., Miao, Y., et al. (2014). Multitemporal crop surface models: accurate plant height measurement and biomass estimation with terrestrial laser scanning in paddy rice. J. Appl. Remote Sens. 8:083671. doi: 10.1117/1.Jrs.8.083671

  • 92

    Tomasi, C., and Kanade, T. (1992). Shape and motion from image streams under orthography: a factorization method. Int. J. Comput. Vis. 9, 137–154. doi: 10.1007/BF00129684

  • 93

    Toth, C., and Jozkow, G. (2016). Remote sensing platforms and sensors: a survey. ISPRS J. Photogramm. Remote Sens. 115, 22–36. doi: 10.1016/j.isprsjprs.2015.10.004

  • 94

    Tucker, C. J. (1979). Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 8, 127–150. doi: 10.1016/0034-4257(79)90013-0

  • 95

    Turner, D., Lucieer, A., and Watson, C. (2012). An automated technique for generating georectified mosaics from ultra-high resolution unmanned aerial vehicle (UAV) imagery, based on structure from motion (SfM) point clouds. Remote Sens. 4, 1392–1410. doi: 10.3390/rs4051392

  • 96

    Viljanen, N., Honkavaara, E., Nasi, R., Hakala, T., Niemelainen, O., and Kaivosoja, J. (2018). A novel machine learning method for estimating biomass of grass swards using a photogrammetric canopy height model, images and vegetation indices captured by a drone. Agric. Basel 8:28. doi: 10.3390/agriculture8050070

  • 97

    Villoslada, M., Bergamo, T. F., Ward, R. D., Burnside, N. G., Joyce, C. B., Bunce, R. G. H., et al. (2020). Fine scale plant community assessment in coastal meadows using UAV based multispectral data. Ecol. Indic. 111:13. doi: 10.1016/j.ecolind.2019.105979

  • 98

    Vogel, S., Gebbers, R., Oertel, M., and Kramer, E. (2019). Evaluating soil-borne causes of biomass variability in grassland by remote and proximal sensing. Sensors 19:4593. doi: 10.3390/s19204593

  • 99

    Wallace, L., Lucieer, A., Watson, C., and Turner, D. (2012). Development of a UAV-LiDAR system with application to forest inventory. Remote Sens. 4, 1519–1543. doi: 10.3390/rs4061519

  • 100

    Wang, C., Nie, S., Xi, X., Luo, S., and Sun, X. (2017). Estimating the biomass of maize with hyperspectral and LiDAR data. Remote Sens. 9:11. doi: 10.3390/rs9010011

  • 101

    Wang, D. Z., Wan, B., Liu, J., Su, Y. J., Guo, Q. H., Qiu, P. H., et al. (2020). Estimating aboveground biomass of the mangrove forests on northeast Hainan Island in China using an upscaling method from field plots, UAV-LiDAR data and Sentinel-2 imagery. Int. J. Appl. Earth Observ. Geoinform. 85:16. doi: 10.1016/j.jag.2019.101986

  • 102

    Wang, D. Z., Wan, B., Qiu, P. H., Zuo, Z. J., Wang, R., and Wu, X. C. (2019). Mapping height and aboveground biomass of mangrove forests on Hainan Island using UAV-LiDAR sampling. Remote Sens. 11:25. doi: 10.3390/rs11182156

  • 103

    Wang, F., Wang, F., Zhang, Y., Hu, J., Huang, J., and Xie, J. (2019). Rice yield estimation using parcel-level relative spectra variables from UAV-based hyperspectral imagery. Front. Plant Sci. 10:453. doi: 10.3389/fpls.2019.00453

  • 104

    Wijesingha, J., Moeckel, T., Hensgen, F., and Wachendorf, M. (2019). Evaluation of 3D point cloud-based models for the prediction of grassland biomass. Int. J. Appl. Earth Observ. Geoinform. 78, 352–359. doi: 10.1016/j.jag.2018.10.006

  • 105

    Wong, T., and Yang, N. (2017). Dependency analysis of accuracy estimates in k-fold cross validation. IEEE Trans. Knowl. Data Eng. 29, 2417–2427. doi: 10.1109/TKDE.2017.2740926

  • 106

    Wong, T., and Yeh, P. (2020). Reliable accuracy estimates from k-fold cross validation. IEEE Trans. Knowl. Data Eng. 32, 1586–1594. doi: 10.1109/TKDE.2019.2912815

  • 107

    Wong, T.-T. (2015). Performance evaluation of classification algorithms by k-fold and leave-one-out cross validation. Pattern Recogn. 48, 2839–2846. doi: 10.1016/j.patcog.2015.03.009

  • 108

    Xu, R., Li, C., and Paterson, A. H. (2019). Multispectral imaging and unmanned aerial systems for cotton plant phenotyping. PLoS One 14:e0205083. doi: 10.1371/journal.pone.0205083

  • 109

    Yan, W. Q., Guan, H. Y., Cao, L., Yu, Y. T., Li, C., and Lu, J. Y. (2020). A self-adaptive mean shift tree-segmentation method using UAV LiDAR data. Remote Sens. 12:14. doi: 10.3390/rs12030515

  • 110

    Yang, B. H., Wang, M. X., Sha, Z. X., Wang, B., Chen, J. L., Yao, X., et al. (2019). Evaluation of aboveground nitrogen content of winter wheat using digital imagery of unmanned aerial vehicles. Sensors 19:18. doi: 10.3390/s19204416

  • 111

    Yang, S., Feng, Q., Liang, T., Liu, B., Zhang, W., and Xie, H. (2017). Modeling grassland above-ground biomass based on artificial neural network and remote sensing in the Three-River Headwaters Region. Remote Sens. Environ. 204, 448–455. doi: 10.1016/j.rse.2017.10.011

  • 112

    Yao, X., Huang, Y., Shang, G., Zhou, C., Cheng, T., Tian, Y., et al. (2015). Evaluation of six algorithms to monitor wheat leaf nitrogen concentration. Remote Sens. 7, 14939–14966. doi: 10.3390/rs71114939

  • 113

    Yuan, H., Yang, G., Li, C., Wang, Y., Liu, J., Yu, H., et al. (2017). Retrieving soybean leaf area index from unmanned aerial vehicle hyperspectral remote sensing: analysis of RF, ANN, and SVM regression models. Remote Sens. 9:309. doi: 10.3390/rs9040309

  • 114

    Yuan, M., Burjel, J. C., Isermann, J., Goeser, N. J., and Pittelkow, C. M. (2019). Unmanned aerial vehicle-based assessment of cover crop biomass and nitrogen uptake variability. J. Soil Water Conserv. 74, 350–359. doi: 10.2489/jswc.74.4.350

  • 115

    Yue, J., Feng, H., Jin, X., Yuan, H., Li, Z., Zhou, C., et al. (2018). A comparison of crop parameters estimation using images from UAV-mounted snapshot hyperspectral sensor and high-definition digital camera. Remote Sens. 10:1138. doi: 10.3390/rs10071138

  • 116

    Yue, J., Yang, G., Li, C., Li, Z., Wang, Y., Feng, H., et al. (2017). Estimation of winter wheat above-ground biomass using unmanned aerial vehicle-based snapshot hyperspectral sensor and crop height improved models. Remote Sens. 9:708. doi: 10.3390/rs9070708

  • 117

    Yue, J. B., Yang, G. J., Tian, Q. J., Feng, H. K., Xu, K. J., and Zhou, C. Q. (2019). Estimate of winter-wheat above-ground biomass based on UAV ultrahigh-ground-resolution image textures and vegetation indices. ISPRS J. Photogramm. Remote Sens. 150, 226–244. doi: 10.1016/j.isprsjprs.2019.02.022

  • 118

    Zha, H. N., Miao, Y. X., Wang, T. T., Li, Y., Zhang, J., Sun, W. C., et al. (2020). Improving unmanned aerial vehicle remote sensing-based rice nitrogen nutrition index prediction with machine learning. Remote Sens. 12:22. doi: 10.3390/rs12020215

  • 119

    Zhang, H., and Li, D. (2014). Applications of computer vision techniques to cotton foreign matter inspection: a review. Comput. Electron. Agric. 109, 59–70. doi: 10.1016/j.compag.2014.09.004

  • 120

    Zhang, L., Shao, Z., Liu, J., and Cheng, Q. (2019). Deep learning based retrieval of forest aboveground biomass from combined LiDAR and Landsat 8 data. Remote Sens. 11:1459. doi: 10.3390/rs11121459

  • 121

    Zhang, N., Rao, R. S. P., Salvato, F., Havelund, J. F., Moller, I. M., Thelen, J. J., et al. (2018). MU-LOC: machine-learning method for predicting mitochondrially localized proteins in plants. Front. Plant Sci. 9:634. doi: 10.3389/fpls.2018.00634

  • 122

    Zheng, H., Li, W., Jiang, J., Liu, Y., Cheng, T., Tian, Y., et al. (2018). A comparative assessment of different modeling algorithms for estimating leaf nitrogen content in winter wheat using multispectral images from an unmanned aerial vehicle. Remote Sens. 10:2026. doi: 10.3390/rs10122026

  • 123

    Zheng, H. B., Cheng, T., Zhou, M., Li, D., Yao, X., Tian, Y. C., et al. (2019). Improved estimation of rice aboveground biomass combining textural and spectral analysis of UAV imagery. Precis. Agric. 20, 611–629. doi: 10.1007/s11119-018-9600-7

  • 124

    Zhong, Y. F., Wang, X. Y., Xu, Y., Wang, S. Y., Jia, T. Y., Hu, X., et al. (2018). Mini-UAV-borne hyperspectral remote sensing: from observation and processing to applications. IEEE Geosci. Remote Sens. Mag. 6, 46–62. doi: 10.1109/mgrs.2018.2867592

  • 125

    Zhu, W. X., Sun, Z. G., Peng, J. B., Huang, Y. H., Li, J., Zhang, J. Q., et al. (2019). Estimating maize above-ground biomass using 3D point clouds of multi-source unmanned aerial vehicle data at multi-spatial scales. Remote Sens. 11:22. doi: 10.3390/rs11222678

  • 126

    Zhu, Y. H., Zhao, C. J., Yang, H., Yang, G. J., Han, L., Li, Z. H., et al. (2019). Estimation of maize above-ground biomass based on stem-leaf separation strategy integrated with LiDAR and optical remote sensing data. PeerJ 7:30. doi: 10.7717/peerj.7593

Keywords

unmanned aerial systems, unmanned aerial vehicle, remote sensing, crop biomass, smart agriculture, precision agriculture

Citation

Wang T, Liu Y, Wang M, Fan Q, Tian H, Qiao X and Li Y (2021) Applications of UAS in Crop Biomass Monitoring: A Review. Front. Plant Sci. 12:616689. doi: 10.3389/fpls.2021.616689

Received

13 October 2020

Accepted

18 March 2021

Published

09 April 2021

Edited by

Penghao Wang, Murdoch University, Australia

Reviewed by

Lea Hallik, University of Tartu, Estonia; Jian Ma, Sichuan Agricultural University, China

*Correspondence: Xi Qiao, Yanzhou Li,

This article was submitted to Technical Advances in Plant Science, a section of the journal Frontiers in Plant Science

Disclaimer

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.
