AUTHOR=Marzougui Afef, McGee Rebecca J., Van Vleet Stephen, Sankaran Sindhuja
TITLE=Remote sensing for field pea yield estimation: A study of multi-scale data fusion approaches in phenomics
JOURNAL=Frontiers in Plant Science
VOLUME=14
YEAR=2023
URL=https://www.frontiersin.org/journals/plant-science/articles/10.3389/fpls.2023.1111575
DOI=10.3389/fpls.2023.1111575
ISSN=1664-462X
ABSTRACT=Remote sensing using unmanned aerial systems (UAS) is widely applied in phenomics and precision agriculture. The high-resolution data from such systems can provide useful spectral characteristics of crops associated with performance traits such as seed yield. Recently, the availability of high-resolution satellite imagery has drawn interest in plot-scale remote sensing applications, especially those associated with breeding programs. This study compared features extracted from high-resolution satellite and UAS multispectral imagery (visible and near-infrared) to predict seed yield in two diverse plot-scale field pea yield trials (advanced breeding and variety testing) using a random forest model. Multi-modal (spectral and textural features) and multi-scale (satellite and UAS) data fusion approaches were evaluated in terms of improvement in prediction accuracy across trials and time points (individual and combined). The multi-scale data fusion approaches involved image fusion (pan-sharpening of satellite imagery with UAS imagery using intensity-hue-saturation transformation and additive wavelet luminance proportional approaches) and feature fusion (combining the extracted spectral features). The image fusion results were also compared to high-definition satellite data at 0.15 m/pixel. The major findings can be summarized as follows: (1) the inclusion of texture features did not improve model performance; (2) models using spectral features from satellite imagery at its original resolution can perform similarly to those using UAS imagery, depending on the field pea yield trial and, more importantly, the growth stage; (3) model performance improved after multi-scale, multiple time point feature fusion; (4) features extracted from imagery pan-sharpened using the intensity-hue-saturation transformation (image fusion) yielded better model performance than those from the original satellite imagery or the high-definition imagery; and (5) the green normalized difference vegetation index and the transformed triangular vegetation index were consistently key features contributing to high model performance. The results from this study highlight the potential of high-resolution satellite imagery and data fusion approaches for phenotyping applications in plot-scale research.
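
As context for the image-fusion step mentioned in the abstract, the following is a minimal sketch of a fast intensity-hue-saturation (IHS) style pan-sharpening, in which spatial detail from a higher-resolution intensity layer (here standing in for the UAS-derived band) is injected into resampled multispectral bands. The array shapes, the histogram matching, and the additive fast-IHS formulation are illustrative assumptions, not necessarily the variant used in the study.

```python
# Illustrative fast-IHS pan-sharpening sketch (assumed formulation, not the
# study's exact implementation). The multispectral bands are assumed to be
# already resampled to the resolution of the high-resolution intensity band.
import numpy as np

def fast_ihs_pansharpen(ms_bands: np.ndarray, hi_res_intensity: np.ndarray) -> np.ndarray:
    """Fuse low-resolution multispectral bands with a high-resolution intensity band.

    ms_bands: array of shape (bands, rows, cols), resampled to the high resolution.
    hi_res_intensity: array of shape (rows, cols), e.g. derived from UAS imagery.
    Returns fused bands with the same shape as ms_bands.
    """
    # Intensity component of the multispectral image (mean of the bands).
    intensity = ms_bands.mean(axis=0)
    # Match the high-resolution band to the multispectral intensity statistics
    # so the spectral character of the original bands is roughly preserved.
    matched = (hi_res_intensity - hi_res_intensity.mean()) / (hi_res_intensity.std() + 1e-9)
    matched = matched * intensity.std() + intensity.mean()
    # Additive fast-IHS update: inject the spatial detail into every band.
    return ms_bands + (matched - intensity)

# Tiny synthetic example: 4 bands of a 64 x 64 mosaic and a sharper intensity layer.
rng = np.random.default_rng(1)
ms = rng.uniform(0.05, 0.5, size=(4, 64, 64))
pan = ms.mean(axis=0) + rng.normal(0, 0.01, size=(64, 64))  # stand-in UAS intensity
fused = fast_ihs_pansharpen(ms, pan)
print(fused.shape)  # (4, 64, 64)
```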
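
Similarly, a minimal sketch of plot-level feature-to-yield modeling in the spirit of the abstract: the green normalized difference vegetation index (GNDVI = (NIR - Green) / (NIR + Green)) is computed per plot and passed to a random forest regressor. The column names, synthetic data, and hyperparameters are assumptions for illustration only; the study's actual feature set (including the transformed triangular vegetation index and texture features) and tuning differ.

```python
# Illustrative sketch of plot-level yield prediction with a random forest
# (assumed feature set and synthetic data; not the authors' pipeline).
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

def gndvi(nir, green):
    """Green normalized difference vegetation index: (NIR - Green) / (NIR + Green)."""
    return (nir - green) / (nir + green + 1e-9)

# Hypothetical plot-level reflectance means (one row per breeding plot).
rng = np.random.default_rng(0)
n_plots = 120
plots = pd.DataFrame({
    "green": rng.uniform(0.05, 0.15, n_plots),
    "red": rng.uniform(0.03, 0.12, n_plots),
    "nir": rng.uniform(0.30, 0.55, n_plots),
})
plots["gndvi"] = gndvi(plots["nir"], plots["green"])
plots["ndvi"] = (plots["nir"] - plots["red"]) / (plots["nir"] + plots["red"])

# Synthetic seed yield loosely driven by GNDVI, standing in for field measurements.
plots["yield_kg_ha"] = 2000 + 3000 * plots["gndvi"] + rng.normal(0, 150, n_plots)

# Random forest regression with cross-validation, analogous in spirit to the
# plot-scale yield prediction described in the abstract.
X = plots[["gndvi", "ndvi", "green", "red", "nir"]]
y = plots["yield_kg_ha"]
model = RandomForestRegressor(n_estimators=300, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"Mean cross-validated R^2: {scores.mean():.2f}")
```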