ORIGINAL RESEARCH article

Front. Remote Sens.

Sec. Multi- and Hyper-Spectral Imaging

Volume 6 - 2025 | doi: 10.3389/frsen.2025.1666123

This article is part of the Research Topic: Advancing Biodiversity Assessment in Protected Areas through Remote Sensing Techniques

SenFus-CHCNet: A Multi-Resolution Fusion Framework for Sparse-Supervised Canopy Height Classification

Provisionally accepted
  • 1Faculty of Mathematics and Informatics, Hanoi University of Science and Technology, Hanoi, Vietnam
  • 2VinUniversity, Hanoi, Vietnam

The final, formatted version of the article will be published soon.

Accurate forest canopy height mapping is critical for understanding ecosystem structure, monitoring biodiversity, and supporting climate change mitigation strategies. In this paper, we present SenFus-CHCNet, a novel deep learning architecture designed to produce high-resolution canopy height classification maps by fusing multispectral (Sentinel-2) and synthetic aperture radar (SAR, Sentinel-1) imagery with GEDI LiDAR data. The proposed model comprises two main components: a Multi-source and Multi-band Fusion Module that effectively integrates data of varying spatial resolutions through resolution-aware embedding and aggregation, and a Pixel-wise Classification Module based on a customized U-Net architecture optimized for sparse supervision. To discretize continuous canopy height values, we evaluate three classification schemes (coarse, medium, and fine-grained), each balancing ecological interpretability with model learning efficiency. Extensive experiments conducted over complex forested landscapes in northern Vietnam demonstrate that SenFus-CHCNet outperforms state-of-the-art baselines, including both convolutional and transformer-based models, achieving up to a 4.5% improvement in relaxed accuracy (RA±1) and a 10% gain in F1-score. Qualitative evaluations confirm that the predicted maps preserve fine-scale structural detail and ecologically meaningful spatial patterns, even in regions with sparse GEDI coverage. Our findings highlight the effectiveness of deep fusion learning for canopy height estimation, particularly in resource-limited settings. SenFus-CHCNet provides a scalable and interpretable approach for forest monitoring at regional and national scales, with promising implications for biodiversity conservation, carbon accounting, and land-use planning.
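
The abstract describes multi-resolution fusion, sparse GEDI supervision, and relaxed-accuracy evaluation without implementation detail. The sketch below is not the authors' code; it is a minimal illustration, under assumed conventions (Sentinel-2 bands at 10 m, Sentinel-1 bands resampled from 20 m, GEDI labels rasterized onto the 10 m grid with -1 marking unlabeled pixels, and a small convolutional stand-in for the customized U-Net), of how sparse-supervised pixel-wise classification and an RA±1 metric could be set up.

```python
# Minimal sketch (not the paper's implementation): sparse-supervised
# pixel-wise canopy height classification over multi-resolution inputs.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_CLASSES = 5      # e.g. a coarse height scheme; the paper's bin edges are not given here
IGNORE_INDEX = -1    # pixels without a GEDI footprint

def fuse_inputs(s2_10m: torch.Tensor, s1_20m: torch.Tensor) -> torch.Tensor:
    """Upsample the coarser SAR bands to the 10 m grid and concatenate.
    Simple bilinear alignment here stands in for the paper's
    resolution-aware embedding and aggregation."""
    s1_up = F.interpolate(s1_20m, size=s2_10m.shape[-2:], mode="bilinear",
                          align_corners=False)
    return torch.cat([s2_10m, s1_up], dim=1)

class TinyPixelClassifier(nn.Module):
    """Stand-in for the customized U-Net: a few conv layers ending in a
    per-pixel class logit map."""
    def __init__(self, in_ch: int, num_classes: int = NUM_CLASSES):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, num_classes, 1),
        )
    def forward(self, x):
        return self.net(x)

def sparse_ce_loss(logits, labels):
    """Cross-entropy computed only where GEDI labels exist (sparse supervision)."""
    return F.cross_entropy(logits, labels, ignore_index=IGNORE_INDEX)

def relaxed_accuracy(logits, labels, tol: int = 1):
    """RA±1: a prediction counts as correct if it is within `tol` classes
    of the reference class, evaluated on labeled pixels only."""
    preds = logits.argmax(dim=1)
    mask = labels != IGNORE_INDEX
    return ((preds[mask] - labels[mask]).abs() <= tol).float().mean()

if __name__ == "__main__":
    # Toy shapes: 4 Sentinel-2 bands at 64x64 (10 m), 2 Sentinel-1 bands
    # at 32x32 (20 m), and sparse labels on the 10 m grid.
    s2 = torch.randn(2, 4, 64, 64)
    s1 = torch.randn(2, 2, 32, 32)
    labels = torch.full((2, 64, 64), IGNORE_INDEX, dtype=torch.long)
    labels[:, ::8, ::8] = torch.randint(0, NUM_CLASSES, (2, 8, 8))  # sparse GEDI hits
    x = fuse_inputs(s2, s1)
    logits = TinyPixelClassifier(in_ch=x.shape[1])(x)
    print("loss:", sparse_ce_loss(logits, labels).item())
    print("RA±1:", relaxed_accuracy(logits, labels).item())
```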

Keywords: Canopy height estimation, Pixel-wise classification, multi-resolution fusion, sparse supervision, GEDI, Sentinel-1, Sentinel-2

Received: 15 Jul 2025; Accepted: 26 Sep 2025.

Copyright: © 2025 Bui, Nguyen Vi, Vu-Duc and Kamel. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Nidal Kamel, nidal.k@vinuni.edu.vn

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.