AUTHOR=Deng Hongfei, Wen Bin, Gu Cheng, Fan Yingjie TITLE=GrotUNet: a novel leaf segmentation method JOURNAL=Frontiers in Plant Science VOLUME=Volume 16 - 2025 YEAR=2025 URL=https://www.frontiersin.org/journals/plant-science/articles/10.3389/fpls.2025.1378958 DOI=10.3389/fpls.2025.1378958 ISSN=1664-462X ABSTRACT=In biology, current leaf segmentation methods still suffer from missed and duplicate detections when leaves are numerous, dense, mutually occluded, or have blurred boundaries. These problems stem from unsatisfactory semantic feature extraction and insufficient semantic parsing. To address them, this paper proposes GrotUNet, a novel leaf segmentation method that can be trained end-to-end. The network is redesigned in three respects: semantic feature encoding, skip connections, and multi-scale upsampling fusion. The semantic encoding structure consists of the GRblock, WGRblock, and OTblock modules. The first two draw on the design ideas of GoogLeNet's parallel branches and ResNet's residual connections, while OTblock further mines the fine-grained semantic information distributed across the feature maps extracted by WGRblock, enriching the feature representation. Unlike the dense connections of UNet++, the reconstructed skip connections use only 1×1 convolutions to fuse feature maps from different network levels, enriching the semantic information at every spatial location. The multi-scale upsampling fusion mechanism incorporates higher-level feature maps into each shallow decoding sub-network, effectively mitigating the loss of semantic information during decoding. The method is evaluated on the CVPPP, KOMATSUNA, and MSU-PID datasets. The experimental results show that GrotUNet outperforms existing methods such as UNet, ResUNet, UNet++, and Perspective + UNet. Compared with UNet++, GrotUNet improves the key evaluation metric (SBD) by 0.57%, 0.30%, and 0.27% on the three datasets, respectively.
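The abstract describes two architectural ideas that a small sketch can make concrete: fusing feature maps from different network levels with 1×1 convolutions instead of UNet++-style dense connections, and injecting upsampled higher-level features into a shallow decoding stage. The PyTorch code below is a minimal sketch of these two ideas only; the module names, channel sizes, and wiring are illustrative assumptions, not the authors' GrotUNet implementation (which additionally uses the GRblock, WGRblock, and OTblock encoder modules described above).

```python
# Minimal sketch of 1x1-convolution skip fusion and multi-scale upsampling
# fusion, assuming generic encoder/decoder feature maps. Not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SkipFusion1x1(nn.Module):
    """Fuse feature maps from different network levels with a single 1x1 conv."""

    def __init__(self, in_channels_list, out_channels):
        super().__init__()
        # A 1x1 convolution mixes the concatenated channels at every spatial
        # location without changing the spatial resolution.
        self.fuse = nn.Conv2d(sum(in_channels_list), out_channels, kernel_size=1)

    def forward(self, features):
        # Resize every feature map to the resolution of the first (shallowest)
        # one, concatenate along channels, then fuse with the 1x1 conv.
        target_size = features[0].shape[-2:]
        resized = [
            f if f.shape[-2:] == target_size
            else F.interpolate(f, size=target_size, mode="bilinear", align_corners=False)
            for f in features
        ]
        return self.fuse(torch.cat(resized, dim=1))


class MultiScaleUpFusion(nn.Module):
    """Inject an upsampled deeper feature map into a shallow decoding stage."""

    def __init__(self, shallow_channels, deep_channels, out_channels):
        super().__init__()
        self.fuse = nn.Conv2d(shallow_channels + deep_channels, out_channels, kernel_size=1)

    def forward(self, shallow, deep):
        # Upsample the deeper (lower-resolution) map to the shallow resolution
        # before fusing, so higher-level semantics reach the shallow decoder.
        deep_up = F.interpolate(deep, size=shallow.shape[-2:], mode="bilinear", align_corners=False)
        return self.fuse(torch.cat([shallow, deep_up], dim=1))


if __name__ == "__main__":
    # Toy feature maps standing in for encoder outputs at two resolutions.
    x1 = torch.randn(1, 64, 128, 128)   # shallow level
    x2 = torch.randn(1, 128, 64, 64)    # deeper level

    skip = SkipFusion1x1([64, 128], 64)
    dec = MultiScaleUpFusion(64, 128, 64)

    print(skip([x1, x2]).shape)  # torch.Size([1, 64, 128, 128])
    print(dec(x1, x2).shape)     # torch.Size([1, 64, 128, 128])
```

The design choice the sketch mirrors is that 1×1 fusion keeps the skip pathway cheap (no dense web of intermediate nodes as in UNet++) while still letting every spatial location see channels gathered from several network depths.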