ORIGINAL RESEARCH article
Front. Artif. Intell.
Sec. Medicine and Public Health
This article is part of the Research Topic: Advances in Artificial Intelligence for Early Cancer Detection and Precision Oncology.
Rectal Cancer Segmentation via HHF-SAM: A Hierarchical Hypercolumn-Guided Fusion Segment Anything Model
Provisionally accepted
- 1 Department of Pathology, the First Hospital of China Medical University, Shenyang, China
- 2 Department of Nephrology, the Fourth People's Hospital of Shenyang, Shenyang, China
- 3 Department of Neonatology, Guangzhou Key Laboratory of Neonatal Intestinal Diseases, The Third Affiliated Hospital of Guangzhou Medical University, Guangzhou, China
Rectal cancer is one of the most common cancers worldwide. Segmenting rectal cancer in abdominal CT images can assist doctors in making more accurate decisions in complex or uncertain cases, thereby improving overall diagnostic accuracy. Because of the low contrast between different tissues and the significant noise present in medical images, existing segmentation methods often struggle to accurately delineate the boundaries of lesion areas. To meet the growing demand for more refined segmentation techniques, this study proposes a novel framework, the Hierarchical Hypercolumn-guided Fusion Segment Anything Model (HHF-SAM). The framework fully leverages the SAM model's strong understanding of natural images and enhances its adaptability to medical-specific features through a specially designed adapter module. Given the varying sizes and shapes of lesion areas, we introduce a Multi-scale Hypercolumn Processing Module, which generates multi-scale features to provide comprehensive guidance for the decoder. Additionally, the low contrast that blurs the boundaries between cancerous and healthy regions makes it difficult for the simple convolutional decoder of the original SAM to differentiate the complex details of lesions. To address this, we propose the Progressive Hierarchical Fusion Decoder, which aggregates multi-scale features and predicts the final segmentation result. Finally, we introduce a multi-level supervision mechanism to further optimize the network parameters. Performance tests on two large public abdominal CT datasets demonstrate that HHF-SAM outperforms other typical segmentation methods in segmentation accuracy. The code will be made available for model reproduction.
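The abstract's Multi-scale Hypercolumn Processing Module builds on the classical hypercolumn idea: feature maps from several encoder stages are resampled to a common resolution and stacked channel-wise, so every pixel carries both coarse semantic and fine boundary information. The paper's exact module is not described here, so the following is only a minimal NumPy sketch of generic hypercolumn construction; the function names (`upsample_nearest`, `build_hypercolumn`) and the toy feature shapes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def upsample_nearest(feat, target_hw):
    """Nearest-neighbor upsample a (C, H, W) feature map to target_hw."""
    c, h, w = feat.shape
    th, tw = target_hw
    rows = np.arange(th) * h // th  # map each target row to a source row
    cols = np.arange(tw) * w // tw  # map each target col to a source col
    return feat[:, rows][:, :, cols]

def build_hypercolumn(features):
    """Stack multi-scale feature maps into per-pixel hypercolumns.

    features: list of (C_i, H_i, W_i) arrays, coarse to fine.
    Returns a (sum(C_i), H, W) array at the finest resolution, where each
    spatial location holds features from every scale.
    """
    target_hw = features[-1].shape[1:]
    upsampled = [upsample_nearest(f, target_hw) for f in features]
    return np.concatenate(upsampled, axis=0)

# Toy multi-scale encoder outputs: coarse 8x8, mid 16x16, fine 32x32
rng = np.random.default_rng(0)
feats = [rng.random((c, s, s)) for c, s in [(4, 8), (2, 16), (1, 32)]]
hyper = build_hypercolumn(feats)
print(hyper.shape)  # (7, 32, 32): 4 + 2 + 1 channels at the finest grid
```

In a real SAM-style pipeline the resampling would typically be bilinear (e.g. `torch.nn.functional.interpolate`) and the stacked hypercolumn would feed a fusion decoder rather than be used directly.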
Keywords: deep learning, Medical Image Analysis, multi-scale, Rectal Cancer Segmentation, Segment Anything Model
Received: 01 Sep 2025; Accepted: 08 Dec 2025.
Copyright: © 2025 Ye, Ying, Xiaohong, Zhoushan and Congcong. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence: Feng Zhoushan
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.
