ORIGINAL RESEARCH article
Front. Signal Process.
Sec. Image Processing
Volume 5 - 2025 | doi: 10.3389/frsip.2025.1672569
This article is part of the Research Topic: Unveiling the Decision Veil: Explainable AI in Medical Imaging
ASG-MammoNet: An Attention-Guided Framework for Streamlined and Interpretable Breast Cancer Classification from Mammograms
Provisionally accepted
1 Brunel University of London, Department of Electronic and Electrical Engineering, Uxbridge UB8 3PH, United Kingdom
2 School of Information and Communication Engineering, North University of China, Taiyuan 030051, China
Breast cancer remains the most frequently diagnosed cancer and a leading cause of cancer-related death among women globally, emphasising the urgent need for early, accurate, and interpretable diagnostic tools. While digital mammography serves as the cornerstone of breast cancer screening, its diagnostic performance is often hindered by variability in image quality, dense breast tissue, and limited visual interpretability. Furthermore, conventional Computer-Aided Diagnosis (CAD) systems and deep learning models have struggled to achieve clinical adoption owing to high false-positive rates, opaque decision-making, and excessive computational demands. To address these critical challenges, we introduce ASG-MammoNet, an Attention-Guided and Streamlined deep learning framework for robust, real-time, and explainable mammographic breast cancer classification. The framework comprises three integrated stages: (1) Data Preparation and Balanced Feature Representation, which applies advanced preprocessing, augmentation, and weighted sampling to mitigate class imbalance and acquisition variability across datasets; (2) Attention-Guided Streamlined Classification, in which an EfficientNet-B0 backbone is enhanced by a dual-stage Convolutional Block Attention Module (CBAM) to selectively emphasise diagnostically relevant features; and (3) Explainable Inference, in which Gradient-weighted Class Activation Mapping (Grad-CAM) provides class-specific visualisations of lesion regions, supporting interpretability and clinical decision-making. ASG-MammoNet is thoroughly validated on three benchmark mammography datasets, CBIS-DDSM, INbreast, and MIAS, achieving accuracy above 99.1%, AUC scores exceeding 99.6%, and DIP (Distance from Ideal Position) scores above 0.99, with an average inference time under 14 milliseconds per image. The model consistently matches or outperforms recent state-of-the-art approaches while offering a superior balance between sensitivity and specificity, and its strong generalisability, consistent fold-wise performance, and clinically meaningful attention visualisations support its readiness for real-world deployment. By addressing critical limitations such as high computational cost, limited interpretability, and suboptimal precision, ASG-MammoNet represents a practical and reliable solution for AI-assisted breast cancer diagnosis in modern screening settings.
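To make the three-stage pipeline concrete, the sketches below illustrate each stage in PyTorch. They are minimal, hedged reconstructions from the description above, not the authors' released implementation; all class names, layer placements, and hyperparameters are illustrative assumptions. The first sketch shows how the inverse-frequency weighted sampling of stage (1) can be realised with PyTorch's WeightedRandomSampler, here applied to synthetic placeholder tensors rather than real mammograms.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset, WeightedRandomSampler

# Placeholder stand-in for a mammogram dataset: 90 benign (0) vs 10 malignant (1).
labels = torch.cat([torch.zeros(90, dtype=torch.long), torch.ones(10, dtype=torch.long)])
images = torch.randn(100, 3, 224, 224)  # synthetic tensors, not real mammograms
dataset = TensorDataset(images, labels)

# Inverse-frequency weights: samples from the rarer class are drawn more often,
# so each mini-batch is approximately class-balanced in expectation.
class_counts = torch.bincount(labels).float()   # [90., 10.]
sample_weights = (1.0 / class_counts)[labels]   # one weight per sample
sampler = WeightedRandomSampler(sample_weights, num_samples=len(dataset), replacement=True)

loader = DataLoader(dataset, batch_size=16, sampler=sampler)
xb, yb = next(iter(loader))  # yb is roughly half benign, half malignant on average
```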
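The second sketch shows one plausible realisation of stage (2): a torchvision EfficientNet-B0 backbone with CBAM blocks in the standard channel-then-spatial formulation. The two insertion points for the "dual-stage" attention (after the 112-channel stage and after the final 1280-channel feature map) are our assumption, since the abstract does not specify where the modules sit.

```python
import torch
import torch.nn as nn
from torchvision.models import efficientnet_b0

class CBAM(nn.Module):
    """Convolutional Block Attention Module: channel attention, then spatial attention."""
    def __init__(self, channels: int, reduction: int = 16, kernel_size: int = 7):
        super().__init__()
        # Channel attention: shared MLP over globally avg- and max-pooled descriptors.
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        # Spatial attention: 7x7 conv over the channel-wise mean and max maps.
        self.spatial = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        chan = torch.sigmoid(self.mlp(x.mean(dim=(2, 3))) + self.mlp(x.amax(dim=(2, 3))))
        x = x * chan.view(b, c, 1, 1)
        sp = torch.cat([x.mean(dim=1, keepdim=True), x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.spatial(sp))

class ASGMammoNetSketch(nn.Module):
    """Illustrative reconstruction; the real ASG-MammoNet may differ in detail."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = efficientnet_b0(weights=None).features  # pretrained weights optional
        self.cbam_mid = CBAM(112)    # after features[5] of EfficientNet-B0 (112 channels)
        self.cbam_out = CBAM(1280)   # after the final feature map (1280 channels)
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.head = nn.Linear(1280, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for i, stage in enumerate(self.features):
            x = stage(x)
            if i == 5:               # mid-network attention stage (assumed placement)
                x = self.cbam_mid(x)
        x = self.cbam_out(x)
        return self.head(self.pool(x).flatten(1))

model = ASGMammoNetSketch()
logits = model(torch.randn(2, 3, 224, 224))  # grayscale mammograms replicated to 3 channels
```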
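Finally, stage (3) can be reproduced with a minimal hook-based Grad-CAM, sketched below. Targeting the last attention block (`model.cbam_out` from the sketch above) is a common default and an assumption here, not a detail confirmed by the paper.

```python
import torch
import torch.nn.functional as F

def grad_cam(model, x, target_layer, class_idx=None):
    """Grad-CAM: weight the target layer's activations by the spatially averaged
    gradients of the chosen class score, then ReLU and upsample to input size."""
    acts, grads = {}, {}
    h_fwd = target_layer.register_forward_hook(lambda m, i, o: acts.update(a=o))
    h_bwd = target_layer.register_full_backward_hook(lambda m, gi, go: grads.update(g=go[0]))
    try:
        logits = model(x)
        if class_idx is None:
            idx = logits.argmax(dim=1, keepdim=True)   # explain the predicted class
        else:
            idx = torch.full((x.size(0), 1), class_idx, dtype=torch.long, device=x.device)
        model.zero_grad()
        logits.gather(1, idx).sum().backward()
    finally:
        h_fwd.remove()
        h_bwd.remove()
    weights = grads["g"].mean(dim=(2, 3), keepdim=True)            # per-channel importance
    cam = F.relu((weights * acts["a"]).sum(dim=1, keepdim=True))   # weighted activations
    cam = F.interpolate(cam, size=x.shape[-2:], mode="bilinear", align_corners=False)
    return (cam / cam.amax(dim=(2, 3), keepdim=True).clamp(min=1e-8)).detach()

# Hypothetical usage with the sketch above; the heatmap can be overlaid on the
# mammogram for radiologist review.
# cam = grad_cam(model, xb[:1], model.cbam_out, class_idx=1)  # heatmap for "malignant"
```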
Keywords: breast cancer diagnosis, computer-aided diagnosis, deep learning, EfficientNet, Convolutional Block Attention Module (CBAM), Gradient-weighted Class Activation Mapping (Grad-CAM), mammography
Received: 24 Jul 2025; Accepted: 15 Oct 2025.
Copyright: © 2025 Ahmed and Nandi. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence: Asoke K Nandi, asoke.nandi@brunel.ac.uk
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.