
REVIEW article

Front. Oncol.

Sec. Gynecological Oncology

Volume 15 - 2025 | doi: 10.3389/fonc.2025.1592078

This article is part of the Research Topic: Radiomics and Artificial Intelligence in Oncology Imaging.

Prospects and challenges of deep learning in gynaecological malignancies

Provisionally accepted
  • University-Town Hospital of Chongqing Medical University, Chongqing, China

The final, formatted version of the article will be published soon.

Artificial intelligence (AI) is revolutionizing oncology, with deep learning (DL) emerging as a pivotal technology for addressing gynecological malignancies (GMs). DL-based models are now widely applied to assist in clinical diagnosis and prognosis prediction, demonstrating excellent performance in tasks such as tumor detection, segmentation, classification, and necrosis assessment for both primary and metastatic GMs. By leveraging radiological (e.g., X-ray, CT, MRI, SPECT) and pathological images, these approaches show significant potential for enhancing diagnostic accuracy and prognostic evaluation. This review provides a concise overview of deep learning techniques for medical image analysis and their current applications in GM diagnosis and outcome prediction. Furthermore, it discusses key challenges and future directions in the field. AI-based radiomics presents a non-invasive and cost-effective tool for gynecological practice, and the integration of multi-omics data is recommended to further advance precision medicine in oncology.

Keywords: artificial intelligence (AI), machine learning (ML), deep learning (DL), gynaecological malignancies (GMs), prospects and challenges

Received: 12 Mar 2025; Accepted: 23 Sep 2025.

Copyright: © 2025 Zhang and Qin. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Qin Qin, fengwonderful@139.com

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.