ORIGINAL RESEARCH article
Front. Physiol.
Sec. Gastrointestinal Sciences
Volume 16 - 2025 | doi: 10.3389/fphys.2025.1666311
This article is part of the Research Topic: Next-Generation Technologies in Assessing Gastrointestinal Health and Disease
Development of a Convolutional Neural Network-Based AI-Assisted Multi-Task Colonoscopy Withdrawal Quality Control System (with Video)
Provisionally accepted
1 Changshu No.1 People's Hospital, Suzhou, China
2 First People's Hospital of Changshu City, Changshu, China
3 Changshu Hospital of Traditional Chinese Medicine, Changshu, China
Background: Colonoscopy is a crucial method for the screening and diagnosis of colorectal cancer, with the withdrawal phase directly impacting the adequacy of mucosal inspection and the detection rate of lesions. This study establishes a convolutional neural network-based artificial intelligence system for multitask withdrawal quality control, encompassing monitoring of withdrawal speed, total withdrawal time, and effective withdrawal time (EWT).

Methods: This study integrated colonoscopy images and video data from three medical centers, annotated into three categories: ileocecal part, instrument operation, and normal mucosa. The model was built upon the pre-trained YOLOv11 series networks, employing transfer learning and fine-tuning strategies. Evaluation metrics included accuracy, precision, sensitivity, and the area under the curve (AUC). Based on the best-performing model, the Laplacian operator was applied to automatically identify and eliminate blurred frames, while a perceptual hash algorithm was utilized to monitor withdrawal speed in real time. Ultimately, a multitask withdrawal quality control system, EWT-SpeedNet, was developed, and its effectiveness was preliminarily validated through human-machine comparison experiments.

Results: Among the four YOLOv11 models, YOLOv11m demonstrated the best performance, achieving an accuracy of 96.00% and a precision of 96.38% on the validation set, both surpassing those of the other models. On the test set, its weighted average precision, sensitivity, specificity, F1 score, accuracy, and AUC reached 96.58%, 96.44%, 97.64%, 96.38%, 96.44%, and 0.9975, respectively, with an inference speed of 86.78 FPS. Grad-CAM visualizations revealed that the model accurately focused on key mucosal features. In human-machine comparison experiments involving 48 colonoscopy videos, the AI system exhibited a high degree of consistency with expert endoscopists in measuring EWT (ICC = 0.969, 95% CI: 0.941 to 0.984; r = 0.972, p < 0.001), though with a slight underestimation (bias = -11.1 s, 95% LoA: -70.5 to 48.3 s).

Conclusion: The EWT-SpeedNet withdrawal quality control system enables real-time visualization of withdrawal speed during colonoscopy and automatically calculates both the total and effective withdrawal times, thereby supporting standardized and efficient procedure monitoring.
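The Methods describe using the Laplacian operator to flag and discard blurred frames. A standard way to do this is the variance-of-Laplacian focus measure: convolve the grayscale frame with a discrete Laplacian kernel and treat a low response variance as blur. The sketch below (numpy only) illustrates that idea; the threshold value is hypothetical, as the paper's abstract does not report the one used in EWT-SpeedNet.

```python
import numpy as np

# Discrete 3x3 Laplacian kernel commonly used for focus measures
LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=np.float64)

def laplacian_variance(gray: np.ndarray) -> float:
    """Variance of the Laplacian response over a grayscale frame.

    Sharp frames have strong edges and hence a high-variance response;
    blurred frames produce a flat, low-variance response.
    """
    g = gray.astype(np.float64)
    # 'valid' 2-D convolution via sliding windows (kernel is symmetric,
    # so correlation and convolution coincide)
    windows = np.lib.stride_tricks.sliding_window_view(g, (3, 3))
    response = np.einsum('ijkl,kl->ij', windows, LAPLACIAN)
    return float(response.var())

def is_blurred(gray: np.ndarray, threshold: float = 100.0) -> bool:
    """Flag a frame as blurred; `threshold` is an illustrative value only."""
    return laplacian_variance(gray) < threshold
```

In a quality-control pipeline, frames flagged by `is_blurred` would simply be excluded before the effective withdrawal time is accumulated.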
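The Methods also mention a perceptual hash algorithm for real-time withdrawal-speed monitoring. The intuition is that the Hamming distance between the hashes of consecutive frames grows with apparent scope motion: near-identical frames hash alike, fast movement produces dissimilar hashes. The sketch below uses the average-hash variant as one common choice; the abstract does not specify which perceptual hash EWT-SpeedNet employs, so this is an assumption for illustration.

```python
import numpy as np

def average_hash(gray: np.ndarray, hash_size: int = 8) -> np.ndarray:
    """64-bit average hash: downsample to hash_size x hash_size by block
    means, then threshold each block at the global mean."""
    h, w = gray.shape
    bh, bw = h // hash_size, w // hash_size
    # crop so the frame divides evenly into blocks
    g = gray[:bh * hash_size, :bw * hash_size].astype(np.float64)
    blocks = g.reshape(hash_size, bh, hash_size, bw).mean(axis=(1, 3))
    return (blocks > blocks.mean()).flatten()

def hamming_distance(h1: np.ndarray, h2: np.ndarray) -> int:
    return int(np.count_nonzero(h1 != h2))

def motion_score(prev_gray: np.ndarray, cur_gray: np.ndarray) -> int:
    """Inter-frame hash distance as a cheap proxy for withdrawal speed:
    0 for identical frames, larger values for faster apparent motion."""
    return hamming_distance(average_hash(prev_gray), average_hash(cur_gray))
```

A running average of `motion_score` over recent frames could then drive the real-time speed display, with a warning when it exceeds a calibrated limit.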
Keywords: artificial intelligence, YOLO, colonoscopy, effective withdrawal time, colonoscope withdrawal speed, blur detection
Received: 15 Jul 2025; Accepted: 29 Sep 2025.
Copyright: © 2025 Chen, Zhu, Shen, Xia, Xu and Wang. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence:
Xiaodan Xu, xxddocter@gmail.com
Ganhong Wang, 651943259@qq.com
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.