AUTHOR=Guo Yuqiang, Yuan Tinggang, Yang Mulin, Qiu Jinyu
TITLE=Does the “learning effect” caused by digital devices exaggerate sports visual training outcomes? A systematic review and meta-analysis
JOURNAL=Frontiers in Physiology
VOLUME=Volume 16 - 2025
YEAR=2025
URL=https://www.frontiersin.org/journals/physiology/articles/10.3389/fphys.2025.1664572
DOI=10.3389/fphys.2025.1664572
ISSN=1664-042X
ABSTRACT=
Objective: Digital-based visual training (VT) is widely employed to improve visual-cognitive performance, yet its efficacy may be confounded by the “learning effect”.
Methods: A systematic literature search was conducted across PubMed, Web of Science, MEDLINE, SPORTDiscus, and the Cochrane Library, covering all studies published up to 8 May 2025. The search was limited to peer-reviewed articles written in English. Only randomized controlled trials (RCTs) that included both baseline and post-intervention measures of visual-cognitive performance were eligible. Subgroup analysis was conducted based on the presence or absence of task similarity between training and testing conditions, to assess potential bias introduced by the “learning effect”.
Results: The search identified 3,798 articles, of which 33 RCTs involving 1,048 participants met the inclusion criteria for meta-analysis. VT was found to significantly improve visual attention, reaction time, decision-making time, decision-making accuracy, and eye–hand coordination. Subgroup analyses revealed that studies classified as “learning effect present” (LE+) consistently reported substantially larger effect sizes than those without (LE−). Significant between-group differences were observed for visual attention (SMD = 1.65 vs. 0.07; p = 0.00), reaction time (SMD = 2.66 vs. 0.50; p = 0.00), and decision-making accuracy (SMD = 1.46 vs. 0.62; p = 0.03), indicating that task similarity may artificially inflate performance outcomes.
Conclusion: These findings indicate that observed improvements may reflect task familiarity rather than true cognitive enhancement. To improve evaluation validity, future studies should avoid task redundancy, incorporate retention testing, and adopt structurally distinct outcome measures.
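
The effect sizes reported above are standardized mean differences (SMDs). As a rough illustration of how such a value is typically derived, the sketch below computes Hedges' g (Cohen's d with a small-sample correction) from pooled group statistics. The group means, standard deviations, and sample sizes are hypothetical placeholders, and the estimator shown is a standard choice rather than the exact one used in the review.

```python
import math

def hedges_g(mean_exp, sd_exp, n_exp, mean_ctrl, sd_ctrl, n_ctrl):
    """Standardized mean difference (Hedges' g) between two groups.

    Divides the raw mean difference by the pooled standard deviation,
    then applies the small-sample correction J = 1 - 3 / (4*df - 1).
    """
    df = n_exp + n_ctrl - 2
    pooled_sd = math.sqrt(
        ((n_exp - 1) * sd_exp**2 + (n_ctrl - 1) * sd_ctrl**2) / df
    )
    d = (mean_exp - mean_ctrl) / pooled_sd   # Cohen's d
    j = 1 - 3 / (4 * df - 1)                 # small-sample correction factor
    return j * d

# Hypothetical reaction-time improvement scores (ms): VT group vs. control
g = hedges_g(mean_exp=45.0, sd_exp=20.0, n_exp=18,
             mean_ctrl=12.0, sd_ctrl=22.0, n_ctrl=17)
print(f"SMD (Hedges' g) = {g:.2f}")
```

In a subgroup comparison such as LE+ versus LE−, each study contributes an SMD of this kind, and the pooled subgroup estimates are then tested for a between-group difference; the p-values quoted in the Results refer to that test, not to the individual study effects.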