AUTHOR=Wu Yizhen, Qiao Yuling, Wu Licheng, Gao Minglin, Wong Tsz Yiu, Li Jingyun, Wang Zhimeng, Zhao Xu, Zhao Hui, Fan Xiwang
TITLE=A virtual reality-based multimodal framework for adolescent depression screening using machine learning
JOURNAL=Frontiers in Psychiatry
VOLUME=Volume 16 - 2025
YEAR=2025
URL=https://www.frontiersin.org/journals/psychiatry/articles/10.3389/fpsyt.2025.1655554
DOI=10.3389/fpsyt.2025.1655554
ISSN=1664-0640
ABSTRACT=Background: Major depressive disorder (MDD) in adolescents poses an increasing global health concern, yet current screening practices rely heavily on subjective reports. Virtual reality (VR), integrated with multimodal physiological sensing (EEG + ET + HRV), offers a promising pathway for more objective diagnostics. Methods: In this case-control study, 51 adolescents diagnosed with first-episode MDD and 64 healthy controls participated in a 10-minute VR-based emotional task. Electroencephalography (EEG), eye-tracking (ET), and heart rate variability (HRV) data were collected in real time. Key physiological differences were identified via statistical analysis, and a support vector machine (SVM) model was trained to classify MDD status based on selected features. Results: Adolescents with MDD showed significantly higher EEG theta/beta ratios, reduced saccade counts, longer fixation durations, and elevated HRV LF/HF ratios (all p < .05). The theta/beta and LF/HF ratios were both significantly associated with depression severity. The SVM model achieved 81.7% classification accuracy with an AUC of 0.921. Conclusions: The proposed VR-based multimodal system identified robust physiological biomarkers associated with adolescent MDD and demonstrated strong diagnostic performance. These findings support the utility of immersive, sensor-integrated platforms in early mental health screening and intervention. Future work may explore integrating the proposed multimodal system into wearable or mobile platforms for scalable, real-world mental health screening.
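The abstract reports an SVM trained on selected EEG, eye-tracking, and HRV features to separate MDD from control participants. A minimal sketch of such a pipeline is shown below; the feature names, the RBF kernel, the placeholder data, and the cross-validation scheme are assumptions for illustration, not details taken from the paper.

```python
# Sketch of an SVM screening classifier on multimodal physiological features.
# Feature set, kernel choice, and evaluation scheme are assumptions; the
# abstract only states that an SVM was trained on selected EEG/ET/HRV features.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold, cross_validate

rng = np.random.default_rng(0)

# Placeholder data: 115 participants (51 MDD, 64 controls) with four
# features each: theta/beta ratio, saccade count, mean fixation duration,
# and HRV LF/HF ratio. Real values would come from the VR session recordings.
n_mdd, n_hc = 51, 64
X = rng.normal(size=(n_mdd + n_hc, 4))
y = np.concatenate([np.ones(n_mdd), np.zeros(n_hc)])

# Standardize features before the SVM; probability estimates enable AUC scoring.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_validate(model, X, y, cv=cv, scoring=["accuracy", "roc_auc"])
print("accuracy: %.3f" % scores["test_accuracy"].mean())
print("AUC:      %.3f" % scores["test_roc_auc"].mean())
```

With real features, standardization matters because the four measures live on very different scales (ratios vs. counts vs. durations), and stratified cross-validation keeps the MDD/control balance comparable across folds.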