AUTHOR=Chen Yingna, Li Feifan, Dai Zhuoheng, Liu Ying, Huang Shengsong, Cheng Qian
TITLE=Supervised contrastive loss helps uncover more robust features for photoacoustic prostate cancer identification
JOURNAL=Frontiers in Oncology
VOLUME=15
YEAR=2025
URL=https://www.frontiersin.org/journals/oncology/articles/10.3389/fonc.2025.1592815
DOI=10.3389/fonc.2025.1592815
ISSN=2234-943X
ABSTRACT=Background: Photoacoustic spectral analysis has been shown to be effective in the diagnosis of prostate cancer (PCa), and with the incorporation of deep learning its discrimination accuracy has steadily improved. Nevertheless, individual heterogeneity remains a significant factor that degrades discrimination performance. Objective: To extract more reliable features from complex biological tissue and improve the discrimination accuracy of prostate cancer identification. Methods: Supervised contrastive learning is introduced to explore its performance in photoacoustic spectral feature extraction. Three models, a CNN-based model, a supervised contrastive (SC) model, and a supervised contrastive loss adjust (SCL-adjust) model, were compared, along with traditional feature extraction and machine learning-based methods. Results: The SCL-adjust model achieved the best performance, with an accuracy more than 10% higher than that of the traditional method. Moreover, the features extracted by this model are more robust under both uniform and Gaussian noise and under model transfer; compared with the CNN model, the transfer performance of the proposed model improved by approximately 5%. Conclusions: Supervised contrastive learning is integrated into photoacoustic spectral analysis and its effectiveness is verified. A comprehensive analysis is conducted on the performance improvement of the proposed SCL-adjust model in photoacoustic prostate cancer diagnosis, its resistance to noise, and its adaptability to the data heterogeneity of different systems.
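The supervised contrastive loss underlying the SC and SCL-adjust models generally follows the standard formulation (Khosla et al., 2020): each anchor embedding is pulled toward same-class embeddings and pushed away from all others. Below is a minimal NumPy sketch of that standard loss, not the paper's exact implementation; the function name, temperature default, and toy embeddings are illustrative assumptions.

```python
import numpy as np

def supcon_loss(features, labels, temperature=0.07):
    """Standard supervised contrastive loss (Khosla et al., 2020) -- a sketch,
    not the SCL-adjust variant from the paper.

    features: (N, D) array of L2-normalized embeddings.
    labels:   (N,) integer class labels.
    """
    features = np.asarray(features, dtype=np.float64)
    labels = np.asarray(labels)
    n = features.shape[0]
    # Pairwise cosine-similarity logits, scaled by temperature.
    logits = features @ features.T / temperature
    # Exclude self-similarity from the denominator.
    self_mask = np.eye(n, dtype=bool)
    logits = np.where(self_mask, -np.inf, logits)
    log_denom = np.log(np.exp(logits).sum(axis=1))  # log-sum over a != i
    # Positives: same label as the anchor, excluding the anchor itself.
    pos_mask = (labels[:, None] == labels[None, :]) & ~self_mask
    # Log-probability assigned to each candidate pair.
    log_prob = logits - log_denom[:, None]
    # Average log-probability over positives; skip anchors with no positives.
    pos_counts = pos_mask.sum(axis=1)
    valid = pos_counts > 0
    pos_log_prob = np.where(pos_mask, log_prob, 0.0).sum(axis=1)
    mean_log_prob_pos = pos_log_prob[valid] / pos_counts[valid]
    return float(-mean_log_prob_pos.mean())
```

On a toy batch, embeddings clustered by class yield a lower loss than the same embeddings with shuffled labels, which is the property the paper exploits to learn noise-robust, transfer-robust features.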