AUTHOR=Bhattacharya Siddhartha, Wasit Aarham, Earles J. Mason, Nitin Nitin, Yi Jiyoon
TITLE=Enhancing AI microscopy for foodborne bacterial classification using adversarial domain adaptation to address optical and biological variability
JOURNAL=Frontiers in Artificial Intelligence
VOLUME=8
YEAR=2025
URL=https://www.frontiersin.org/journals/artificial-intelligence/articles/10.3389/frai.2025.1632344
DOI=10.3389/frai.2025.1632344
ISSN=2624-8212
ABSTRACT=AI-enabled microscopy is emerging for rapid bacterial classification, yet its utility remains limited in dynamic or resource-limited settings due to imaging variability. This study aims to enhance the generalizability of AI microscopy using domain adaptation techniques. Six bacterial species, including three Gram-positive (Bacillus coagulans, Bacillus subtilis, Listeria innocua) and three Gram-negative (Escherichia coli, Salmonella Enteritidis, Salmonella Typhimurium), were grown into microcolonies on soft tryptic soy agar plates at 37°C for 3–5 h. Images were acquired under varying microscopy modalities and magnifications. Domain-adversarial neural networks (DANNs) addressed variation from a single target domain, while multi-DANNs (MDANNs) handled multiple target domains simultaneously. An EfficientNetV2 backbone provided fine-grained feature extraction suited to small targets, and few-shot learning enhanced scalability in data-limited domains. The source domain contained 105 images per species (n = 630) collected under optimal conditions (phase contrast, 60× magnification, 3-h incubation). Target domains introduced variations in modality (brightfield, BF), lower magnification (20×), and extended incubation (20×-5h), each with <5 labeled training images per species (n ≤ 30) and test datasets of 60–90 images. DANNs improved target-domain classification accuracy by up to 54.5% for 20× (34.4% to 88.9%), 43.3% for 20×-5h (40.0% to 83.3%), and 31.7% for BF (43.4% to 73.3%), with minimal accuracy loss in the source domain. MDANNs further improved accuracy in the BF domain from 73.3% to 76.7%. Grad-CAM and t-SNE feature visualizations confirmed the model's ability to learn domain-invariant features across conditions. This study presents a scalable and adaptable framework for bacterial classification, extending the utility of microscopy to decentralized and resource-limited settings where imaging variability often challenges performance.
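
As a rough illustration of the adversarial domain adaptation the abstract describes, the sketch below pairs an EfficientNetV2 feature extractor with a species classifier and a domain discriminator trained through a gradient-reversal layer, the standard DANN construction. The head sizes, the fixed reversal strength lambd, and the use of torchvision's efficientnet_v2_s are illustrative assumptions, not the authors' exact configuration.

# Minimal DANN sketch (assumed configuration, not the paper's exact model):
# an EfficientNetV2 backbone, a species classification head, and a domain
# discriminator fed through a gradient-reversal layer so that the learned
# features become domain-invariant.
import torch
import torch.nn as nn
from torchvision.models import efficientnet_v2_s


class GradReverse(torch.autograd.Function):
    """Identity on the forward pass; flips and scales gradients on the backward pass."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


class DANN(nn.Module):
    def __init__(self, num_classes=6, num_domains=2, lambd=1.0):
        super().__init__()
        backbone = efficientnet_v2_s(weights=None)  # pretrained weights optional
        feat_dim = backbone.classifier[1].in_features
        backbone.classifier = nn.Identity()         # keep only the feature extractor
        self.features = backbone
        self.lambd = lambd
        self.class_head = nn.Linear(feat_dim, num_classes)  # six bacterial species
        self.domain_head = nn.Sequential(                   # source vs. target domain(s)
            nn.Linear(feat_dim, 256), nn.ReLU(), nn.Linear(256, num_domains)
        )

    def forward(self, x):
        f = self.features(x)
        class_logits = self.class_head(f)
        domain_logits = self.domain_head(GradReverse.apply(f, self.lambd))
        return class_logits, domain_logits


# One combined training step on a toy batch: the species loss is computed on
# labeled images, the domain loss on images from both domains, and the
# gradient-reversal layer pushes the backbone toward domain-invariant features.
# An MDANN variant would simply raise num_domains above 2.
model = DANN()
images = torch.randn(4, 3, 224, 224)
species = torch.tensor([0, 1, 2, 3])
domains = torch.tensor([0, 0, 1, 1])
class_logits, domain_logits = model(images)
loss = nn.functional.cross_entropy(class_logits, species) + \
       nn.functional.cross_entropy(domain_logits, domains)
loss.backward()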