AI for the Next Generation of Particle Physics Experiments

About this Research Topic

Submission deadlines

  • Manuscript summary submission deadline: 18 May 2026
  • Manuscript submission deadline: 5 September 2026

This Research Topic is currently accepting articles.

Background

The field of particle physics is approaching a critical horizon defined by challenges including unprecedented data volumes and detector complexity. Upcoming collider and neutrino experiments will produce exabytes of data, straining the limits of traditional storage, processing, and analysis pipelines, and will also demand sophisticated simulations that require substantial and costly computing resources. Simultaneously, searches for dark matter and new physics require sensitivity to increasingly rare and subtle signals, motivating both highly optimized targeted searches and model-agnostic, anomaly-based approaches that go beyond traditional theory-by-theory probes.

While Machine Learning (ML) has been successfully applied to offline analysis, its potential remains largely untapped in upstream areas. The particle physics experiments of the future are being designed now, and will operate in a fundamentally different way from the present; the next paradigm shift lies in moving AI/ML from a post-processing tool to a foundational element of experimental architecture, embedded directly into detector design, real-time data acquisition, and rigorous statistical inference. This integration is essential to navigate the "data deluge" and maximize the discovery potential of future facilities.



The primary goal of this Research Topic is to aggregate cutting-edge methodologies that integrate AI/ML into the full lifecycle of particle physics experiments. This means more than AI/ML for classification tasks: we aim to explore how modern techniques (e.g., differentiable programming, surrogate modeling) can optimize the physical design of detectors and accelerators before they are even built.
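To make the design-optimization idea concrete, the toy sketch below runs gradient descent on a detector parameter through a differentiable surrogate. The objective (a stochastic term that improves with absorber thickness plus a cost/leakage term that grows with it) and all coefficients are illustrative assumptions, not physical values; a real pipeline would differentiate a full simulation via autodiff rather than use an analytic gradient.

```python
import numpy as np

# Toy differentiable surrogate for a calorimeter's relative energy
# resolution as a function of absorber thickness t (arbitrary units):
# a stochastic term that improves with t plus a cost/leakage term
# that grows with t. Coefficients are illustrative, not physical.
def resolution(t, a=0.1, b=0.01):
    return a / np.sqrt(t) + b * t

def grad_resolution(t, a=0.1, b=0.01):
    # Analytic gradient; a real pipeline would use autodiff (e.g. JAX)
    # to differentiate through the full simulation chain.
    return -0.5 * a * t ** -1.5 + b

# Plain gradient descent on the design parameter.
t = 1.0
for _ in range(500):
    t -= 5.0 * grad_resolution(t)

# Analytic optimum: d(resolution)/dt = 0  =>  t* = (a / (2b)) ** (2/3)
t_star = (0.1 / 0.02) ** (2 / 3)
print(f"optimized t = {t:.3f}, analytic optimum = {t_star:.3f}")
```

The point of the sketch is only that once the design objective is differentiable, detector geometry becomes just another parameter in a gradient-based optimization loop.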

We also seek contributions that redefine simulation and analysis workflows, including novel pipeline architectures, agentic systems, and foundation models tailored to scientific data. These approaches offer new paradigms for large-scale data generation, processing, and interpretation.

Another focus of this collection is AI-driven solutions to real-time experimental challenges, such as nanosecond-latency inference on FPGAs for hardware triggers at the HL-LHC and autonomous data selection for high-throughput neutrino streams at DUNE.

Equally important is bridging the gap between "black box" predictions and rigorous physics results. We specifically encourage contributions that address the quantification of aleatoric and epistemic uncertainties within ML models, ensuring that AI-enhanced measurements maintain the statistical rigor required for precision Standard Model tests.
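As a minimal illustration of the aleatoric/epistemic distinction, the sketch below applies the standard deep-ensemble variance decomposition (law of total variance) to synthetic per-event predictions; the ensemble size and numbers are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Suppose an ensemble of M probabilistic regressors each returns a
# predictive mean and variance for one event (values are synthetic).
M = 5
means = rng.normal(loc=10.0, scale=0.2, size=M)  # member disagreement
variances = np.full(M, 0.5 ** 2)                 # per-member noise estimate

# Deep-ensemble decomposition via the law of total variance:
aleatoric = variances.mean()  # average intrinsic (irreducible) noise
epistemic = means.var()       # spread of the members' means (model uncertainty)
total = aleatoric + epistemic

print(f"aleatoric={aleatoric:.3f}  epistemic={epistemic:.3f}  total={total:.3f}")
```

Separating the two matters for physics results: epistemic uncertainty shrinks with more training data or better models, while aleatoric uncertainty sets an irreducible floor that must enter the systematic error budget.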

By connecting instrument design, autonomous operations, and advanced statistical inference, this Research Topic aims to define a roadmap for the "self-driving" particle physics experiments of the future.



We welcome original research covering AI/ML applications across the full experimental pipeline.

Key areas of interest include:

AI for Instrument Design: Differentiable simulation and Bayesian optimization for, e.g., detector geometry, and for accelerator tuning and control at proposed future colliders.

Intelligent Data Acquisition: Real-time edge ML, FPGA deployment, and smart trigger systems for high-rate environments (such as, but not limited to, HL-LHC or DUNE).

Analysis & Reconstruction: Neural networks for essential analysis components such as, but not limited to, tracking, clustering, simulation, and particle identification, as well as strategies for anomaly detection.

AI-enabled Workflows: New approaches to the practice of particle physics discovery, including optimizable (e.g., differentiable) pipelines and efforts towards autonomous science (e.g., probabilistic, agentic discovery platforms).

Foundation Models: Development and practical use of large-scale, pre-trained models for particle physics.

Uncertainty Quantification: Methods for reducing systematic errors, rigorous statistical interpretation of ML outputs, and unfolding techniques.

Authors are encouraged to demonstrate applicability to major upcoming experiments. Manuscripts must bridge the gap between computer science innovation and physical application.

Article types and fees

This Research Topic accepts the following article types, unless otherwise specified in the Research Topic description:

  • Brief Research Report
  • Clinical Trial
  • Community Case Study
  • Conceptual Analysis
  • Data Report
  • Editorial
  • FAIR² Data
  • FAIR² DATA Direct Submission
  • General Commentary

Articles that are accepted for publication by our external editors following rigorous peer review incur a publishing fee charged to Authors, institutions, or funders.

Keywords: Artificial intelligence in particle physics, Machine learning, Differentiable programming, Surrogate modeling, Intelligent detector design, Real-time inference, FPGA acceleration, Autonomous experimental systems, Uncertainty quantification, Statistical inference

Important note: All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.

Manuscripts can be submitted to this Research Topic via the main journal or any other participating journal.