AUTHOR=Chang Ande, Ji Yuting, Bie Yiming
TITLE=Transformer-based short-term traffic forecasting model considering traffic spatiotemporal correlation
JOURNAL=Frontiers in Neurorobotics
VOLUME=19
YEAR=2025
URL=https://www.frontiersin.org/journals/neurorobotics/articles/10.3389/fnbot.2025.1527908
DOI=10.3389/fnbot.2025.1527908
ISSN=1662-5218
ABSTRACT=Traffic forecasting is crucial for a variety of applications, including route optimization, signal management, and travel time estimation. However, many existing prediction models struggle to accurately capture the spatiotemporal patterns in traffic data because of its inherent nonlinearity, high dimensionality, and complex dependencies. To address these challenges, a short-term traffic forecasting model, Trafficformer, is proposed, built on the Transformer framework. The model first uses a multilayer perceptron to extract features from historical traffic data and then enhances spatial interactions through Transformer-based encoding. A spatial mask derived from the road-network topology filters out noise and irrelevant interactions, improving prediction accuracy. Finally, traffic speed is predicted by another multilayer perceptron. In the experiments, Trafficformer is evaluated on the Seattle Loop Detector dataset and compared with six baseline methods, using Mean Absolute Error, Mean Absolute Percentage Error, and Root Mean Square Error as metrics. The results show that Trafficformer not only achieves higher prediction accuracy but also effectively identifies key road sections, indicating strong potential for intelligent traffic control optimization and refined traffic resource allocation.
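The abstract describes a three-stage pipeline: an MLP that extracts features from historical traffic data, a Transformer encoder whose attention is restricted by a spatial mask built from the road-network topology, and an MLP head that outputs predicted speeds. The PyTorch sketch below illustrates one plausible wiring of that pipeline under stated assumptions; the layer sizes, the mask construction, and the class name TrafficformerSketch are illustrative choices, not the paper's actual implementation.

```python
# Minimal sketch of the pipeline described in the abstract:
# MLP feature extraction -> Transformer encoder with a spatial mask derived
# from road-network adjacency -> MLP speed-prediction head.
# All dimensions and module names are assumptions for illustration only.
import torch
import torch.nn as nn


class TrafficformerSketch(nn.Module):
    def __init__(self, num_nodes: int, hist_steps: int, pred_steps: int,
                 d_model: int = 64, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        # MLP embedding each node's historical speed window into d_model features
        self.feature_mlp = nn.Sequential(
            nn.Linear(hist_steps, d_model), nn.ReLU(), nn.Linear(d_model, d_model)
        )
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # MLP head mapping each encoded node representation to future speeds
        self.pred_mlp = nn.Sequential(
            nn.Linear(d_model, d_model), nn.ReLU(), nn.Linear(d_model, pred_steps)
        )

    def forward(self, x: torch.Tensor, adjacency: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_nodes, hist_steps); adjacency: (num_nodes, num_nodes), 0/1
        h = self.feature_mlp(x)                 # (batch, num_nodes, d_model)
        # Spatial mask: True blocks attention between node pairs that are not
        # connected in the road network, suppressing irrelevant interactions.
        spatial_mask = adjacency == 0
        h = self.encoder(h, mask=spatial_mask)  # node-to-node attention over the graph
        return self.pred_mlp(h)                 # (batch, num_nodes, pred_steps)


if __name__ == "__main__":
    num_nodes, hist_steps, pred_steps = 8, 12, 3
    model = TrafficformerSketch(num_nodes, hist_steps, pred_steps)
    speeds = torch.randn(4, num_nodes, hist_steps)  # toy historical speed windows
    adj = torch.eye(num_nodes)                      # toy adjacency (self-loops only)
    print(model(speeds, adj).shape)                 # torch.Size([4, 8, 3])
```

In this sketch each detector (node) is treated as one token, so the boolean attention mask plays the role the abstract assigns to the spatial mask: attention weights are computed only between road sections that are topologically connected.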