AUTHOR=Li Jinxing, Bortnik Jacob, Wang Qiushuo, Wu Yingnian, Lizarraga Andrew, Angel Mirana, Wang Beibei, Wen Qianzhuang, Jiang Jeffrey
TITLE=Modeling ring current proton distribution using MLP, CNN, LSTM, and transformer networks
JOURNAL=Frontiers in Astronomy and Space Sciences
VOLUME=12
YEAR=2025
URL=https://www.frontiersin.org/journals/astronomy-and-space-sciences/articles/10.3389/fspas.2025.1629056
DOI=10.3389/fspas.2025.1629056
ISSN=2296-987X
ABSTRACT=This study develops ring current proton flux models using four neural network architectures: a multilayer perceptron (MLP), a convolutional neural network (CNN), a long short-term memory (LSTM) network, and a Transformer network. All models take time sequences of geomagnetic indices as inputs. Experimental results demonstrate that the LSTM and Transformer models consistently outperform the MLP and CNN models, achieving lower mean squared errors on the test set, possibly due to their intrinsic capability to process temporal sequential input data. Unlike the MLP and CNN models, which require a fixed input history length even though proton lifetime varies with altitude, the LSTM and Transformer models accommodate variable-length sequences during both training and inference. Our findings indicate that the LSTM and Transformer architectures are well suited for modeling ring current proton behavior when GPU resources are available; the Transformer slightly underperforms the LSTM model due to the restriction on the total number of attention heads. For resource-constrained environments, however, the MLP model offers a practical alternative, with faster training and inference times while maintaining competitive accuracy.