
ORIGINAL RESEARCH article

Front. Artif. Intell.

Sec. AI in Finance

Volume 8 - 2025 | doi: 10.3389/frai.2025.1616485

LiT: Limit Order Book Transformer

Provisionally accepted
Yue Xiao1*, Carmine Ventre1, Yuhan Wang2, Haochen Li1, Yuxi Huan3, Buhong Liu4
  • 1King's College London, London, United Kingdom
  • 2Birkbeck, University of London, London, United Kingdom
  • 3University College London, London, United Kingdom
  • 4SOAS University of London, London, United Kingdom

The final, formatted version of the article will be published soon.

While the transformer architecture has demonstrated strong success in natural language processing and computer vision, its application to limit order book (LOB) forecasting, particularly in capturing spatial and temporal dependencies, remains limited. In this work, we introduce the Limit Order Book Transformer (LiT), a novel deep learning architecture for forecasting short-term market movements from high-frequency LOB data. Unlike previous approaches that rely on convolutional layers, LiT leverages structured patches and transformer-based self-attention to model spatial and temporal features in market microstructure dynamics. We evaluate LiT on multiple LOB datasets across different prediction horizons; it consistently outperforms traditional machine learning methods and state-of-the-art deep learning baselines. Furthermore, we show that LiT maintains robust performance under distributional shift via fine-tuning, making it a practical solution for fast-paced and dynamic financial environments.
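The abstract does not specify the exact patching scheme, so as a rough illustration of the general idea (structured patches fed to self-attention, in the spirit of vision transformers), the sketch below shows one plausible way to tokenize an LOB time series into temporal patches. The function name, shapes, and patch length are hypothetical, not taken from the paper.

```python
import numpy as np

def lob_to_patches(lob, patch_len):
    """Split an LOB time series into non-overlapping temporal patches.

    lob: array of shape (T, F) -- T snapshots, F features per snapshot
         (e.g. price and size at each of the top bid/ask levels).
    patch_len: number of consecutive snapshots grouped into one token.
    Returns an array of shape (T // patch_len, patch_len * F): each row
    is a flattened patch, ready for a linear embedding layer before the
    transformer encoder.
    """
    T, F = lob.shape
    n = T // patch_len                      # drop any incomplete tail patch
    return lob[: n * patch_len].reshape(n, patch_len * F)

# Toy example: 100 snapshots of a 10-level book (price + size, both sides)
lob = np.random.default_rng(0).standard_normal((100, 40))
tokens = lob_to_patches(lob, patch_len=10)
print(tokens.shape)  # (10, 400)
```

Each resulting token would then be linearly projected and passed through standard self-attention layers; the actual LiT architecture may patch along levels as well as time.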

Keywords: transformers, deep learning, limit order book, high-frequency trading, market microstructure, representation learning, transfer learning

Received: 22 Apr 2025; Accepted: 28 Aug 2025.

Copyright: © 2025 Xiao, Ventre, Wang, Li, Huan and Liu. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Yue Xiao, King's College London, London, United Kingdom

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.