REVIEW article

Front. Comput. Sci.

Sec. Networks and Communications

Volume 7 - 2025 | doi: 10.3389/fcomp.2025.1538277

Intelligent Data Analysis in Edge Computing with Large Language Models: Applications, Challenges, and Future Directions

Provisionally accepted
Xuanzheng Wang*, Xu Zhipeng, Xingfei Sui
  • CNOOC Safety & Technology Services Co., Ltd., Tianjin, China

The final, formatted version of the article will be published soon.

Edge computing has emerged as a vital paradigm for processing data near its source, significantly reducing latency and improving data privacy. Simultaneously, large language models (LLMs) such as GPT-4 and BERT have demonstrated impressive capabilities in data analysis, natural language processing, and decision-making. This survey explores the intersection of these two domains, focusing on the adaptation and optimization of LLMs for data analysis tasks in edge computing environments. We examine the challenges faced by resource-constrained edge devices, including limited computational power, tight energy budgets, and unreliable network connectivity. We then discuss how recent advances in model compression, distributed learning, and edge-friendly architectures address these challenges. Through a comprehensive review of current research, we analyze the applications, challenges, and future directions of deploying LLMs in edge computing, with the aim of enabling intelligent data analysis across industries such as healthcare, smart cities, and the Internet of Things.
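To make the abstract's reference to model compression more concrete, the sketch below applies post-training dynamic quantization to a toy transformer encoder using PyTorch, one common way to shrink an LLM component for a resource-constrained edge device. The architecture, layer sizes, and input shape are illustrative assumptions, not the experimental setup of any work surveyed here.

```python
# Minimal sketch (assumed setup): post-training dynamic quantization of a
# small transformer encoder with PyTorch, illustrating one model-compression
# technique relevant to LLM deployment on edge devices.
import torch
import torch.nn as nn

# A toy transformer encoder standing in for one component of a larger LLM.
model = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=256, nhead=4, batch_first=True),
    num_layers=2,
)
model.eval()

# Replace eligible nn.Linear layers with dynamically quantized versions:
# weights are stored as int8 and dequantized on the fly at inference time,
# reducing the memory footprint of those layers roughly fourfold.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Run a forward pass on dummy data shaped (batch, sequence, embedding).
x = torch.randn(1, 16, 256)
with torch.no_grad():
    y = quantized(x)

print("output shape after quantized inference:", tuple(y.shape))
```

In practice, such quantization is typically combined with the other directions the survey covers, such as distillation into smaller edge-friendly architectures or distributed/federated training, depending on the device's compute and energy budget.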

Keywords: edge computing, large language models, intelligent data analysis, resource-constrained devices, edge-friendly architectures

Received: 02 Dec 2024; Accepted: 17 Apr 2025.

Copyright: © 2025 Wang, Zhipeng and Sui. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Xuanzheng Wang, CNOOC Safety & Technology Services Co., Ltd., Tianjin, China

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.