The interplay between graph analytics and large language models (LLMs) represents a promising frontier for advancing knowledge representation and processing. Graphs are fundamental to representing relationships in data, making them critical for applications in recommendation systems, fraud detection, and social network analysis. Simultaneously, LLMs have demonstrated unprecedented success in natural language understanding and generation, transforming industries reliant on textual data. This convergence of graph-based methods and LLMs is beginning to redefine how we process and analyze vast amounts of interconnected, multimodal data. Recent advances in High-Performance Computing (HPC) have enabled this integration, but significant challenges remain in scaling models and optimizing workflows for heterogeneous, dynamic datasets.
This Research Topic explores the integration of graph analytics and LLMs to address emerging challenges in knowledge representation and processing. Traditional graph analytics techniques capture structural relationships effectively but often lack the semantic depth needed for deep understanding and reasoning. Conversely, LLMs excel at language-based tasks but struggle with the explicit relational reasoning that graph-based methods handle naturally. We seek solutions that bridge this gap, such as combining graph neural networks (GNNs) with LLMs to enhance reasoning over structured and unstructured data. Recent advances, such as pretraining LLMs on graph-structured data and applying them to graph-related tasks like node classification and link prediction, show tremendous potential. Furthermore, leveraging HPC techniques, including GPU acceleration and distributed systems, can help scale these hybrid approaches for real-world applications such as conversational AI, knowledge graph generation, and NLP-based recommendation systems.
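As one illustration of the kind of bridge described above, a graph node's local structure can be verbalized into a text prompt so that a language model can reason about a graph task such as node classification. The sketch below is a minimal, hypothetical example: the toy graph, the node descriptions, and the prompt format are all illustrative assumptions, not a prescribed method.

```python
# Minimal sketch: serializing a node's 1-hop neighborhood into a text
# prompt that an LLM could consume for node classification.
# The graph, features, and prompt wording are illustrative assumptions.

def neighborhood_prompt(graph, features, node):
    """Build a textual description of `node` and its 1-hop neighbors."""
    lines = [f"Node {node} is described as: {features[node]}."]
    for n in sorted(graph.get(node, [])):
        lines.append(f"It is connected to node {n}, described as: {features[n]}.")
    lines.append(f"Question: what is the most likely category of node {node}?")
    return "\n".join(lines)

# Toy citation-style graph (adjacency lists) with short text features.
graph = {"A": ["B", "C"], "B": ["A"], "C": ["A"]}
features = {
    "A": "a paper on graph neural networks",
    "B": "a survey of large language models",
    "C": "a benchmark for link prediction",
}

print(neighborhood_prompt(graph, features, "A"))
```

In practice, such verbalized neighborhoods would be fed to an LLM (or used as pretraining data), while a GNN handles the explicit structural computation; scaling either component to large graphs is where the HPC techniques discussed above become essential.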
We invite contributions that advance the integration of graph analytics and LLMs, focusing on interdisciplinary approaches that leverage HPC to tackle real-world challenges. Topics of interest include:
• Innovations in integrating graph neural networks with LLMs.
• Pretraining LLMs on graph-structured data and its applications.
• Using LLMs for graph tasks such as link prediction, node classification, and graph generation.
• Applications of hybrid graph-LLM models in natural language understanding, recommendation systems, and knowledge graphs.
• HPC-enabled optimizations for hybrid graph and LLM models, including GPU and distributed-system implementations.
• Addressing challenges in dynamic, multimodal, and large-scale graph analytics using LLMs.
We welcome original research articles, review papers, and application case studies demonstrating the potential of combining graph analytics with LLMs. Contributions that highlight scalable, efficient, and practical solutions to these challenges are particularly encouraged.
Article types and fees
Articles that are accepted for publication by our external editors following rigorous peer review incur a publishing fee charged to authors, institutions, or funders.
This Research Topic accepts the following article types, unless otherwise specified in the Research Topic description:
Brief Research Report
Community Case Study
Conceptual Analysis
Data Report
Editorial
FAIR² Data
General Commentary
Hypothesis and Theory
Methods
Mini Review
Opinion
Original Research
Perspective
Policy and Practice Reviews
Review
Study Protocol
Systematic Review
Technology and Code
Keywords: HPC, Graph Processing, Large Language Models, Graph Neural Networks, NLP Applications
Important note: All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.