Citation: Chuan Shi, Jinyu Yang. Graph Learning in the Era of Foundation Models[J]. Computing Magazine of the CCF, 2026, 2(1): 20−25. DOI: 10.11991/cccf.202601004

Graph Learning in the Era of Foundation Models

Abstract: Graph-structured data arise widely in social networks, transportation systems, and biological domains. Graph neural networks (GNNs) leverage the message-passing mechanism to aggregate neighborhood information and achieve strong performance on node classification, link prediction, and graph classification tasks. However, with growing data scale and increasingly complex application scenarios, GNNs face inherent limitations in expressiveness and generalization. Recent progress in foundation models, particularly large language models (LLMs), has revealed remarkable capabilities in generalization and reasoning, inspiring new paradigms for graph machine learning. Building on this inspiration, the concept of graph foundation models (GFMs) has been proposed to develop general-purpose models that are pretrained on large-scale graph corpora and adaptable to diverse downstream tasks. This article systematically reviews recent advances in GFMs, categorizes existing approaches by their reliance on GNNs and LLMs, and summarizes our practical experience in related developments. Finally, we outline key challenges and promising research directions to guide future work.
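The message-passing mechanism mentioned in the abstract can be made concrete with a minimal sketch. The example below uses plain PyTorch; the layer name, the mean-aggregation choice, and the GraphSAGE-style concatenate-then-linear update are illustrative assumptions, not the specific models surveyed in the article.

```python
# Minimal message-passing layer (assumption: mean aggregation + linear update;
# a sketch in the spirit of GraphSAGE-style GNNs, not any model from the article).
import torch
import torch.nn as nn


class SimpleMessagePassing(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        # Combines a node's own features with its aggregated neighborhood.
        self.update = nn.Linear(2 * in_dim, out_dim)

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # x:          [num_nodes, in_dim] node feature matrix
        # edge_index: [2, num_edges] (source, target) pairs of a directed graph
        src, dst = edge_index
        # 1) Message: each edge carries the source node's feature vector.
        messages = x[src]
        # 2) Aggregate: mean of incoming messages per target node.
        agg = torch.zeros_like(x).index_add_(0, dst, messages)
        deg = torch.zeros(x.size(0), device=x.device).index_add_(
            0, dst, torch.ones(dst.size(0), device=x.device)
        ).clamp(min=1).unsqueeze(-1)
        agg = agg / deg
        # 3) Update: combine self representation with the aggregated neighborhood.
        return torch.relu(self.update(torch.cat([x, agg], dim=-1)))


# Usage: 4 nodes with 8-dim features, a small directed edge list, 16-dim output.
x = torch.randn(4, 8)
edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 0]])
layer = SimpleMessagePassing(8, 16)
out = layer(x, edge_index)  # shape: [4, 16]
```

Stacking several such layers lets each node's representation incorporate information from progressively larger neighborhoods, which is the basis for the node classification, link prediction, and graph classification tasks the abstract refers to.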
