Journal of East China Normal University (Natural Science) ›› 2025, Vol. 2025 ›› Issue (1): 111-123. DOI: 10.3969/j.issn.1000-5641.2025.01.009

• Computer Science •

Knowledge graph completion by integrating textual information and graph structure information

Houlong FAN, Ailian FANG, Xin LIN*

  1. School of Computer Science and Technology, East China Normal University, Shanghai 200062, China
  • Received: 2024-01-02 Online: 2025-01-25 Published: 2025-01-20
  • Contact: Xin LIN E-mail: xlin@cs.ecnu.edu.cn

Abstract:

We propose a graph attention model based on path-query information that integrates the textual and graph-structure information in knowledge graphs to improve knowledge graph completion. For the textual information, a dual encoder built on pre-trained language models separately produces embedding representations of entities and of path queries. An attention mechanism then aggregates the path-query embeddings to capture graph-structure information and update the entity embeddings. The model is trained with contrastive learning, and experiments on multiple knowledge graph datasets achieve good results in both transductive and inductive settings. These results demonstrate that combining pre-trained language models with graph neural networks effectively captures both textual and graph-structure information, thereby improving knowledge graph completion.
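The full method is not reproduced on this page, so the following PyTorch sketch only illustrates the two mechanisms the abstract names: attention-based aggregation of path-query embeddings to update an entity embedding, and an in-batch contrastive (InfoNCE) training objective. All names here (PathQueryAggregator, info_nce_loss) and design details (residual update, scaled dot-product scores, temperature 0.05) are assumptions for illustration, not the authors' implementation; the two pre-trained-LM encoders of the dual encoder are stood in for by random tensors.

import torch
import torch.nn as nn
import torch.nn.functional as F


class PathQueryAggregator(nn.Module):
    """Hypothetical sketch: the entity embedding attends over the embeddings
    of its path queries, and the aggregated message updates the entity
    representation residually."""

    def __init__(self, dim: int):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)

    def forward(self, entity: torch.Tensor, paths: torch.Tensor) -> torch.Tensor:
        # entity: (B, d) entity embeddings from the entity encoder
        # paths:  (B, P, d) embeddings of P path queries per entity
        q = self.q_proj(entity).unsqueeze(1)                      # (B, 1, d)
        scores = q @ self.k_proj(paths).transpose(1, 2)           # (B, 1, P)
        attn = torch.softmax(scores / paths.size(-1) ** 0.5, dim=-1)
        message = (attn @ self.v_proj(paths)).squeeze(1)          # (B, d)
        return entity + message                                   # residual update


def info_nce_loss(queries: torch.Tensor, positives: torch.Tensor,
                  temperature: float = 0.05) -> torch.Tensor:
    """In-batch contrastive (InfoNCE) loss: row i of `positives` is the
    positive for row i of `queries`; all other rows act as negatives."""
    q = F.normalize(queries, dim=-1)
    p = F.normalize(positives, dim=-1)
    logits = q @ p.t() / temperature                              # (B, B)
    labels = torch.arange(q.size(0), device=q.device)
    return F.cross_entropy(logits, labels)


# Toy usage: random tensors stand in for the dual pre-trained-LM encoders.
batch, num_paths, dim = 4, 3, 16
entity_emb = torch.randn(batch, dim)           # from the entity text encoder
path_emb = torch.randn(batch, num_paths, dim)  # from the path-query encoder
target_emb = torch.randn(batch, dim)           # embeddings of true tail entities

updated = PathQueryAggregator(dim)(entity_emb, path_emb)
loss = info_nce_loss(updated, target_emb)
loss.backward()

The in-batch formulation reuses the other examples in a batch as negatives, which is a common way to train dual encoders with contrastive learning; whether the paper uses this particular negative-sampling scheme is not stated in the abstract.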

Key words: knowledge graph completion, pre-trained language model, contrastive learning, graph neural networks
