Journal of East China Normal University (Natural Science) ›› 2025, Vol. 2025 ›› Issue (1): 111-123. doi: 10.3969/j.issn.1000-5641.2025.01.009

• Computer Science •


Knowledge graph completion by integrating textual information and graph structure information

Houlong FAN, Ailian FANG, Xin LIN*()   

  1. School of Computer Science and Technology, East China Normal University, Shanghai 200062, China
  • Received:2024-01-02 Online:2025-01-25 Published:2025-01-20
  • Contact: Xin LIN E-mail:xlin@cs.ecnu.edu.cn


Abstract:

Based on path query information, we propose a graph attention model that effectively integrates textual and graph structure information in knowledge graphs, thereby improving knowledge graph completion. For textual information, a dual encoder based on pre-trained language models is used to obtain embedding representations of entities and of path query information separately. An attention mechanism then aggregates the path query information to capture graph structural information and update the entity embeddings. The model is trained with contrastive learning, and experiments on multiple knowledge graph datasets show good results in both transductive and inductive settings. These results demonstrate that combining the strengths of pre-trained language models and graph neural networks effectively captures both textual and graph structural information, thereby improving knowledge graph completion.
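The pipeline described in the abstract — entity and path-query embeddings from a dual encoder, attention-based aggregation of path queries to update entity representations, and contrastive training — can be sketched minimally in NumPy. This is an illustrative assumption, not the paper's implementation: the actual dual encoder is a pre-trained language model, stubbed here with random vectors, and the function names, dimensions, residual update, and temperature are all hypothetical choices.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def aggregate_path_queries(entity_emb, path_query_embs):
    """Update an entity embedding by attention-weighted aggregation
    of its path-query embeddings (illustrative residual update)."""
    scores = path_query_embs @ entity_emb            # (num_paths,) attention scores
    weights = softmax(scores)                        # attention weights, sum to 1
    context = weights @ path_query_embs              # weighted sum of path queries
    return entity_emb + context                      # residual update of the entity

def info_nce_loss(query, candidates, pos_index=0, temperature=0.05):
    """Contrastive (InfoNCE-style) loss: the candidate at pos_index
    is the positive, the rest are in-batch negatives."""
    q = query / np.linalg.norm(query)
    c = candidates / np.linalg.norm(candidates, axis=1, keepdims=True)
    logits = (c @ q) / temperature                   # scaled cosine similarities
    return -np.log(softmax(logits)[pos_index])

rng = np.random.default_rng(0)
d = 8
entity = rng.normal(size=d)        # stand-in for a PLM entity embedding
paths = rng.normal(size=(4, d))    # stand-ins for path-query embeddings
updated = aggregate_path_queries(entity, paths)
candidates = np.vstack([updated, rng.normal(size=(5, d))])
loss = info_nce_loss(updated, candidates, pos_index=0)
print(updated.shape, loss > 0)
```

In this sketch the attention weights play the role of deciding which neighboring path queries contribute most to the updated entity representation, and the contrastive loss pulls an entity toward its matching candidate while pushing it away from negatives.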

Key words: knowledge graph completion, pre-trained language model, contrastive learning, graph neural networks
