Journal of East China Normal University (Natural Science) ›› 2024, Vol. 2024 ›› Issue (5): 20-31. DOI: 10.3969/j.issn.1000-5641.2024.05.003

• Learning Assessment and Recommendation •

SA-MGKT: Multi-graph knowledge tracing method based on self-attention

Chang WANG 1,2, Dan MA 1,2,*, Huarong XU 1,2, Panfeng CHEN 1,2, Mei CHEN 1,2, Hui LI 1,2

  1. State Key Laboratory of Public Big Data, Guiyang 550025, China
  2. School of Computer Science and Technology, Guizhou University, Guiyang 550025, China
  • Received: 2024-07-12 Online: 2024-09-25 Published: 2024-09-23
  • Contact: Dan MA E-mail: dma@gzu.edu.cn

Abstract:

This study proposes a multi-graph knowledge tracing method integrated with a self-attention mechanism (SA-MGKT). The aim is to model students' knowledge mastery from their historical performance on problem-solving exercises and to predict their future learning performance. Firstly, a student-exercise heterogeneous graph is constructed to represent the high-order relationships between students and exercises; graph contrastive learning is employed to capture students' answer preferences, and a three-layer LightGCN is used for graph representation learning. Secondly, information from concept-association hypergraphs and directed transition graphs is introduced, and node embeddings are obtained through hypergraph convolutional networks and directed graph convolutional networks. Finally, a self-attention mechanism fuses the internal information within the exercise sequence with the latent knowledge embedded in the representations learned from the multiple graphs, substantially improving the accuracy of the knowledge tracing model. Experiments on three benchmark datasets show improvements of 3.51%, 17.91%, and 1.47%, respectively, in the evaluation metrics over the baseline models. These findings validate the effectiveness of integrating multi-graph information and the self-attention mechanism for knowledge tracing.
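To make the fusion step concrete, below is a minimal PyTorch sketch of how per-exercise embeddings from the three graph views might be combined with the interaction sequence through masked self-attention. All module names, dimensions, the additive fusion scheme, and the prediction head are illustrative assumptions for this sketch, not the paper's exact SA-MGKT architecture.

```python
import torch
import torch.nn as nn

class MultiGraphAttentionFusion(nn.Module):
    """Hypothetical fusion of sequence and multi-graph embeddings via self-attention.

    Assumes per-exercise embeddings have already been produced by three graph
    encoders (LightGCN on the student-exercise heterogeneous graph, a hypergraph
    convolutional network, and a directed graph convolutional network).
    """

    def __init__(self, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        # Project the concatenation of the three graph views into the shared
        # model dimension used by the sequence embeddings.
        self.graph_proj = nn.Linear(3 * d_model, d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.predict = nn.Sequential(nn.Linear(d_model, 1), nn.Sigmoid())

    def forward(self, seq_emb, hetero_emb, hyper_emb, trans_emb):
        # seq_emb:  (batch, seq_len, d) embeddings of the interaction sequence
        # *_emb:    (batch, seq_len, d) per-exercise embeddings looked up from
        #           each of the three graph encoders
        graph_emb = self.graph_proj(
            torch.cat([hetero_emb, hyper_emb, trans_emb], dim=-1)
        )
        x = seq_emb + graph_emb  # inject multi-graph knowledge into the sequence

        # Causal mask: each step may attend only to past interactions, so the
        # prediction for exercise t never sees the student's future answers.
        L = x.size(1)
        mask = torch.triu(
            torch.ones(L, L, dtype=torch.bool, device=x.device), diagonal=1
        )
        h, _ = self.attn(x, x, x, attn_mask=mask)
        return self.predict(h).squeeze(-1)  # (batch, seq_len) correctness probability
```

The additive fusion (`seq_emb + graph_emb`) is one simple choice; concatenation followed by a linear layer, or cross-attention from the sequence onto the graph embeddings, would be equally plausible readings of "fusing" the two information sources.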

Key words: knowledge tracing, graph contrastive learning, self-attention
