Educational resource content review method based on knowledge graph and large language model collaboration

  • Jia LIU,
  • Xin SUN,
  • Yuqing ZHANG

  • 1. Center for Educational Technology and Resource Development, Ministry of Education P. R. China (National Center for Educational Technology, NCET), Beijing 100031, China
  • 2. School of Computer Science & Technology, Beijing Institute of Technology, Beijing 100081, China

Received date: 2024-07-11

Accepted date: 2024-05-30

Online published: 2024-09-23

Abstract

Automated content review of digital educational resources is in urgent demand in the era of educational informatization. In particular, in the applicability review of whether educational resources exceed the standard, the offending knowledge points easily exceed national curriculum standards yet are difficult to locate. In response to this demand, this study proposes a review method for educational resources based on the collaboration of an educational knowledge graph and a large language model. Specifically, the study first applied ontology concepts to design and construct a knowledge graph for primary and secondary school curricula. A knowledge localization method was then designed based on teaching-content generation, sorting, and pruning, exploiting the strengths of large language models in text generation and sorting tasks. Finally, by detecting conflicts between the core knowledge subgraph of the teaching content and the teaching path in the knowledge graph, teaching content that exceeds the national standard is recognized. Experimental results demonstrate that the proposed method effectively addresses the task of reviewing standard-exceeding knowledge in educational resource content, opening a new technological direction for educational applications based on knowledge graph and large language model collaboration.
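The final step of the pipeline described above, detecting conflicts between the core knowledge subgraph extracted from teaching content and the curriculum-standard teaching path, can be sketched as a set comparison. The function, data, and concept names below are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of the conflict-detection step: concepts localized in a
# piece of teaching content are compared against the knowledge points that
# the curriculum-standard teaching path permits for a given grade.
# All names and toy data here are hypothetical.

def find_exceeding_knowledge(content_concepts, teaching_path):
    """Return concepts in the content that fall outside the standard path."""
    allowed = set(teaching_path)
    return sorted(c for c in content_concepts if c not in allowed)

# Toy data: suppose the grade-7 teaching path stops at linear equations.
grade7_path = ["integers", "fractions", "linear equations"]
resource_concepts = ["fractions", "linear equations", "quadratic equations"]

# Any concept absent from the path is flagged as exceeding the standard.
print(find_exceeding_knowledge(resource_concepts, grade7_path))
```

In the paper's setting, `content_concepts` would come from the LLM-based localization step and `teaching_path` from the constructed knowledge graph; the real conflict check operates on subgraphs rather than flat concept lists.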

Cite this article

Jia LIU, Xin SUN, Yuqing ZHANG. Educational resource content review method based on knowledge graph and large language model collaboration[J]. Journal of East China Normal University (Natural Science), 2024, 2024(5): 57-69. DOI: 10.3969/j.issn.1000-5641.2024.05.006
