Data-Driven Computational Pedagogy

Research on knowledge point relationship extraction for elementary mathematics

  • YANG Dong-ming,
  • YANG Da-wei,
  • GU Hang,
  • HONG Dao-cheng,
  • GAO Ming,
  • WANG Ye
  • School of Data Science and Engineering, East China Normal University, Shanghai 200062, China
YANG Dong-ming, male, master's degree candidate; his research focuses on big data systems for new hardware. E-mail: y1271752959m2@yahoo.com.

Received date: 2019-07-29

  Online published: 2019-10-11

Funding

National Key R&D Program of China (2016YFB1000905); National Natural Science Foundation of China (U1811264, 61672234, 61502236, 61877018, 61977025); Shanghai Agricultural Science and Technology Promotion Project (T20170303)

Cite this article as

YANG Dong-ming, YANG Da-wei, GU Hang, HONG Dao-cheng, GAO Ming, WANG Ye. Research on knowledge point relationship extraction for elementary mathematics [J]. Journal of East China Normal University (Natural Science), 2019, 2019(5): 53-65. DOI: 10.3969/j.issn.1000-5641.2019.05.004

Abstract

With the development of Internet technology, online education has changed the way students learn. However, given the lack of a complete knowledge system, online education suffers from a low degree of intelligence and the "knowledge trek" (information disorientation) problem. Building knowledge systems has therefore become a core technology of online education platforms, and relation extraction between knowledge points is one of the main tasks of knowledge system construction. At present, the more efficient relation extraction algorithms are mostly supervised; however, such methods are challenged by low text quality, scarce corpora, the difficulty of obtaining labeled data, the low efficiency of feature engineering, and the difficulty of extracting directed relations. This paper therefore studies relation extraction between knowledge points based on an encyclopedic corpus and distant supervision. An attention mechanism based on relation representations is proposed, which can extract directed relation information between knowledge points. Combining the advantages of GCN and LSTM, the GCLSTM model is proposed, which better extracts multi-point information in sentences. Based on the Transformer architecture and the relation-representation attention mechanism, the BTRE model, suited to directed relation extraction, is proposed, which reduces model complexity. A knowledge point relation extraction system is then designed and implemented. The performance and efficiency of the models are verified through three sets of comparative experiments.
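The abstract only names the proposed models (the relation-representation attention mechanism, GCLSTM, and BTRE) without architectural detail, so the following is a minimal, illustrative PyTorch sketch of the GCLSTM idea as described here: an LSTM over the token sequence combined with a graph convolution over the sentence's dependency structure. The class name, layer sizes, pooling choice, and the normalized-adjacency input are assumptions made for illustration, not the authors' implementation.

```python
# Illustrative sketch only (not the authors' code): a minimal "GCN + LSTM" sentence
# encoder in the spirit of the GCLSTM idea summarized in the abstract.
# Assumptions: token embeddings are precomputed, and the dependency structure is
# supplied as a row-normalized adjacency matrix.
import torch
import torch.nn as nn


class GCLSTMEncoderSketch(nn.Module):
    """Contextualize tokens with a BiLSTM, then aggregate over the dependency graph with one GCN layer."""

    def __init__(self, emb_dim: int, hidden_dim: int, num_relations: int):
        super().__init__()
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.gcn_weight = nn.Linear(2 * hidden_dim, hidden_dim)   # single graph-convolution layer
        self.classifier = nn.Linear(hidden_dim, num_relations)    # scores over (directed) relation labels

    def forward(self, token_embs: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # token_embs: (batch, seq_len, emb_dim); adj: (batch, seq_len, seq_len), row-normalized
        seq_states, _ = self.lstm(token_embs)                                     # sequential context
        graph_states = torch.relu(self.gcn_weight(torch.bmm(adj, seq_states)))    # neighbor aggregation
        sentence_repr = graph_states.max(dim=1).values                            # pool multi-point information
        return self.classifier(sentence_repr)


# Toy usage with random tensors, just to show the expected shapes.
if __name__ == "__main__":
    batch, seq_len, emb_dim = 2, 10, 50
    model = GCLSTMEncoderSketch(emb_dim=emb_dim, hidden_dim=64, num_relations=4)
    embs = torch.randn(batch, seq_len, emb_dim)
    adj = torch.softmax(torch.randn(batch, seq_len, seq_len), dim=-1)  # stand-in for a normalized adjacency
    print(model(embs, adj).shape)  # torch.Size([2, 4])
```

Intuitively, the LSTM captures sequential context while the graph convolution aggregates information from syntactically related tokens, which is one plausible reading of how GCLSTM "better extracts multi-point information in sentences."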
