[1] LIU H, MA W, YANG Y, et al. Learning concept graphs from online educational data[J]. Journal of Artificial Intelligence Research, 2016, 55:1059-1090.
[2] NOVAK J D, GOWIN D B, JOHANSEN G T. The use of concept mapping and knowledge vee mapping with junior high school science students[J]. Science Education, 1983, 67(5):625-645.
[3] MIWA M, BANSAL M. End-to-end relation extraction using LSTMs on sequences and tree structures[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1:Long Papers), 2016:1105-1116.
[4] MINTZ M, BILLS S, SNOW R, et al. Distant supervision for relation extraction without labeled data[C]//Proceedings of the Joint Conference of the 47th Annual Meeting of the ACL and the 4th International Joint Conference on Natural Language Processing of the AFNLP:Volume 2, 2009:1003-1011.
[5] ZHOU P, SHI W, TIAN J, et al. Attention-based bidirectional long short-term memory networks for relation classification[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2:Short Papers), 2016:207-212.
[6] LIN Y, SHEN S, LIU Z, et al. Neural relation extraction with selective attention over instances[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1:Long Papers), 2016:2124-2133.
[7] LUONG M T, PHAM H, MANNING C D. Effective approaches to attention-based neural machine translation[C]//Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, 2015:1412-1421.
[8] NGUYEN T H, GRISHMAN R. Relation extraction:Perspective from convolutional neural networks[C]//Proceedings of the 1st Workshop on Vector Space Modeling for Natural Language Processing, 2015:39-48.
[9] ZHANG D, WANG D. Relation classification via recurrent neural network[J]. CoRR, 2015, abs/1508.01006.
[10] HASHIMOTO K, MIWA M, TSURUOKA Y, et al. Simple customization of recursive neural networks for semantic relation classification[C]//Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing, 2013:1372-1376.
[11] EBRAHIMI J, DOU D. Chain based RNN for relation classification[C]//Proceedings of the 2015 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies, 2015:1244-1249.
[12] SUNDERMEYER M, SCHLÜTER R, NEY H. LSTM neural networks for language modeling[C]//Proceedings of the Thirteenth Annual Conference of the International Speech Communication Association, 2012.
[13] XU Y, MOU L, LI G, et al. Classifying relations via long short term memory networks along shortest dependency paths[C]//Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, 2015:1785-1794.
[14] HOCHREITER S, SCHMIDHUBER J. Long short-term memory[J]. Neural Computation, 1997, 9(8):1735-1780.
[15] ZENG D, LIU K, LAI S, et al. Relation classification via convolutional deep neural network[C]//Proceedings of COLING 2014, the 25th International Conference on Computational Linguistics:Technical Papers, 2014:2335-2344.
[16] DAUPHIN Y N, FAN A, AULI M, et al. Language modeling with gated convolutional networks[C]//Proceedings of the 34th International Conference on Machine Learning (Volume 70), 2017:933-941.
[17] KINCHIN I M, HAY D B, ADAMS A. How a qualitative approach to concept map analysis can be used to aid learning by illustrating patterns of conceptual development[J]. Educational Research, 2000, 42(1):43-57.
[18] BAHDANAU D, CHO K, BENGIO Y. Neural machine translation by jointly learning to align and translate[C]//International Conference on Learning Representations, 2015.
[19] ZENG D, LIU K, CHEN Y, et al. Distant supervision for relation extraction via piecewise convolutional neural networks[C]//Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, 2015:1753-1762.
[20] MNIH V, HEESS N, GRAVES A. Recurrent models of visual attention[C]//Advances in Neural Information Processing Systems, 2014:2204-2212.