Data-driven Computational Education

A review of knowledge tracing

  • LIU Heng-yu,
  • ZHANG Tian-cheng,
  • WU Pei-wen,
  • YU Ge
  • Computer Science and Engineering, Northeastern University, Shenyang 110819, China

Received date: 2019-07-29

Online published: 2019-10-11

Abstract

In the field of education, scientifically and purposefully tracing the evolution of student knowledge is a topic of great significance. Given a student's historical learning trajectory and a model of the interaction between students and exercises, knowledge tracing can automatically track the state of a student's knowledge at each stage of learning. This provides a technical basis for predicting student performance and for achieving personalized guidance and adaptive learning. This paper first introduces the background of knowledge tracing and summarizes the pedagogical and data-mining theory it draws on. It then reviews the state of research on knowledge tracing based on probabilistic graphical models, matrix factorization, and deep learning, and classifies existing methods according to their characteristics. Finally, the paper analyzes and compares the latest knowledge tracing techniques and discusses directions for future research.
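The probability-graph family mentioned in the abstract is typified by classical Bayesian Knowledge Tracing (BKT), which maintains one mastery probability per knowledge component and updates it after every observed response using guess and slip probabilities. The sketch below is a minimal illustration of that update, not code from the paper; the parameter names (p_learn, p_guess, p_slip) and the numeric values in the example are assumptions chosen only to show the mechanics.

```python
# Illustrative sketch only: a minimal Bayesian Knowledge Tracing (BKT) update,
# the classical probability-graph approach to knowledge tracing.
# Parameter names and values here are hypothetical, not taken from the paper.

def bkt_update(p_known: float, correct: bool,
               p_learn: float, p_guess: float, p_slip: float) -> float:
    """Return P(skill mastered) after observing one answer."""
    if correct:
        # Posterior given a correct answer: mastered and did not slip, or unmastered and guessed.
        evidence = p_known * (1 - p_slip) + (1 - p_known) * p_guess
        posterior = p_known * (1 - p_slip) / evidence
    else:
        # Posterior given an incorrect answer: mastered but slipped, or unmastered and did not guess.
        evidence = p_known * p_slip + (1 - p_known) * (1 - p_guess)
        posterior = p_known * p_slip / evidence
    # Learning transition: an unmastered skill may become mastered after this practice step.
    return posterior + (1 - posterior) * p_learn


def predict_correct(p_known: float, p_guess: float, p_slip: float) -> float:
    """Predicted probability that the next answer on this skill is correct."""
    return p_known * (1 - p_slip) + (1 - p_known) * p_guess


if __name__ == "__main__":
    # Trace one student's responses on a single skill with hypothetical parameters.
    p_known = 0.2                                   # assumed initial mastery P(L0)
    params = dict(p_learn=0.15, p_guess=0.25, p_slip=0.10)
    for correct in [False, True, True, True]:
        pred = predict_correct(p_known, params["p_guess"], params["p_slip"])
        print(f"P(correct next) = {pred:.3f}")
        p_known = bkt_update(p_known, correct, **params)
    print(f"final P(mastered) = {p_known:.3f}")
```

By contrast, the deep-learning approaches surveyed in the paper (e.g., deep knowledge tracing) replace this single per-skill scalar with the hidden state of a recurrent network trained on the full response sequence.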

Cite this article

LIU Heng-yu, ZHANG Tian-cheng, WU Pei-wen, YU Ge. A review of knowledge tracing[J]. Journal of East China Normal University (Natural Science), 2019, 2019(5): 1-15. DOI: 10.3969/j.issn.1000-5641.2019.05.001
