Toward Near Zero-Parameter Prediction Using a Computational Model of Student Learning
Proceedings of the 12th International Conference on Educational Data Mining
2019
Abstract
Computational models of learning can be powerful tools to test educational technologies, automate the authoring of instructional software, and advance theories of learning. These mechanistic models of learning, which instantiate computational theories of the learning process, are capable of making predictions about learners’ performance in instructional technologies given only the technology itself, without fitting any parameters to existing learners’ data. While these so-called “zero-parameter” models have been successful in modeling student learning in intelligent tutoring systems, they still show systematic deviations from human learning performance. One deviation stems from the computational models’ lack of prior knowledge—all models start as a blank slate—leading to substantial differences in performance at the first practice opportunity. In this paper, we explore three different strategies for accounting for prior knowledge within computational models of learning and the effect of these strategies on the predictive accuracy of these models.
BibTeX
@inproceedings{weitekamp-edm-2019,
title = {Toward Near Zero-Parameter Prediction Using a Computational Model of Student Learning},
author = {Weitekamp, Daniel and Harpstead, Erik and Rachatasumrit, Napol and MacLellan, Christopher J. and Koedinger, Kenneth R.},
booktitle = {Proceedings of the 12th International Conference on Educational Data Mining},
pages = {456--461},
year = {2019},
}
