Abstract

Knowledge Tracing (KT) is a machine learning technique for monitoring a learner's knowledge in a given learning environment. This thesis introduces a unified approach to adaptive learning, built on KT and other relevant machine learning models, that can help create a personalized learning environment for a learner on an Intelligent Tutoring System (ITS). While contemporary ITS exemplars, such as Knewton and Adaptemy, strive to provide a seamless adaptive learning experience, most such platforms do so by employing individualized models, each tailored to a distinct task: some models predict the correctness of students' next responses, while others track progress in skill attainment by tracing interactions within the ITS. Developing these models, however, necessitates extensive experimentation to optimize input attributes for practical application, alongside the exploration of optimal model architectures. In addition, certain KT models, such as RA-BKT and RA-ANN, incorporate metacognitive inputs to enhance the prediction accuracy of student responses. Conversely, models like Deep Knowledge Tracing with Transformers (DKTT) treat the data as a sequence-to-sequence problem and prioritize attributes such as time to improve correctness prediction. The multitude of available models underscores the challenge of determining the optimal model for a specific use case, particularly given the input-attribute availability and requirements posed by an ITS. To address this issue, this thesis examines the development of a comprehensive framework named "BERT-Boosted Knowledge Tracing," which can handle a multitude of KT datasets and produce KT or other relevant adaptive learning models as output through its unified architecture.
This framework can be imagined as a tool that an ITS uses to readily train the KT models it requires, configured to train the models most relevant to that specific ITS. The focus of this study is to explore the power of BERT to create such models and to serve as the core of such a framework. All experiments are set up through an example framework that warrants the use of metacognitive inputs to train KT models. This makes it possible to assess the model's performance against existing comparable models (RA-BKT and RA-ANN) and to emulate a few of the scenarios where having such a framework at one's disposal could help create an ensemble of relevant models supporting an adaptive learning experience in an ITS. Creating such a framework and the resultant models poses the challenge of establishing robust evaluation criteria, given the scarcity of comparable works. To address this, several models are trained on three distinct datasets, with the aim of showcasing the framework's capabilities: handling different kinds of input attributes, training models on uniquely specified target attributes, creating models that can be trained on very little training data, and creating models that are domain adaptive. Through rigorous experimentation and evaluation, this thesis demonstrates the efficacy of the proposed framework in advancing KT model development. By providing a comprehensive solution that seamlessly integrates BERT-based contextual learning with adaptable training methodologies and the incorporation of metacognitive insights, this research contributes meaningfully to the field of Knowledge Tracing.

Library of Congress Subject Headings

Intelligent tutoring systems; Learning--Evaluation; Machine learning--Management

Publication Date

6-20-2024

Document Type

Thesis

Student Type

Graduate

Degree Name

Computer Science (MS)

Department, Program, or Center

Computer Science, Department of

College

Golisano College of Computing and Information Sciences

Advisor

Zachary Butler

Advisor/Committee Member

Carlos R. Rivero

Advisor/Committee Member

Xumin Liu

Campus

RIT – Main Campus

Plan Codes

COMPSCI-MS
