ANALYZING AND MODELING STUDENTS’ BEHAVIORAL DYNAMICS IN CONFIDENCE-BASED ASSESSMENT

MAQSOOD, RABIA
2020

Abstract

Confidence-based assessment is a two-dimensional assessment paradigm that considers both the correctness of a student’s answer and the confidence (or expectancy) level the student has in that answer, in order to ascertain his/her actual knowledge. Several researchers have discussed the advantages of this model over the traditional one-dimensional assessment approach, which takes the number of correctly answered questions as the sole parameter for calculating a student’s test score. Additionally, educational psychologists and theorists have found that confidence-based assessment has a positive impact on students’ academic performance, knowledge retention, and the metacognitive abilities of self-regulation and engagement exhibited during a learning process. However, to the best of our knowledge, these findings have not been exploited by the educational data mining community, which analyzes students’ logged data to investigate their performance and behavioral characteristics with the aim of enhancing their performance outcomes and/or learning experiences. Engagement reflects a student’s active participation in an ongoing task or process, and becomes even more important when students interact with a computer-based learning or assessment system. There is some evidence that students’ online engagement (estimated from their behaviors while interacting with a learning/assessment environment) is also positively correlated with good performance scores. However, no data mining method to date has measured students’ engagement behaviors during confidence-based assessment. This Ph.D. research work aimed to identify, analyze, model, and predict students’ dynamic behaviors triggered by their progression through a computer-based assessment system offering confidence-driven questions. The data were collected from two experimental studies conducted with undergraduate students who solved a number of problems under confidence-based assessment. In this thesis, we first addressed the challenge of identifying parameters representing students’ problem-solving behaviors that are positively correlated with confidence-based assessment. Next, we developed a novel scheme to classify students’ problem-solving activities into engaged or disengaged behaviors using the three previously identified parameters, namely students’ response correctness, confidence level, and feedback seeking/no-seeking behavior. Our next challenge was to exploit the students’ interactions recorded at the micro-level (i.e., event by event) by the computer-based assessment tools, to estimate their engagement behaviors during the assessment. We also observed that a traditional non-mixture, first-order Markov chain is inadequate to capture students’ evolving behaviors as revealed by their interactions with a computer-based learning/assessment system. We therefore investigated mixture Markov models to map students’ trails of performed activities. However, the quality of the resulting Markov chains depends critically on the initialization of the fitting algorithm, which is usually performed randomly. We proposed a new approach, called K-EM, for initializing the Expectation-Maximization algorithm for multivariate categorical data. Our method achieved better prediction accuracy and a faster convergence rate than two pre-existing algorithms when applied to two real datasets. This doctoral research work contributes to advancing the state of the art in educational research (theoretical aspect) and in educational data mining (empirical aspect).
The outcomes of this work pave the way towards a framework for an adaptive confidence-based assessment system, contributing to one of the central components of Adaptive Learning, namely personalized student models. Such an adaptive system can exploit the data generated in a confidence-based assessment system to model students’ behavioral profiles and to provide personalized feedback that improves students’ confidence accuracy and knowledge by taking their behavioral dynamics into account.
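As a concrete illustration of the engagement-labelling scheme summarized in the abstract, the sketch below shows how a single problem-solving activity, described by the three parameters named there (response correctness, confidence level, and feedback seeking/no-seeking), could be mapped to an engaged or disengaged label. The rule table used here is an assumption made purely for illustration; the actual classification scheme is the one defined in the thesis.

```python
from dataclasses import dataclass

@dataclass
class Activity:
    """One problem-solving activity logged by the assessment system."""
    correct: bool          # was the response correct?
    confident: bool        # did the student report high confidence?
    sought_feedback: bool  # did the student open the feedback?

def label_engagement(a: Activity) -> str:
    """Illustrative rule table (an assumption, not the thesis' exact scheme):
    an activity counts as 'engaged' when the behavior suggests the student is
    monitoring his/her own knowledge, e.g. checking feedback after a confident
    wrong answer or after a hesitant correct one."""
    if a.correct and a.confident:
        return "engaged"                                          # knows, and knows it
    if not a.correct and a.confident:
        return "engaged" if a.sought_feedback else "disengaged"   # misconception: feedback matters
    if a.correct and not a.confident:
        return "engaged" if a.sought_feedback else "disengaged"   # doubt or lucky guess
    return "disengaged"                                           # wrong and not confident

if __name__ == "__main__":
    print(label_engagement(Activity(correct=False, confident=True, sought_feedback=True)))
```

Labelling each activity in this way turns a student’s interaction log into a categorical sequence, which is the kind of input the sequence models discussed next operate on.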
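The abstract also refers to mixture Markov models fitted with Expectation-Maximization, whose usual random initialization the thesis replaces with a data-driven procedure called K-EM. The sketch below, in Python, only illustrates the general idea under assumed details: each activity sequence is summarized by its normalized transition counts, K-means clusters these summaries, and per-cluster mixing weights, start probabilities, and transition matrices then seed EM. It is a minimal sketch, not the author’s actual K-EM algorithm.

```python
import numpy as np
from sklearn.cluster import KMeans

def transition_features(seq, n_states):
    """Row-normalized transition-count matrix of one sequence, flattened."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    return (counts / np.where(rows == 0, 1, rows)).ravel()

def kem_initialise(sequences, n_states, k, seed=0):
    """K-EM-style start: cluster sequences by transition profile, then estimate
    one initial Markov chain (start probs + transition matrix) and one mixing
    weight per cluster. The Laplace smoothing constant is an assumption."""
    X = np.array([transition_features(s, n_states) for s in sequences])
    labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(X)
    weights, starts, trans = [], [], []
    for c in range(k):
        member = [s for s, lab in zip(sequences, labels) if lab == c]
        weights.append(len(member) / len(sequences))
        pi = np.ones(n_states)                       # smoothed start counts
        A = np.ones((n_states, n_states))            # smoothed transition counts
        for s in member:
            pi[s[0]] += 1
            for a, b in zip(s[:-1], s[1:]):
                A[a, b] += 1
        starts.append(pi / pi.sum())
        trans.append(A / A.sum(axis=1, keepdims=True))
    return np.array(weights), np.array(starts), np.array(trans)

if __name__ == "__main__":
    # toy activity sequences encoded as integer states (illustrative only)
    seqs = [[0, 1, 1, 2, 1], [0, 0, 1, 0, 2], [2, 2, 1, 2, 2], [1, 2, 2, 2, 1]]
    w, pi, A = kem_initialise(seqs, n_states=3, k=2)
    print("mixing weights:", w)
```

Standard EM for a mixture of first-order Markov chains would then refine these parameters; the improved prediction accuracy and convergence rate reported in the abstract refer to the thesis’ K-EM evaluated on two real datasets, which this sketch does not attempt to reproduce.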
31-Jan-2020
English
Student engagement; confidence-based assessment; Mixture Markov Model; engaged/disengaged behaviors; K-means clustering; multivariate categorical time-series
CERAVOLO, PAOLO
BOLDI, PAOLO
CERAVOLO, PAOLO
Università degli Studi di Milano
Files in this item:
File: phd_unimi_R11621.pdf (open access, 3.91 MB, Adobe PDF)

Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14242/81556
The NBN code of this thesis is URN:NBN:IT:UNIMI-81556