TY - JOUR
T1 - Cognitive and emotional engagement while learning with VR
T2 - The perspective of multimodal methodology
AU - Dubovi, Ilana
N1 - Publisher Copyright:
© 2022 The Author
PY - 2022/7
Y1 - 2022/7
N2 - This study uses a multimodal data analysis approach to provide more continuous and objective insight into how students' engagement unfolds and impacts learning achievements. In this study, 61 nursing students' learning processes with a virtual reality (VR)-based simulation were captured by psycho-physiological data streams of facial expression, eye-tracking, and electrodermal activity (EDA) sensors, as well as by subjective self-reports. Students' learning achievements were evaluated by pre- and post-test content knowledge tests. Overall, while both the facial expression and self-report modalities revealed that students experienced significantly higher levels of positive than negative emotions, only the facial expression data channel was able to detect fluctuations in engagement across the different learning session phases. Findings point towards the VR procedural learning phase as a reengaging learning activity, which induces more facial expressions of joy and triggers higher mental effort as measured by eye-tracking and EDA metrics. Most importantly, a regression analysis demonstrated that the combination of modalities explained 51% of the variance in post-test knowledge achievements. Specifically, higher levels of prior knowledge and self-reported enthusiasm, and lower levels of angry facial expressions, blink rate, and visual fixations devoted to irrelevant information, were associated with higher achievements. This study demonstrates that a methodology using multimodal data channels encompassing different types of objective and subjective measures can provide a more holistic understanding of engagement in learning and learning achievements.
KW - Electrodermal activity
KW - Engagement
KW - Eye-tracking
KW - Facial expression
KW - Virtual reality
UR - http://www.scopus.com/inward/record.url?scp=85126294832&partnerID=8YFLogxK
U2 - 10.1016/j.compedu.2022.104495
DO - 10.1016/j.compedu.2022.104495
M3 - Article
AN - SCOPUS:85126294832
SN - 0360-1315
VL - 183
JO - Computers & Education
JF - Computers & Education
M1 - 104495
ER -