
From the lab to the wild: examining generalizability of video-based mind wandering detection

Students' shift of attention away from the current learning task to task-unrelated thought, also called mind wandering, occurs during about 30% of the time spent on education-related activities. Its frequent occurrence has a negative effect on learning outcomes across learning tasks. Automated detection of mind wandering might offer an opportunity to assess the attentional state continuously and non-intrusively over time, and hence to enable large-scale research on learning materials and to respond to inattention with targeted interventions. To achieve this, an accessible detection approach that performs well across various systems and settings is required. In this work, we explore a new, generalizable approach to video-based mind wandering detection that can be transferred to naturalistic settings across learning tasks. To this end, we leverage two datasets consisting of facial videos recorded during reading in the lab (N = 135) and during lecture viewing in-the-wild (N = 15). When predicting mind wandering, deep neural networks (DNNs) and long short-term memory networks (LSTMs) achieve F1 scores of 0.44 (AUC-PR = 0.40) and 0.459 (AUC-PR = 0.39), above chance level, with latent features based on transfer learning on the lab data. When exploring generalizability by training on the lab dataset and predicting on the in-the-wild dataset, BiLSTMs on latent features perform comparably to the state of the art, with an F1 score of 0.352 (AUC-PR = 0.26). Moreover, we investigate the fairness of the predictive models across gender and show, based on post-hoc explainability methods, that the employed latent features mainly encode information on the eye and mouth areas. We discuss the benefits of generalizability and possible applications.
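To make the described pipeline concrete, the sketch below shows how per-frame latent features can be fed to a BiLSTM for binary mind-wandering classification and scored with F1 and AUC-PR, the metrics reported in the abstract. This is a minimal illustration under assumed values, not the authors' implementation: the feature dimension (256), window length (30 frames), hidden size, and the 0.5 decision threshold are all placeholders chosen for demonstration.

```python
# Illustrative sketch (assumed shapes and hyperparameters, not the paper's code):
# a BiLSTM over sequences of transfer-learned per-frame latent features,
# producing a binary mind-wandering prediction per video window.
import torch
import torch.nn as nn
from sklearn.metrics import f1_score, average_precision_score

class BiLSTMClassifier(nn.Module):
    def __init__(self, feat_dim=256, hidden_dim=64):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # Two output directions are concatenated, hence 2 * hidden_dim.
        self.head = nn.Linear(2 * hidden_dim, 1)

    def forward(self, x):             # x: (batch, frames, feat_dim)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])  # logit from the last time step

model = BiLSTMClassifier()
x = torch.randn(8, 30, 256)                 # 8 clips of 30 latent-feature frames
y = torch.randint(0, 2, (8,)).float()       # 1 = mind wandering, 0 = on-task
logits = model(x).squeeze(1)
loss = nn.BCEWithLogitsLoss()(logits, y)    # training objective

probs = torch.sigmoid(logits).detach().numpy()
print("F1:", f1_score(y.numpy(), probs > 0.5),
      "AUC-PR:", average_precision_score(y.numpy(), probs))
```

AUC-PR (average precision) is a sensible companion to F1 here because mind-wandering labels are typically imbalanced, which is consistent with the abstract reporting both metrics side by side.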

Metadata
Author:Babette Bühler, Efe Bozkir, Patricia Goldberg, Ömer Sümer, Sidney D'Mello, Peter Gerjets, Ulrich Trautwein, Enkelejda Kasneci
URN:urn:nbn:de:bvb:384-opus4-1138145
Frontdoor URL:https://opus.bibliothek.uni-augsburg.de/opus4/113814
ISSN:1560-4292
ISSN:1560-4306
Parent Title (English):International Journal of Artificial Intelligence in Education
Publisher:Springer Science and Business Media LLC
Type:Article
Language:English
Year of first Publication:2025
Publishing Institution:Universität Augsburg
Release Date:2024/07/03
Volume:35
First Page:823
Last Page:857
DOI:https://doi.org/10.1007/s40593-024-00412-2
Institutes:Fakultät für Angewandte Informatik
Fakultät für Angewandte Informatik / Institut für Informatik
Fakultät für Angewandte Informatik / Institut für Informatik / Lehrstuhl für Menschzentrierte Künstliche Intelligenz
Dewey Decimal Classification:0 Computer science, information & general works / 00 Computer science, knowledge & systems / 004 Data processing; computer science
Licence:CC-BY 4.0: Creative Commons: Attribution (with Print on Demand)