
Application of multimodal self-supervised architectures for daily life affect recognition

  • The recognition of affects (an umbrella term including but not limited to emotions, mood, and stress) in daily life is crucial for maintaining mental well-being and preventing long-term health issues. Wearable devices, such as smart bands, can collect physiological data including heart rate variability, electrodermal activity, skin temperature, and acceleration, facilitating daily life affect monitoring via machine learning models. However, accurately labeling this data for model evaluation is challenging in affective computing research, as individuals often provide subjective, inaccurate, or incomplete labels in their daily lives. This study introduces the adaptation of self-supervised learning architectures for multimodal daily life stress and emotion recognition tasks, focusing on self-representation and contrastive learning methods. By leveraging unlabeled multimodal physiological signals, we aim to alleviate the need for extensive labeled data and enhance model generalizability. Our research demonstrates that self-supervised learning can effectively learn meaningful representations from physiological data without explicit labels, offering a promising approach for developing robust affect recognition systems that can operate in dynamic and uncontrolled environments. This work represents a significant improvement in recognizing affects in the wild, with potential implications for personalized mental health support and timely interventions.
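The abstract mentions contrastive self-supervised pre-training on unlabeled multimodal physiological signals. The snippet below is a minimal illustrative sketch of that general idea (a SimCLR-style setup with an NT-Xent loss), not the authors' implementation; the modality dimensions, augmentations, and network sizes are assumptions chosen only to make the example self-contained and runnable.

```python
# Minimal sketch (not the paper's code): SimCLR-style contrastive pre-training
# on unlabeled windows of wearable sensor features (HRV, EDA, skin temperature,
# acceleration). All sizes and augmentations here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultimodalEncoder(nn.Module):
    """Maps a concatenated feature window to a shared embedding."""
    def __init__(self, in_dim=16, emb_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(),
            nn.Linear(128, emb_dim),
        )
        # Projection head used only during contrastive pre-training.
        self.proj = nn.Linear(emb_dim, 32)

    def forward(self, x):
        return self.proj(self.net(x))

def augment(x, noise=0.05):
    """Simple stochastic augmentation: random scaling plus jitter."""
    return x * (1 + noise * torch.randn_like(x)) + noise * torch.randn_like(x)

def nt_xent(z1, z2, temperature=0.5):
    """NT-Xent loss: two augmented views of the same window are positives,
    every other window in the batch is a negative."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2N, d)
    sim = z @ z.t() / temperature                         # pairwise similarities
    n = z1.size(0)
    sim.masked_fill_(torch.eye(2 * n, dtype=torch.bool), float('-inf'))
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

if __name__ == "__main__":
    torch.manual_seed(0)
    encoder = MultimodalEncoder()
    opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)
    for step in range(100):
        batch = torch.randn(32, 16)   # stand-in for unlabeled sensor windows
        loss = nt_xent(encoder(augment(batch)), encoder(augment(batch)))
        opt.zero_grad(); loss.backward(); opt.step()
    print(f"final contrastive loss: {loss.item():.3f}")
```

After pre-training, the projection head would typically be discarded and the encoder fine-tuned (or probed linearly) on the small amount of labeled stress/emotion data available from daily-life studies.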

Metadata
Author:Yekta Said Can, Mohamed Benouis, Bhargavi Mahesh, Elisabeth André
URN:urn:nbn:de:bvb:384-opus4-1254801
Frontdoor URL:https://opus.bibliothek.uni-augsburg.de/opus4/125480
ISSN:1949-3045
ISSN:2371-9850
Parent Title (English):IEEE Transactions on Affective Computing
Publisher:Institute of Electrical and Electronics Engineers (IEEE)
Place of publication:New York, NY
Type:Article
Language:English
Year of first Publication:2025
Publishing Institution:Universität Augsburg
Release Date:2025/09/24
Volume:16
Issue:3
First Page:2454
Last Page:2465
DOI:https://doi.org/10.1109/taffc.2025.3562552
Institutes:Fakultät für Angewandte Informatik
Fakultät für Angewandte Informatik / Institut für Informatik
Fakultät für Angewandte Informatik / Institut für Informatik / Lehrstuhl für Menschzentrierte Künstliche Intelligenz
Dewey Decimal Classification:0 Computer science, information & general works / 00 Computer science, knowledge & systems / 004 Data processing; computer science
Licence:CC-BY 4.0: Creative Commons: Attribution (with Print on Demand)