In the face and heart of data scarcity in Industry 5.0: exploring applicability of facial and physiological AI models for operator well-being in human-robot collaboration
Over the past decade, research has focused on integrating collaborative robots, or cobots, into assembly lines. The envisioned future industrial workplaces involve close collaboration between human workers and cobots. With the advent of Industry 5.0, human-centered approaches to facilitating human-robot collaboration (HRC) have gained significant traction. These approaches go beyond ensuring physical safety, emphasizing the mental health and well-being of industrial workers. To achieve this goal, cobots must be equipped with capabilities to detect worker states in real time. Despite various investigations into well-being-related user states in other domains, the manifestations of these states in industrial settings remain relatively unexplored. Hence, a critical gap exists in our understanding of whether machine learning models developed for other contexts are applicable to industrial HRC.
Many aspects of existing datasets pose challenges to the applicability of the machine learning models in industrial settings. On the one hand, most datasets for well-being-related states (e.g., pain, distraction) are typically small and lack variation in recording conditions, raising concerns about whether models trained on these datasets learn generic or dataset-specific features. On the other hand, although states like stress are well-researched, there are limited public datasets involving HRC tasks. This limitation is exacerbated by the lack of long-term studies involving industrial HRC tasks, limiting our understanding of worker states (e.g., boredom, flow) that emerge over a long period of familiar and repetitive tasks. These limitations of existing datasets form the motivation for the works presented in this thesis.
This thesis explores applicability through multiple lenses: transferability (leveraging features from a related task), generalizability (ensuring models perform well on multiple datasets), replicability (testing approaches on various datasets and recording conditions), reproducibility (recreating industrial HRC experiences), and versatility (utilizing features and models for multiple tasks). The investigations of this thesis are presented in two parts. The first part addresses transferability, generalizability, and replicability by applying transfer learning techniques to train various models and assessing them with explainable AI methods and cross-dataset evaluations. The second part addresses reproducibility and versatility by analyzing user studies in simulated industrial HRC scenarios with durations ranging from half an hour to several days. The results of this thesis not only demonstrate approaches for developing models applicable to industrial HRC settings but also identify potential avenues for improvement. These findings form the foundation for developing models that enhance human-robot collaboration in industrial environments by focusing on both efficiency and worker well-being.

