Full Body Interaction: Design, Implementation, and User Support

While researchers have been working on device-free gestural interaction for multiple decades, full body interaction was first truly introduced to the mass market by the Microsoft Kinect for Xbox 360 in November 2010. Full body interaction is meant as an unobtrusive and natural interaction modality: users interact with a computer through motions of the whole body, without touching, wearing, or holding any device or special gear. However, full body interaction differs in many ways from traditional interaction technologies such as the mouse and keyboard, and its lower precision and higher complexity make it difficult to provide good usability throughout the interaction. Several challenges must therefore still be addressed before full body interaction can gain broader acceptance.

In this dissertation, I investigate these challenges and present three major contributions. First, I follow a user-centered design process to create gesture sets that, on the one hand, are intuitive and easy for users to reproduce and, on the other hand, are consistent, unambiguous, and recognizable with low-cost technology. The second and main technical contribution is the Full Body Interaction (FUBI) framework, which makes it possible to integrate full body interaction into arbitrary applications using an XML-based gesture-definition language that supports powerful gesture recognition. FUBI can also be used to implement freehand interaction with a graphical user interface (GUI) or avatar control. Beyond integrating full body interaction into an application, it is equally important to support the user during the interaction. The third contribution therefore focuses on mechanisms such as affordances, feedback, and feedforward that help users understand which gestures are currently available, how to perform them, and why they may not be recognized in certain cases.

In this work, I focus mainly on virtual environments, which are especially well suited for full body interaction. To demonstrate the generalizability of my research, I further examine application scenarios in which a user controls GUIs or humanoid robots. Overall, I present concepts, implementations, and study results that provide insights into how to improve the process of creating full body interaction applications, taking into account all stakeholders of full body interaction: the interaction designer, the developer, and the end user.
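
To make the idea of declarative, rule-based gesture recognition over skeleton data more concrete, the sketch below shows how a simple posture check and a repeated-hand-raise recognizer might look in code. This is purely illustrative and does not reproduce FUBI's actual XML schema or API; the joint names, thresholds, and class names are assumptions made for the example.

```python
from dataclasses import dataclass
from typing import Dict, List

# A 3D joint position as a depth sensor (e.g. a Kinect) might report it, in metres.
@dataclass
class Joint:
    x: float
    y: float
    z: float

Skeleton = Dict[str, Joint]  # joint name -> position, e.g. "right_hand", "head"

def right_hand_above_head(skel: Skeleton, margin: float = 0.05) -> bool:
    """Posture check: right hand at least `margin` metres above the head."""
    return skel["right_hand"].y > skel["head"].y + margin

class RepeatedHandRaiseRecognizer:
    """Toy gesture recognizer: fires when the right hand crosses the head line
    (up or down) at least four times within a sliding time window, i.e. two
    full raise-and-lower cycles."""
    def __init__(self, window_seconds: float = 2.0):
        self.window = window_seconds
        self.crossings: List[float] = []  # timestamps of up/down transitions
        self.was_above = False

    def update(self, skel: Skeleton, timestamp: float) -> bool:
        above = right_hand_above_head(skel)
        if above != self.was_above:
            self.crossings.append(timestamp)
            self.was_above = above
        # Drop transitions that fall outside the time window.
        self.crossings = [t for t in self.crossings if timestamp - t <= self.window]
        return len(self.crossings) >= 4
```

In a declarative framework like the one described in the dissertation, such spatial joint relations and their combination over time would be expressed in XML rather than hard-coded; the sketch is only meant to show the kind of spatial and temporal constraints a gesture definition has to capture.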

Metadata
Author: Felix Kistler
URN: urn:nbn:de:bvb:384-opus4-34031
Frontdoor URL: https://opus.bibliothek.uni-augsburg.de/opus4/3403
Advisor: Elisabeth André
Type: Doctoral Thesis
Language: English
Publishing Institution: Universität Augsburg
Granting Institution: Universität Augsburg, Fakultät für Angewandte Informatik
Date of final exam: 2015/12/15
Release Date: 2016/01/20
Tag: human-computer interaction; gesture recognition; user-defined gestures; body gesture taxonomy; gesture visualizations
GND-Keyword: Mensch-Maschine-Kommunikation; Mustererkennung; Visualisierung
Institutes: Fakultät für Angewandte Informatik
Fakultät für Angewandte Informatik / Institut für Informatik
Dewey Decimal Classification: 0 Informatik, Informationswissenschaft, allgemeine Werke / 00 Informatik, Wissen, Systeme / 004 Datenverarbeitung; Informatik
Licence (German): Deutsches Urheberrecht