One shot crowdtesting: approaching the extremes of crowdsourced subjective quality testing

Crowdsourcing studies for subjective quality testing have become a particularly useful tool for Quality of Experience researchers. Typically, crowdsourcing studies are conducted by many unsupervised workers, who rate the perceived quality of several test conditions during one session (mixed within-subject test design). However, those studies often prove to be very sensitive, for example, to test instructions, test design, and the filtering of unreliable participants. Moreover, exposing single workers to several test conditions potentially leads to an implicit training and anchoring of ratings. Therefore, this work investigates the extreme case of presenting only a single test condition to each worker (completely between-subjects test design). The results are compared to a typical crowdsourcing study design with multiple test conditions in order to discuss training effects in crowdsourcing studies. Thus, this work investigates whether it is possible to use a simple 'one shot' design with only one rating from a large number of workers instead of sophisticated (mixed or within-subject) test designs in crowdsourcing.

Metadata
Author:Michael Seufert, Tobias Hoßfeld
URN:urn:nbn:de:bvb:384-opus4-1074207
Frontdoor URL:https://opus.bibliothek.uni-augsburg.de/opus4/107420
ISSN:2312-2846
Parent Title (English):5th ISCA/DEGA Workshop on Perceptual Quality of Systems (PQS 2016), 29-31 August 2016, Berlin, Germany
Publisher:International Speech Communication Association
Place of publication:Baixas
Editor:Sebastian Möller, Sebastian Egger
Type:Conference Proceeding
Language:English
Year of first Publication:2016
Publishing Institution:Universität Augsburg
Release Date:2023/10/11
First Page:122
Last Page:126
DOI:https://doi.org/10.21437/PQS.2016-26
Institutes:Fakultät für Angewandte Informatik
Fakultät für Angewandte Informatik / Institut für Informatik
Fakultät für Angewandte Informatik / Institut für Informatik / Lehrstuhl für vernetzte eingebettete Systeme und Kommunikationssysteme
Dewey Decimal Classification:0 Computer science, information & general works / 00 Computer science, knowledge & systems / 004 Data processing; computer science
Licence (German):Deutsches Urheberrecht (German copyright law)