The data set contains information on the stimuli used, the demographics of the participants, the rating times, and the self-reported valence-arousal data of the validation experiment.
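As an illustration only, a minimal Python sketch of how per-rating records of this kind might be parsed and summarized; the column names (`participant`, `method`, `valence`, `arousal`, `rating_time_s`) and the sample values are hypothetical and not taken from the actual data set:

```python
import csv
import io

# Hypothetical sample in the spirit of the described data set:
# one row per rating, with the assessment method used (MAM, PAM, SAM),
# the self-reported valence/arousal, and the rating time in seconds.
# Column names and values are illustrative, not the real ones.
SAMPLE = """participant,method,valence,arousal,rating_time_s
P01,MAM,6.5,4.2,8.1
P01,PAM,6.0,5.0,21.3
P02,MAM,3.1,7.8,7.4
"""

def mean_by_method(text, column):
    """Average a numeric column separately for each assessment method."""
    sums, counts = {}, {}
    for row in csv.DictReader(io.StringIO(text)):
        m = row["method"]
        sums[m] = sums.get(m, 0.0) + float(row[column])
        counts[m] = counts.get(m, 0) + 1
    return {m: sums[m] / counts[m] for m in sums}

print(mean_by_method(SAMPLE, "rating_time_s"))
```

Grouping rating times by method in this way would be one route to the kind of speed comparison between questionnaire variants reported in the abstract.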
Abstract - In this project a continuous pictographic scale for the self-reported assessment of affective states in virtual environments was developed and validated. Simply put, it is a virtual 3D character whose facial expression can be adjusted with simple controller gestures to match the perceived affective state. The chosen facial expression can then be translated into a valence and an arousal value. The spectrum of adjustable facial expressions is based on the pictographic scale of the Pick-A-Mood (PAM) questionnaire (Desmet, Vastenburg, & Romero, 2016). The developed tool, called Morph A Mood (MAM), was validated in an experiment in which the participants (N=32) watched several one-minute excerpts from music videos of the DEAP database (Koelstra et al., 2012) within a virtual environment and assessed their mood after each clip. MAM, PAM and the Self-Assessment Manikin (SAM) (Bradley & Lang, 1994) were used in alternation. PAM and SAM were administered as paper-pencil questionnaires (PPQ); SAM was additionally provided as a virtual reality questionnaire (VRQ). A comparison of all measuring methods in this experiment showed a high correlation for valence, but only a moderate one for arousal. There were no statistically significant differences between the SAM and MAM assessments collected in this experiment, but significant differences were found between the valence values of MAM and the SAM values of the DEAP database, and between the arousal values of MAM and PAM. In terms of user experience, MAM and PAM hardly differ. Assessments with the VRQ are significantly faster than assessments with the PPQ, where media devices such as headphones and display goggles must be put on and taken off.
MAM is a quick and intuitive method for assessing affective states in virtual environments.
Christian Krüger • February 2020 • hello@christiankruger.de