Moods and activities in music

Eerola, Tuomas and Saari, Pasi (2018). Moods and activities in music. [Data Collection]. Colchester, Essex: UK Data Archive. 10.5255/UKDA-SN-852024

Current approaches to the tagging of music in online databases predominantly rely on music genre and artist name, and the resulting tags are often ambiguous and inexact. Yet arguably the most salient feature of musical experiences is emotion. The few attempts made so far to tag music for mood or emotion lack a scientific foundation in emotion research. The current project proposes to incorporate recent research on music-evoked emotion into the growing number of online musical databases and catalogues, notably the Geneva Emotional Music Scale (GEMS), a rating measure for describing the emotional effects of music recently developed by our group. Specifically, the aim is to develop the GEMS into an innovative conceptual and technical tool for tagging online musical content for emotion. To this end, three studies are proposed. In Study 1, we will examine whether the GEMS labels and their grouping hold up against a much wider range of musical genres than those originally used for its development. In Study 2, we will use advanced data reduction techniques to select the most recurrent and important labels for describing music-evoked emotion. In Study 3, we will examine the added benefit of the new GEMS compared to conventional approaches to the tagging of music.

The anticipated impact of the findings is threefold. First, the research described here will advance our understanding of the nature and structure of emotions evoked by music. Developing a valid model of music-evoked emotion is crucial for meaningful research in the social sciences and the neurosciences. Second, music information organisation and retrieval can benefit from a scientifically sound and parsimonious taxonomy for describing the emotional effects of music. Searches of online music databases need no longer be confined to genre or artist, but can also incorporate emotion as a key experiential dimension of music. Third, a valid tagging scheme for emotion can assist both researchers and professionals in the choice of music to induce specific emotions. For example, psychologists, behavioural economists, and neuroscientists often need to induce emotion in their experiments to understand how behaviour or performance is modulated by emotion. Music is an obvious choice for emotion induction in controlled settings because it is a universal language that lends itself to comparisons across cultures and because it is ethically unproblematic.
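The kind of label reduction referred to for Study 2 can be illustrated with a small, hedged sketch. The rating matrix, label names, and use of principal component analysis below are illustrative assumptions for exposition only, not the project's actual data or method.

```python
# A minimal sketch, on synthetic data, of reducing a track-by-label rating
# matrix to a few components and inspecting which labels load on them.
# Label names, matrix size, and the choice of PCA are assumptions.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
labels = ["joyful", "tender", "sad", "tense", "powerful"]
ratings = rng.random((200, len(labels)))  # 200 hypothetical tracks rated on each label

pca = PCA(n_components=2)
pca.fit(ratings)

# Labels with strong loadings on the retained components are candidates for a
# reduced taxonomy; labels that load weakly are largely redundant.
loadings = pca.components_.T  # shape: (n_labels, n_components)
for label, load in zip(labels, loadings):
    print(f"{label:>8s}: {np.round(load, 2)}")
```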

Data description (abstract)

The data consist of annotations of music in terms of the moods music may express and the activities that music might fit. The data structures correspond to three annotation tasks: 1) annotations of nine activities that fit a wide range of music-related moods, 2) nominations of music tracks that best fit a particular mood, together with annotations of the activities that fit them, and 3) annotations of these nominated tracks in terms of mood and activities. Users are anonymised, but background information (gender, music preferences, age, etc.) is also available. The dataset is a relational database whose tables are linked together by common IDs (tracks, users, activities, moods, genres, expertise, language skill), as in the sketch below.
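As a rough illustration of how the relational structure might be navigated, the sketch below joins hypothetical tables on shared track and user IDs. The file names and column names are assumptions made for exposition; the actual schema is documented in the accompanying Read me.

```python
# Illustrative sketch of joining the relational tables via common IDs.
# File and column names here are hypothetical; see the dataset's Read me
# for the real table and field names.
import pandas as pd

tracks = pd.read_csv("tracks.csv")            # e.g. track_id, artist, title, genre_id
annotations = pd.read_csv("annotations.csv")  # e.g. user_id, track_id, mood_id, activity_id
users = pd.read_csv("users.csv")              # e.g. user_id, age, gender, expertise_id

# Link each annotation to the track it describes and to the anonymised annotator.
merged = (
    annotations
    .merge(tracks, on="track_id", how="left")
    .merge(users, on="user_id", how="left")
)

# Example query: number of annotations per mood, split by annotator gender.
print(merged.groupby(["mood_id", "gender"]).size().unstack(fill_value=0))
```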

Data creators:
Eerola, Tuomas; Durham University, UK; ORCID: http://orcid.org/0000-0002-2896-929X
Saari, Pasi; Durham University, UK
Sponsors: ESRC
Grant reference: ES/K00753X/1
Topic classification: Society and culture; Psychology
Keywords: music, emotion, information retrieval
Project title: Tagging online music contents for emotion. A systematic approach based on contemporary emotion research
Grant holders: Tuomas Eerola
Project dates: 1 June 2014 to 30 September 2015
Date published: 17 Dec 2015 12:20
Last modified: 16 Aug 2018 07:55

Available Files

Data
Documentation
Read me