Daly, Ian and Tymkiw, Michael and Di Giuseppantonio, Paola and Al-Taie, Inas and Williams, Duncan (2022). Sonic Enhancement of Virtual Exhibits Data, 2020-2021. [Data Collection]. Colchester, Essex: UK Data Service. DOI: 10.5255/UKDA-SN-855667
UK museums have recently embraced virtual exhibits. However, despite this rapid growth, relatively little attention has been given to how sound may be used to create a more engaging experience for audiences. That is, although museums have certainly included sound in virtual exhibits (e.g., through narrated tours or the addition of music), there is little scholarship on how sound may be harnessed to influence the length of time audiences spend looking at objects within a virtual exhibit, the emotional responses they have while viewing these objects, or the level of attention or distraction they experience. We conducted a series of experiments to develop a more rigorous understanding of how sound shapes audiences' experiences in museums.
This project builds on our team's expertise in soundtracking and the use of sound effects to influence audience engagement. We also build on our research exploring how sound may be used to modify an individual's affective state, as well as our research showing how state-of-the-art neural engineering may be used to dynamically modulate sound and music over time to optimize an "affective trajectory" (a change in felt affect in an audience over time). The project focused specifically on the use of sound effects in virtual environments, building on our previous studies of how new technologies affect the perception and understanding of heritage.
Data description (abstract)
We conducted an online experiment to explore how sound influences the interest level, emotional response, and engagement of individuals who view objects within a virtual exhibit. As part of this experiment, we designed a set of different soundscapes, which we presented to participants who viewed museum objects virtually. We then asked participants to report their felt affect and level of engagement with the exhibits.
This dataset contains the data we recorded in these experiments and used in the analysis presented in our paper entitled 'Sonic Enhancement of Virtual Exhibits'.
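For readers who download the collection, a minimal sketch of how the response data might be loaded and summarised in Python is given below. The file name and column names (`soundscape`, `valence`, `arousal`, `engagement`) are assumptions for illustration only; consult the documentation deposited with the collection for the actual file structure.

```python
import pandas as pd

# Hypothetical file and column names; check the deposited data
# dictionary for the real structure before running this.
df = pd.read_csv("sonic_enhancement_responses.csv")

# Each row is assumed to hold one participant's response to one
# object/soundscape pairing: self-reported valence, arousal, and
# engagement ratings.
print(df.head())

# Mean self-report ratings per soundscape condition.
print(df.groupby("soundscape")[["valence", "arousal", "engagement"]].mean())
```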
Data creators: Ian Daly, Michael Tymkiw, Paola Di Giuseppantonio, Inas Al-Taie, Duncan Williams
Contributors:
Sponsors: Higher Education Innovation Fund
Grant reference: N/A
Topic classification: Science and technology; Society and culture
Keywords: SOUND RECORDINGS, INTEREST (COGNITIVE PROCESSES), EMOTIONAL STATES
Project title: Neuro-curation – Sonic enhancement of virtual exhibits
Grant holders: Ian Daly, Michael Tymkiw, Paola Di Giuseppantonio
Project dates:
Date published: 30 Jun 2022 11:02
Last modified: 30 Jun 2022 11:02