Behavioural and eye-tracking, Experimental data

Jackson, Margaret (2018). Behavioural and eye-tracking, Experimental data. [Data Collection]. Colchester, Essex: UK Data Archive. 10.5255/UKDA-SN-852758

It is fundamental to normal human social interaction that we are able to read other people's faces and infer from their expression how they are feeling and what their behavioural intention towards us may be. This is particularly important when a person exhibits a facial expression such as anger, which signals hostility or aggression and threatens our physical and emotional welfare. However, facial expressions are fleeting, lasting only between 0.5 and 4 seconds, so it is essential that we clearly and accurately remember who expressed an emotion and where that person is, so that we can respond in an appropriate way. To accurately remember who was where at a particular moment in time, we need to preserve this information in a temporary, short-lived memory store called visuo-spatial working memory. Working memory is an essential component of human life. It is used during every goal-related task, behaviour, and thought, and it enables us to keep track of unfolding events from second to second. Without working memory our daily lives would be chaotic and unmanageable, so it is important to understand how this special kind of memory supports social information processing.

In the current project I will develop a novel experimental task which investigates how threatening (angry) versus non-threatening (happy, sad, fearful) expressions of emotion influence how precisely we can recall the location of faces using visuo-spatial working memory. Research has shown that angry faces attract and hold attention more rapidly and strongly than non-threatening expressions of emotion. There is also evidence that angry expressions improve the accuracy with which we use working memory to recognise a person. However, we currently have a very limited understanding of how facial expressions influence our ability to remember where that person was.

Participants will be shown a number of angry and non-threatening faces and will have to store these faces and their locations in working memory. These 'study' faces will then disappear for a second or so, after which one of them will reappear in a new location but with a neutral expression (now a 'test' face). The change from the initial emotional expression to a neutral pose in the test face thus mirrors the fleeting nature of facial expressions in real life. Using a touch-screen, participants will use their finger to reposition the test face to its original location. The precision of this response will be measured as the distance between the original and recalled locations. Eye movements will also be recorded using an eye-tracking device, in order to examine how much each face is looked at. These eye movement patterns will help determine whether the amount of attention paid to a face and its location relates to how precisely it is subsequently repositioned.

The presence of an angry expression is predicted to enhance recall precision for who was where. It is also expected that angry faces will require less attention than non-threatening faces in order for their location to be remembered accurately, because the presence of threat stimulates the brain to process this information more rapidly and efficiently. In some experiments the amount of time participants are required to hold the faces in mind immediately after their disappearance (the maintenance phase) will be increased. If memory for angry individuals is stronger and more durable over time, recall of who was where is expected to remain more precise for angry than for non-threatening faces at extended maintenance intervals.

Humans vary greatly in their sensitivity and response to social signals. Using questionnaires, each individual's social skills and anxiety levels will be assessed and compared with their memory performance and eye movement patterns, in order to determine whether some people respond differently to social threat than others.
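The key dependent measure described above, relocation precision, is simply the straight-line distance between a face's studied and recalled screen coordinates. A minimal sketch in Python (the function name and coordinate values are illustrative, not taken from the dataset):

    import math

    def relocation_error(original, recalled):
        # Euclidean distance (e.g. in screen pixels) between the studied
        # location and the location the participant dragged the test face
        # to; smaller values mean more precise visuo-spatial recall.
        (x0, y0) = original
        (x1, y1) = recalled
        return math.hypot(x1 - x0, y1 - y0)

    # Example: a face studied at (420, 310) and repositioned to (445, 298)
    print(relocation_error((420, 310), (445, 298)))  # approx. 27.7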

Data description (abstract)

Data are of three types: (1) working memory performance behavioural data, (2) eye movement data, and (3) questionnaire data. This project investigates how threatening versus non-threatening expressions of emotion differentially modulate the precision and durability of face identity-location bindings in visuo-spatial working memory. Experiments 1-4 comprise the main data and results of a visuo-spatial WM task in which participants (aged 18-40 years) were asked to remember the identity and location of between one and four faces presented on a computer screen. After a maintenance period, a test face was presented in the centre of the screen and participants had to relocate this face to its original position. Faces conveyed emotion during the encoding period, whereas the test face was neutral. Across four experiments (Experiments 1, 2, 3a/b/c, 4), we manipulated parameters such as the number of study faces (Experiment 1), the duration of the maintenance period (Experiment 2), and the type and number of emotions present at encoding (Experiments 3 and 4). An additional experiment (Experiment 0) was conducted early on to assess the influence of competing emotions at encoding on purely visual WM.
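For secondary analyses, the behavioural data might be summarised along the lines of the sketch below. The file name and column names (emotion, set_size, relocation_error) are hypothetical placeholders, not the archive's actual variable names, which are documented in the Read me file supplied with the collection.

    import pandas as pd

    # Hypothetical export of one experiment's trial-level behavioural data.
    trials = pd.read_csv("experiment1_behavioural.csv")

    # Mean relocation error by emotion condition and memory load (1-4 faces),
    # mirroring the manipulations described above.
    summary = trials.groupby(["emotion", "set_size"])["relocation_error"].mean()
    print(summary)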

Data creators:
Jackson, Margaret (University of Aberdeen)
Contributors:
Spotorno, Sara (University of Glasgow)
Poncet, Marlene (University of St Andrews)
Sponsors: Economic and Social Research Council
Grant reference: ES/L008912/1
Topic classification: Psychology
Keywords: memory
Project title: Remembering who was where: The influence of threatening emotional expressions on visuo-spatial working memory for faces
Grant holders: Margaret Jackson
Project dates: 2 March 2015 to 31 March 2017
Date published: 17 Aug 2017 15:17
Last modified: 27 Jun 2018 15:33

Available Files

Data
Documentation
Read me
