Functional magnetic resonance imaging data of brain area activity when recognising facial expressions

Furl, Nicholas (2017). Functional magnetic resonance imaging data of brain area activity when recognising facial expressions. [Data Collection]. Colchester, Essex: UK Data Archive. DOI: 10.5255/UKDA-SN-851780

Although a person's facial identity is immutable, faces are dynamic and undergo complex movements that signal critical social cues (viewpoint, eye gaze, speech movements, expressions of emotion and pain). These movements can confuse automated systems, yet humans recognise moving faces robustly.

Our objective is to discover the stimulus information, neural representations and computational mechanisms that the human brain uses when recognising social categories from moving faces. We will use human brain imaging to put an existing theory to the test. This theory proposes that changeable attributes (e.g., expression) and facial identity are recognised separately by two different brain pathways, each in a different part of the temporal lobe of the brain.

The evidence we provide might indeed support this theory and fill in many of its gaps. We expect, however, that it will instead support a new alternative theory, under which some brain areas recognise both identities and expressions using unified representations, with one of the two pathways specialised for representing movement. The successful completion of our project will thus provide a new theoretical framework, sufficient to motivate improved automated visual systems and to open new directions of research on human social perception.

Data description (abstract)

Data resulting from an experiment that used brain scanning, or functional magnetic resonance imaging (fMRI), to investigate which brain areas are active when recognising facial expressions, how those areas are connected, and how they communicate with each other.

The dataset consists of volumetric 3D brain scans, stored in a purpose-made neuroimaging file format. It also contains the information needed to analyse the data, i.e. the stimuli and their onset times. Finally, it contains participant ratings of the stimuli, collected in a behavioural testing session that followed scanning.
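
For readers who want to explore the files, the snippet below is a minimal sketch of how the scans and onset information might be inspected in Python. It assumes the volumetric scans are NIfTI files readable with the nibabel library and that onset times are stored in a tabular text file; the file and column names shown are hypothetical, not taken from the deposited data.

    import csv
    import nibabel as nib  # widely used library for volumetric neuroimaging formats

    # Load one volumetric scan (hypothetical file name; format assumed to be NIfTI).
    img = nib.load("functional_scan.nii")
    print(img.shape)               # e.g. (64, 64, 32, 300): x, y, z, volumes over time
    print(img.header.get_zooms())  # voxel dimensions in mm (plus TR for 4D images)
    data = img.get_fdata()         # the scan as a NumPy array, ready for analysis

    # Read stimulus onset times (hypothetical tab-separated layout).
    with open("stimulus_onsets.tsv") as f:
        for row in csv.DictReader(f, delimiter="\t"):
            print(row["stimulus"], row["onset_seconds"])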

Our analyses of these data are reported in the following papers:
(1) Furl N, Henson RN, Friston KJ, Calder AJ. 2013. Top-down control of visual responses to fear by the amygdala. J Neurosci 33:17435-17443.
(2) Furl N, Henson RN, Friston KJ, Calder AJ. 2015. Network interactions explain sensitivity to dynamic faces in the superior temporal sulcus. Cereb Cortex 25:2876-2882.

Data creator: Furl, Nicholas (MRC Cognition and Brain Sciences Unit, Cambridge)
Sponsors: ESRC
Grant reference: ES/I01134X/2
Topic classification: Psychology
Keywords: perception, fear, psychological research, perceptual processes
Project title: Neural Representation of the Identities and Expressions of Human Faces
Grant holders: Nicholas Furl, Richard Henson, Karl Friston, Andrew Calder
Project dates: 26 September 2011 to 1 December 2015
Date published: 12 Jan 2016 16:31
Last modified: 14 Jul 2017 10:20

Available files: Data, Documentation
