Reading Direct Speech Quotes Increases Theta Phase-locking: Evidence for Theta Tracking of Inner Speech, 2016-2019

Yao, Bo (2021). Reading Direct Speech Quotes Increases Theta Phase-locking: Evidence for Theta Tracking of Inner Speech, 2016-2019. [Data Collection]. Colchester, Essex: UK Data Service. 10.5255/UKDA-SN-854892

Written communication (e.g., emails, news reports, social media) is a major form of social information exchange in today's world. However, it is often difficult to interpret the intended meaning of a written message without hearing the prosody (the rhythm, stress, and intonation of speech) that is instrumental in understanding the writer's feelings, attitudes, and intentions. For example, a prosody-less "thank you" email can be confusing as to whether the sender is being sincere or sarcastic (Kruger et al., 2005). Emails like these are often misinterpreted as more negative or neutral than intended; such miscommunications can damage social cohesiveness and group identity within organisations and communities, thereby undermining economic performance and societal stability (Byron, 2008).

Interestingly, written words may not be entirely "silent" after all. My recent research showed that we mentally (or covertly) simulate speech prosody (or "inner voices") during silent reading of written direct quotations (Mary gasped: "This dress is beautiful!") as if we were hearing someone speak (Yao et al., 2011, 2012). For example, Yao and colleagues (2011) observed that silent reading of direct quotations elicited higher neural activity in voice-selective areas of the auditory cortex than silent reading of meaning-equivalent indirect speech (Mary gasped that the dress was beautiful.). Can such covert prosody compensate for the lack of overt speech prosody in written language and thus enhance written communication? To address this question, the proposed project will systematically examine the nature (is covert prosody sound- or action-based?), mechanisms (what information processing systems are engaged?) and emotional consequences (does covert prosody induce emotions and thereby influence behaviour?) of covert prosodic processing in silent reading of written direct quotations.
Theoretically motivated by working neural models of "overt" emotional prosodic processing in speech (e.g., Schirmer & Kotz, 2006), the current proposal will probe "where" and "when" in the brain covert prosodic cues of various natures are mentally simulated and integrated into coherent covert prosodic representations, and how these representations consequently induce emotional responses and aid in inferring the quoted speaker's mental state. Using complementary neuroimaging techniques, it will localise the neural substrates of the systems engaged in covert emotional prosodic processing (fMRI), specify the time courses of the information processes within these systems (EEG, MEG), and integrate this information into a unified spatio-temporal neural model of covert emotional prosodic processing. The findings of this project have clear implications for the theoretical development of emotional prosody-based social communication, embodied cognition, and speech pragmatics, and will be of interest to all written language users (e.g., communication-based enterprises, social services, and the wider public). This research also has potential impact on early language education and the diagnosis of Parkinson's disease (PD). For example, understanding direct quotations requires the reader to take the quoted speaker's perspective and attribute emotions and mental states to them. A quotation-rich teaching method may thus effectively enhance children's Theory of Mind ability (the ability to attribute mental states), which is crucial to their cognitive development and social cognition. Moreover, PD patients may struggle to simulate covert emotional prosody due to their motor (articulation) dysfunction. Consequently, they may have difficulty understanding figurative speech quotations (e.g., they may not detect the sarcasm in: She rolled her eyes, grumbling: "What a sunny day!"). This research could thus motivate the development of a low-cost quotation-based diagnostic tool for monitoring PD progression.

Data description (abstract)

Growing evidence shows that theta-band (4-7Hz) activity in the auditory cortex phase-locks to rhythms of overt speech. Does theta activity also encode the rhythmic dynamics of inner speech? Previous research established that silent reading of direct speech quotes (e.g., Mary said: “This dress is lovely!”) elicits more vivid inner speech than indirect speech quotes (e.g., Mary said that the dress was lovely). As we cannot directly track the phase alignment between theta activity and inner speech over time, we used EEG to measure the brain’s phase-locked responses to the onset of speech quote reading. We found that direct (vs. indirect) quote reading was associated with increased theta phase synchrony over trials at 250-500 ms post-reading onset, with sources of the evoked activity estimated in the speech processing network. An eye-tracking control experiment confirmed that increased theta phase synchrony in direct quote reading was not driven by eye movement patterns, and more likely reflects synchronous phase resetting at the onset of inner speech. These findings suggest a functional role of theta phase modulation in reading-induced inner speech.
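The "theta phase synchrony over trials" described above is commonly quantified as inter-trial phase coherence (ITPC): the resultant length of single-trial theta phases at each time point, which approaches 1 when an event (here, quote-reading onset) resets theta phase consistently across trials. The dataset's actual analysis pipeline is not specified on this page; the following is a minimal illustrative sketch in NumPy/SciPy, with a synthetic "phase reset" signal of my own construction (the function name `itpc` and all parameters are assumptions, not the authors' code).

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def itpc(trials, fs, band=(4.0, 7.0)):
    """Inter-trial phase coherence per time point.

    trials : (n_trials, n_samples) array of single-trial EEG.
    Returns an (n_samples,) array in [0, 1]: 1 = perfectly
    phase-locked across trials, near 0 = random phases.
    """
    # Theta band-pass (zero-phase), then instantaneous phase via Hilbert.
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    phases = np.angle(hilbert(filtfilt(b, a, trials, axis=1), axis=1))
    # Resultant length of unit phase vectors, averaged over trials.
    return np.abs(np.exp(1j * phases).mean(axis=0))

# Synthetic demo: 40 trials, 1 s at 250 Hz, with a 5 Hz component whose
# phase is aligned across trials only after t = 0.25 s (simulated reset).
rng = np.random.default_rng(0)
fs, n_trials, n_samples = 250, 40, 250
t = np.arange(n_samples) / fs
trials = rng.standard_normal((n_trials, n_samples))
locked = t >= 0.25
for k in range(n_trials):
    trials[k, locked] += np.cos(2 * np.pi * 5 * t[locked])       # same phase
    trials[k, ~locked] += np.cos(2 * np.pi * 5 * t[~locked]
                                + rng.uniform(0, 2 * np.pi))     # random phase

coherence = itpc(trials, fs)
print(coherence[t < 0.2].mean() < coherence[t > 0.3].mean())     # higher after "reset"
```

In practice, toolboxes such as MNE-Python compute the same quantity from Morlet wavelet decompositions, but the core idea is the one shown: phase-locked (evoked) activity raises the resultant length, whereas phase-random (induced) activity does not.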

Data creators:
Creator name: Yao, Bo
Email: Bo.Yao@manchester.ac.uk
Affiliation: University of Manchester
ORCID (as URL): Unspecified
Sponsors: Economic and Social Research Council
Grant reference: ES/N002784/1
Topic classification: Psychology
Keywords: READING COMPREHENSION, SPEECH, COGNITION
Project title: When words speak off the page: Covert emotional prosodic processing in silent reading of direct quotations
Grant holders: Bo Yao
Project dates:
From: 7 March 2016
To: 31 December 2019
Date published: 27 May 2021 14:34
Last modified: 29 Jun 2021 12:25

Available Files

Data and documentation bundle

Documentation
