Attitudes Towards Emotional Artificial Intelligence Use: Transcripts of Citizen Workshops Collected Using an Innovative Narrative Approach, 2021

Laffer, Alexander (2023). Attitudes Towards Emotional Artificial Intelligence Use: Transcripts of Citizen Workshops Collected Using an Innovative Narrative Approach, 2021. [Data Collection]. Colchester, Essex: UK Data Service. 10.5255/UKDA-SN-855688

CONTEXT

Emotional AI (EAI) technologies sense, learn and interact with citizens' emotions, moods, attention and intentions. Using weak and narrow rather than strong AI, machines read and react to emotion via text, images, voice, computer vision and biometric sensing. Concurrently, life in cities is increasingly technologically mediated. Data-driven sensors, actuators, robots and pervasive networking are changing how citizens experience cities, but not always for the better. Citizen needs and perspectives are often ancillary in emerging smart city deployments, resulting in mistrust in new civic infrastructure and its management (e.g. Alphabet's Sidewalk Labs). We need to avoid these issues repeating as EAI is rolled out in cities.

Reading the body is an increasingly prevalent concern, as recent pushback against facial detection and recognition technologies demonstrates. EAI is an extension of this, and as it becomes normalised across the next decade we are concerned about how these systems are governed, their social impacts on citizens, and how EAI can be designed in a more ethical manner. In both Japan and the UK, we are at a critical juncture where these social, technological and governance structures can be appropriately prepared before mass adoption of EAI, to enable citizens, in all their diversity, to live ethically and well with EAI in cities-as-platforms. Building on our ESRC/AHRC seminars in Tokyo (2019) that considered cross-cultural ethics and EAI, our research will enable a multi-stakeholder (commerce, security, media) and citizen-led interdisciplinary response to EAI for Japan and the UK. While these are two of the most advanced nations in regard to AI, the social contexts and histories from which these technologies emerge differ, providing rich scope for reflection and mutual learning.

AIMS/OBJECTIVES

1. To assess what it means to live ethically and well with EAI in cities in cross-cultural (UK-Japan) commercial, security and media contexts.
2. To map and engage with the ecology of influential actors developing and working with EAI in UK-Japan.
3. To understand commercial activities, intentions and ethical implications regarding EAI in cities, via interviews with industry, case studies, and analysis of patents.
4. To ascertain how EAI might impact security/policing stakeholders, and organisations in the new media ecology, via interviews with these stakeholders and case studies in UK-Japan.
5. To examine governance approaches for the collection and use of intimate data about emotions in public spaces, to understand how these guide EAI technological developments, and to build a repository of best practice on EAI in cities.
6. To understand diverse citizens' attitudes to EAI via quantitative national surveys and qualitative workshops to co-design citizen-led, creative visions of what it means to live ethically and well with EAI in cities in UK-Japan.
8. To feed our insights to stakeholders shaping usage of EAI in cities in UK-Japan.
9. To advance surveillance studies, new media studies, information technology law, science & technology studies, security & policing studies, computer ethics and affective computing via: 24 international conference papers; a conference on EAI; 12 international, refereed journal papers; and a Special Issue on EAI.
APPLICATIONS/BENEFITS

We will:
- Raise awareness among UK-Japanese stakeholders (technology industry, policymakers, NGOs, security services, urban planners, media outlets, citizens) of how to live ethically and well with EAI in cities, via co-designed, citizen-led, qualitative visions fed into Stakeholder Policy Workshops; a Final Report with clear criteria on ethical usage of EAI in cities; 24 talks with stakeholders; and multiple news stories.
- Set up a think tank to provide impartial ethical advice on EAI and cross-cultural issues to diverse stakeholders during and after the project.
- Advance collaboration between UK-Japan academics, disciplines and stakeholders in EAI.

Data description (abstract)

The data were collected during citizen workshops, conducted online via Zoom, exploring attitudes towards emotional artificial intelligence (EAI) use. EAI is the use of affective computing and AI techniques to try to sense and interact with human emotional life, ranging from monitoring emotions through biometric data to more active interventions.

Ten sets of participants (n=46) were recruited for the following groups:
3 older (65+) groups: n=13
3 younger (18-34) groups: n=12
2 groups of people self-identifying as disabled: n=10
2 groups of members of UK ethnic minorities: n=11

There was an attempt to balance other demographic categories where possible. Participants were grouped by age, as this has been shown to be the biggest indicator of differences in attitude towards emotional AI (Bakir & McStay, 2020; McStay, 2020). It was also considered important to include the views of those who have traditionally been ignored in the development of technology or suffered further discrimination through its use, so the opinions and perspectives of minority groups and disabled people were sought. Participants were recruited through a research panel for the workshops, which took place in August 2021.

A novel narrative approach was used, with participants taken through a piece of interactive fiction (developed using Twine, viewable here: https://eaitwine.neocities.org/), a day-in-the-life story of a protagonist encountering seven mundane use cases of emotional AI, each structured as: a) a neutral introduction to the technology; b) a binary choice involving the use of the technology; c) a ContraVision component demonstrating positive and negative events/outcomes (an illustrative sketch of this structure follows the reference list below). The use cases were:
• Home-hub smart assistant
• Bus station surveillance sensor
• Social media fake news/disinformation and profiling
• Spotify music recommendations (using voice and ambient data)
• Sales call evaluation and prompt tool
• Emotoy that collects and responds to children's emotional data
• Hire car in-cabin customisation and driving support

Each workshop lasted 2 hours. Audio files were transcribed using a transcription service before being corrected and formatted by a project researcher.

References:
Bakir, V., & McStay, A. (2020). Profiling & Targeting Emotions in Digital Political Campaigns. Briefing Paper for the All Party Parliamentary Group on Electoral Campaigning Transparency.
McStay, A. (2020). Emotional AI, soft biometrics and the surveillance of emotional life: An unusual consensus on privacy. Big Data & Society, 7(1), 1–12. https://doi.org/10.1177/2053951720904386
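To make the a) / b) / c) structure of each use case concrete, the following is a minimal, hypothetical sketch in Python. It is illustrative only: the class, field names and example text are invented for this record, and the actual story was authored in Twine (linked above), not in Python.

# Illustrative sketch only: a hypothetical Python representation of the
# three-part structure of each use case in the interactive fiction.
# All names and example text are invented; the actual passages are in the
# Twine piece at https://eaitwine.neocities.org/.
from dataclasses import dataclass

@dataclass
class UseCase:
    name: str                   # e.g. "Home-hub smart assistant"
    introduction: str           # a) neutral introduction to the technology
    binary_choice: str          # b) binary choice involving use of the technology
    contravision_positive: str  # c) positive event/outcome shown to participants
    contravision_negative: str  # c) negative event/outcome shown to participants

# Hypothetical example instance (text invented, not taken from the workshops)
home_hub = UseCase(
    name="Home-hub smart assistant",
    introduction="A smart speaker in the protagonist's kitchen can infer mood from voice tone.",
    binary_choice="Does the protagonist switch on mood detection?",
    contravision_positive="The hub plays calming music after a stressful day.",
    contravision_negative="Inferred mood data is used to target advertising.",
)

for use_case in (home_hub,):
    print(use_case.name, "->", use_case.binary_choice)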

Data creators:
Creator: Laffer, Alexander
Affiliation: Bangor University
ORCID (as URL): https://orcid.org/0000-0003-2463-9135
Sponsors: Economic and Social Research Council
Grant reference: ES/T00696X/1
Topic classification: Media, communication and language
Law, crime and legal systems
Science and technology
Transport and travel
Labour and employment
Society and culture
Keywords: ATTITUDES, INFORMATION AND COMMUNICATIONS TECHNOLOGY, INTELLIGENCE, EMOTIONAL STATES, AGE
Project title: Emotional AI in Cities: Cross Cultural Lessons from UK and Japan on Designing for An Ethical Life
Grant holders: Andrew John McStay
Project dates:
From: 1 January 2020
To: 1 September 2032
Date published: 04 May 2022 10:54
Last modified: 02 Jun 2023 13:59

Available Files

Data
Documentation
Read me
