The Reading Everyday Emotion Database (REED): a set of audio-visual recordings of emotions in music and language

Ong, J.H. (ORCID: https://orcid.org/0000-0003-1503-8311), Leung, F.Y.N. and Liu, F., 2025. The Reading Everyday Emotion Database (REED): a set of audio-visual recordings of emotions in music and language. Language Resources and Evaluation, 59 (1), pp. 27-49. ISSN 1574-020X

Full text not available from this repository.

Abstract

Most audio-visual (AV) emotion databases consist of clips that do not reflect real-life emotion processing (e.g., professional actors recorded in bright, studio-like environments), contain only spoken clips, and none include sung clips that express complex emotions. Here, we introduce a new AV database, the Reading Everyday Emotion Database (REED), which directly addresses those gaps. We recorded the faces of everyday adults with a diverse range of acting experience expressing 13 emotions—neutral, the six basic emotions (angry, disgusted, fearful, happy, sad, surprised), and six complex emotions (embarrassed, hopeful, jealous, proud, sarcastic, stressed)—in two auditory domains (spoken and sung) using everyday recording devices (e.g., laptops, mobile phones). The recordings were validated by an independent group of raters. We found that intensity ratings of the recordings were positively associated with recognition accuracy, and that the basic emotions, as well as the Neutral and Sarcastic emotions, were recognised more accurately than the other complex emotions. Emotion recognition accuracy also differed by utterance. Exploratory analysis revealed that recordings of participants with drama experience were better recognised than those of participants without. Overall, this database will benefit those who need AV clips with natural variations in both emotion expressions and recording environment.

Item Type: Journal article
Publication Title: Language Resources and Evaluation
Creators: Ong, J.H., Leung, F.Y.N. and Liu, F.
Publisher: Springer Science and Business Media LLC
Date: March 2025
Volume: 59
Number: 1
ISSN: 1574-020X
Identifiers:
DOI: 10.1007/s10579-023-09698-5
Other: 2418457
Rights: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Divisions: Schools > School of Social Sciences
Record created by: Jonathan Gallacher
Date Added: 02 Apr 2025 07:16
Last Modified: 02 Apr 2025 07:16
URI: https://irep.ntu.ac.uk/id/eprint/53346
