The integration of continuous audio and visual speech in a cocktail-party environment depends on attention

Ahmed, F., Nidiffer, A.R., O'Sullivan, A.E., Zuk, N. ORCID: 0000-0002-2466-6718 and Lalor, E.C., 2023. The integration of continuous audio and visual speech in a cocktail-party environment depends on attention. NeuroImage, 274: 120143. ISSN 1053-8119

Full text not available from this repository.

Abstract

In noisy environments, our ability to understand speech benefits greatly from seeing the speaker's face. This is attributed to the brain's ability to integrate audio and visual information, a process known as multisensory integration. In addition, selective attention plays an enormous role in what we understand, the so-called cocktail-party phenomenon. But how attention and multisensory integration interact remains incompletely understood, particularly in the case of natural, continuous speech. Here, we addressed this issue by analyzing EEG data recorded from participants who undertook a multisensory cocktail-party task using natural speech. To assess multisensory integration, we modeled the EEG responses to the speech in two ways. The first assumed that audiovisual speech processing is simply a linear combination of audio speech processing and visual speech processing (i.e., an A + V model), while the second allowed for the possibility of audiovisual interactions (i.e., an AV model). Applying these models to the data revealed that EEG responses to attended audiovisual speech were better explained by an AV model, providing evidence for multisensory integration. In contrast, unattended audiovisual speech responses were best captured using an A + V model, suggesting that multisensory integration is suppressed for unattended speech. Follow-up analyses revealed some limited evidence for early multisensory integration of unattended AV speech, with no integration occurring at later levels of processing. We take these findings as evidence that the integration of natural audio and visual speech occurs at multiple levels of processing in the brain, each of which can be differentially affected by attention.
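The A + V versus AV model comparison described in the abstract can be illustrated with a toy sketch. The simulated "EEG" signal, the feature names, and the use of plain least-squares regression are all illustrative assumptions, not the paper's actual method (the authors fit temporal response functions to real speech features); the sketch only shows the logic of the comparison: if a model with an audiovisual interaction term predicts the response better than an additive model, that is taken as evidence of integration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
audio = rng.standard_normal(n)   # toy stand-in for an audio speech feature
visual = rng.standard_normal(n)  # toy stand-in for a visual speech feature

# Toy "EEG": additive audio and visual responses plus an AV interaction
eeg = 0.6 * audio + 0.4 * visual + 0.5 * audio * visual \
    + 0.1 * rng.standard_normal(n)

def fit_predict(X, y):
    """Ordinary least squares fit; returns the model's predictions."""
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ w

# A + V model: a purely linear combination of the unisensory features
X_add = np.column_stack([audio, visual])
pred_add = fit_predict(X_add, eeg)

# AV model: additionally allows an audiovisual interaction term
X_av = np.column_stack([audio, visual, audio * visual])
pred_av = fit_predict(X_av, eeg)

def corr(a, b):
    """Pearson correlation between prediction and signal."""
    return np.corrcoef(a, b)[0, 1]

r_add = corr(pred_add, eeg)
r_av = corr(pred_av, eeg)
print(f"A+V model r = {r_add:.3f}, AV model r = {r_av:.3f}")
```

Because the simulated signal contains an interaction, the AV model's prediction correlates more strongly with the "EEG" than the A + V model's, mirroring the pattern the paper reports for attended speech; for unattended speech, the two models would perform comparably.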

Item Type: Journal article
Publication Title: NeuroImage
Creators: Ahmed, F., Nidiffer, A.R., O'Sullivan, A.E., Zuk, N. and Lalor, E.C.
Publisher: Elsevier
Date: July 2023
Volume: 274
ISSN: 1053-8119
Identifiers:
DOI: 10.1016/j.neuroimage.2023.120143
Publisher Item Identifier: S105381192300294X
Other: 1827164
Rights: This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
Divisions: Schools > School of Social Sciences
Record created by: Jonathan Gallacher
Date Added: 01 Nov 2023 10:17
Last Modified: 01 Nov 2023 10:17
URI: https://irep.ntu.ac.uk/id/eprint/50182
