Information dynamics underlying speech perception
Poster Session B, Friday, September 12, 4:30 - 6:00 pm, Field House
This poster is part of the Sandbox Series.
Chantal Oderbolz1, Joan Orpella1; 1Georgetown University Medical Center
The analysis of the speech signal unfolds rapidly within a distributed network of brain regions. These (often functionally specialized) brain regions extract critical acoustic and linguistic features. An important question concerns how the brain (spatially and temporally) orchestrates the processing of these various kinds of information for successful speech perception, that is, what the underlying information dynamics in the speech and language network are. We use information-theoretic measures to characterize and quantify these dynamics (Francis et al., 2022). First, we validate our methods by applying them to fMRI data from a binaural integration task (Preisig et al., 2021). In this task, an ambiguous speech sound (a CV syllable midway between /da/ and /ga/) is presented to the right ear while a disambiguating spectral feature (the third formant, F3) is presented to the left ear. Because successful completion of this task (i.e., the integration of unique acoustic features presented to either ear) is fully reflected in a listener's behavior (i.e., the correct identification of a CV syllable), the task allows us to accurately characterize how the dynamics of information processing inform perception. Binaural integration was disrupted in half of the trials by means of transcranial direct current stimulation, giving us additional insight into the (in)dispensability of this relationship. Next, we expand our focus of inquiry to investigate the information dynamics underlying the perceptual classification of a fundamental linguistic feature: sentence type. Acoustically, the distinction between a question and a statement is strongly cued by a rising or a falling fundamental frequency (f0) contour, respectively (Studdert-Kennedy and Hadding, 1973). We analyze source-localized magnetoencephalography data from a recent experiment in which participants listened to and classified single words according to sentence type (Oderbolz, Orpella, and Meyer, submitted). The words' f0 contours were manipulated to create a continuum between a prototypical question and a prototypical statement. In a forthcoming paper, we show that the behaviorally and perceptually relevant representation of the sentence-type distinction lies in the left frontocentral operculum and anterior insula (Op/aINS). We now reveal that the left Op/aINS receives a high degree of unique information from the right hemisphere, especially from regions in frontal and temporal cortex, in line with the right hemisphere's well-established functional specialization for slowly changing acoustic signals (Oderbolz, Poeppel, and Meyer, 2025). In turn, the left early auditory and inferior frontal cortex contribute redundant as well as synergistic information. Together, these information dynamics interact to produce the neural representation of sentence type that informs participants' behavior and perception. Conceptualizing information as a dynamic and decomposable entity by means of information theory provides the level of nuance needed to better understand how complex cognition and behavior arise. Although we apply these measures to questions relevant to speech and language, we emphasize their potential for assessing information dynamics in both basic and clinical research, across a wide range of fields.
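The unique, redundant, and synergistic contributions described above are the terms of a partial information decomposition (PID). The abstract does not specify the estimators used (see Francis et al., 2022 for the framework the authors draw on), so the sketch below is only a hypothetical illustration: a Gaussian PID for two sources and one target using the common minimum-mutual-information (MMI) redundancy. The function names, the MMI choice, and the toy data are assumptions, not the authors' pipeline.

```python
# Minimal sketch (assumed, not the authors' method): Gaussian PID with the
# minimum-mutual-information (MMI) redundancy for two sources and one target.
import numpy as np

def gaussian_mi(cov, idx_x, idx_y):
    """Mutual information I(X;Y) in nats for jointly Gaussian variables,
    computed from the joint covariance matrix."""
    xy = idx_x + idx_y
    det = np.linalg.det
    return 0.5 * np.log(det(cov[np.ix_(idx_x, idx_x)]) *
                        det(cov[np.ix_(idx_y, idx_y)]) /
                        det(cov[np.ix_(xy, xy)]))

def pid_mmi(x1, x2, y):
    """Decompose I(X1,X2;Y) into redundant, unique, and synergistic parts
    under the MMI redundancy, assuming (approximately) Gaussian signals."""
    cov = np.cov(np.vstack([x1, x2, y]))   # rows: X1, X2, Y
    i1  = gaussian_mi(cov, [0], [2])       # I(X1;Y)
    i2  = gaussian_mi(cov, [1], [2])       # I(X2;Y)
    i12 = gaussian_mi(cov, [0, 1], [2])    # I(X1,X2;Y)
    red = min(i1, i2)                      # redundancy
    return {
        "redundant":   red,
        "unique_x1":   i1 - red,           # unique information in X1
        "unique_x2":   i2 - red,           # unique information in X2
        "synergistic": i12 - i1 - i2 + red # synergy
    }

# Toy example: the target depends on both sources, yielding nonzero terms.
rng = np.random.default_rng(0)
x1, x2 = rng.standard_normal(5000), rng.standard_normal(5000)
y = x1 + x2 + 0.5 * rng.standard_normal(5000)
print(pid_mmi(x1, x2, y))
```

In an analysis like the one sketched here, the "sources" could be region-wise neural time series (e.g., right temporal cortex and left auditory cortex) and the "target" the signal in left Op/aINS or the behavioral classification, but that mapping is illustrative rather than drawn from the abstract.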
Topic Areas: Speech Perception