Investigation of speech-induced sensory modulation using multimodal neuroimaging
Poster Session D, Saturday, September 13, 5:00 - 6:30 pm, Field House
This poster is part of the Sandbox Series.
Vaishak Harish1, Kevin Sitek2, Akshat Dhamale1, Tamer Ibrahim1, Jason Bohland1; 1University of Pittsburgh, 2Northwestern University
Fluent speech requires the brain to predict the sensory consequences of self-produced sounds. A key marker of this predictive coding is speech-induced sensory modulation (SISM), typically indexed by suppression of auditory cortical responses during speech relative to passive listening¹. Although SISM has been studied with EEG/MEG and, to a lesser extent, electrocorticography, its anatomical specificity, underlying mechanisms, and cerebellar contributions remain unclear. This study combines ultra-high field 7T fMRI and EEG to investigate the neural substrates of SISM, including the cerebellum’s role in predictive coding.

Participants (right-handed English speakers, aged 18–49) complete a multi-session protocol including hearing assessments, EEG, and 7T fMRI. During EEG, they produce vocalic syllables (“ah”) and passively listen to matched recordings. Event-related potentials time-locked to sound onset are compared across conditions to assess auditory suppression (e.g., N1 attenuation). In fMRI, participants alternate between producing and listening to individual syllables and full sentences. During production blocks, stimuli are visually cued and spoken aloud. In listening blocks, participants hear either their own productions (LISTEN-SELF) or matched recordings from other speakers (LISTEN-OTHER), enabling comparison across predictive contexts. Speech is recorded and played back with minimal delay using MRI-compatible audio systems. T1-weighted MP2RAGE images are collected at 0.55 mm resolution, and BOLD fMRI is acquired at 1.5 mm isotropic resolution using a multiband EPI sequence and a custom Tic-Tac-Toe RF coil.

We hypothesize that auditory cortical regions, including the planum temporale, posterior superior temporal gyrus, and posterior superior temporal sulcus, will show distinct modulation patterns. We expect both increased and decreased BOLD responses during speech, consistent with intracranial findings². While EEG suppression is often linked to cortical inhibition, corresponding fMRI changes may vary with neurovascular dynamics. By comparing EEG and fMRI within subjects, we aim to determine whether suppressed evoked potentials reflect localized BOLD increases, decreases, or mixed patterns.

This study also examines the cerebellum’s role in speech prediction. The cerebellum may implement forward models that generate inhibitory predictions for expected feedback³⁻⁴. We expect speech-related activity in Lobule VI and will test whether cerebellar BOLD signal predicts auditory cortical modulation. Subset analyses using 0.8 mm scans will explore depth-dependent modulation and cerebellar-cortical connectivity. To date, 7T fMRI has rarely been used in speech production studies; its enhanced resolution and signal-to-noise ratio provide a unique opportunity to investigate predictive mechanisms at the individual-subject level⁵. We present this work as part of the Sandbox Series to solicit input on the design, analysis, and interpretation of this multimodal investigation of cortical and cerebellar involvement in SISM.
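As a concrete illustration of the EEG comparison described above, the following is a minimal sketch of an N1 suppression analysis in MNE-Python, assuming preprocessed epochs time-locked to sound onset. The file name, condition labels, channel picks, and N1 window are illustrative assumptions, not the authors' actual pipeline.

```python
# Hypothetical sketch of a speech-induced N1 suppression analysis (MNE-Python).
import mne

# Assumed preprocessed epochs with two conditions:
# "SPEAK" = self-produced "ah", "LISTEN" = playback of matched recordings.
epochs = mne.read_epochs("sub-01_task-sism_epo.fif")  # hypothetical file name

evoked_speak = epochs["SPEAK"].average()
evoked_listen = epochs["LISTEN"].average()

# The auditory N1 is typically maximal over fronto-central sites ~80-120 ms
# after sound onset; channel names and window here are assumptions.
picks = ["Fz", "FCz", "Cz"]
tmin, tmax = 0.080, 0.120

def n1_amplitude(evoked, picks, tmin, tmax):
    """Mean amplitude (microvolts) over the N1 window and channel subset."""
    data = evoked.copy().pick(picks).crop(tmin, tmax).data  # (n_channels, n_times)
    return data.mean() * 1e6

n1_speak = n1_amplitude(evoked_speak, picks, tmin, tmax)
n1_listen = n1_amplitude(evoked_listen, picks, tmin, tmax)

# Suppression: the N1 is less negative during speaking than during passive
# listening, so (speak - listen) should be positive if SISM is present.
print(f"N1 speak:  {n1_speak:.2f} uV")
print(f"N1 listen: {n1_listen:.2f} uV")
print(f"Suppression (speak - listen): {n1_speak - n1_listen:.2f} uV")
```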
References:
1) Ford, J. M., Roach, B. J., & Mathalon, D. H. (2010). Assessing corollary discharge in humans using noninvasive neurophysiological methods.
2) Eliades, S. J., & Wang, X. (2008). Neural substrates of vocalization feedback monitoring in primate auditory cortex. Nature.
3) Knolle, F., Schröger, E., Baess, P., & Kotz, S. A. (2012). The cerebellum generates motor-to-auditory predictions: ERP lesion evidence.
4) Parrell, B., Agnew, Z., Nagarajan, S., Houde, J., & Ivry, R. B. (2017). Impaired feedforward control and enhanced feedback control of speech in patients with cerebellar degeneration.
5) de Martino, F., et al. (2018). The impact of ultra-high field MRI on cognitive and computational neuroimaging.
Topic Areas: Speech Motor Control, Multisensory or Sensorimotor Integration