Beyond core language: Prosody and pragmatics
Sunday, September 14, 3:45 - 5:45 pm, Elstad Auditorium
Organizer: Anna Seydell-Greenwald1; 1Georgetown University Medical Center
Presenters: Tamar Regev, Seung-Cheol Baek, Alexandra Zezinka Durfee, Anna Seydell-Greenwald, Jamila Minga
Since the days of Broca and Wernicke, “language” has been associated predominantly with the brain’s left hemisphere. Challenging this one-sided view, this symposium will highlight aspects of language for which the right hemisphere plays an equally important and possibly dominant role: pragmatics, with a particular focus on prosody. How, and in what context, something is said carries crucial cues to the meaning of what is said, and a failure to account for this can have devastating effects on communication (and sometimes, research design). Speakers with backgrounds in linguistics, cognitive neuroscience, and speech-language pathology will illuminate this topic from different angles, covering the spectrum from basic to clinical in individual talks, followed by an interactive discussion with the audience. This symposium should be of broad interest to SNL attendees of all training backgrounds and at all career levels, and we hope that it will inspire many future collaborations.
Presentations
A network of brain areas is selective for prosody—the melody of speech
Tamar Regev1, Hee So Kim1, Niharika Jhingan1, Sara Swords1, Hope Kean1, Colton Casto1, Evelina Fedorenko1; 1Massachusetts Institute of Technology
Human speech carries information beyond the phonetic content: pitch, loudness, and timing cues—jointly referred to as ‘prosody’—can emphasize parts of the linguistic message and convey emotional and other social information. Here, we develop an fMRI paradigm to identify prosody-sensitive brain areas and then richly characterize these areas across 9 experimental tasks (25 experimental conditions; n=51 participants). We find that prosody-sensitive areas are located on the lateral temporal and lateral frontal cortical surface bilaterally, in proximity to lower-level auditory perception areas, amodal language-processing areas, and areas that process visual social cues; crucially, however, we show that the prosody-sensitive areas are dissociable from all of these known areas. Thus, in adult brains, speech prosody is processed by a distinct set of areas, although their proximity to other communication-relevant areas suggests that at an earlier developmental and, perhaps, evolutionary stage, prosody-processing areas may overlap with linguistic or general social areas.
Representational dynamics of speech prosody in ventral and dorsal speech streams
Seung-Cheol Baek1,2,3, Seung-Goo Kim1, Burkhard Maess4, Maren Grigutsch4, Daniela Sammler1,4; 1Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany, 2Philipps-University Marburg, Germany, 3Center for Mind, Brain, and Behavior, University of Marburg and Gießen, Germany, 4Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
Speech prosody, often characterized by specific pitch patterns, conveys a speaker’s meaning, for example, distinguishing a question from a statement. Although the neuroanatomical basis of prosody is well established in frontotemporal networks with right-hemispheric predominance, a more fundamental question remains unanswered: How is prosodic information encoded in these networks? Using magnetoencephalography and behavioral psychophysics, combined with time-resolved representational similarity and multivariate transfer entropy analyses, we examined how prosodic information is abstracted from acoustic to categorical representations across space and time. We found serial, yet joint and distributed representations of prosodic acoustics and categories bilaterally across the ventral auditory stream. Prosodic information additionally propagated along the dorsal stream, where the right premotor cortex exhibited only categorical, not acoustic representations. Altogether, our findings demonstrate the representational dynamics of prosody in ventral and dorsal speech streams, where the dorsal stream contributes particularly to prosodic category formation—key to understanding a speaker’s meaning.
Amalgamating classic and modern taxonomies of affective aprosodia to refine post-stroke language rehabilitation
Alexandra Zezinka Durfee1,2, Shannon M. Sheppard2, Argye E. Hillis2; 1Towson University, 2Johns Hopkins University
Classic affective aprosodia taxonomy (e.g., sensory aprosodia) is limited in describing underlying behavioral mechanisms/deficits, which are central to modern aprosodia taxonomies (e.g., receptive aprosodia due to impaired semantic access). Therefore, associations between classic and modern aprosodia taxonomies in acute right hemisphere stroke (n=36) were investigated. Classic aprosodia taxonomy subtyped some (~20%) of the participants, who also demonstrated underlying prosodic decoding/encoding and emotional-cognitive-linguistic deficits according to modern aprosodia taxonomy. Participants unclassifiable under the classic taxonomy demonstrated varying skill profiles under the modern taxonomy, including impaired access to a proposed “affective prosody lexicon” for production and/or recognition. Preliminary lesion maps indicated posterior lesions sparing inferior frontal and temporoparietal regions in receptive-only aprosodias, inferior frontal and temporoparietal lesions in expressive-only aprosodias, and deep rolandic operculum and posterior parietal damage in receptive + expressive aprosodias. Unclassifiable participants demonstrated the greatest lesion overlap in subcortical/deep nuclei. Discussion includes implications of combined classification for aprosodia rehabilitation.
Altered cerebral activation for prosody after right-hemisphere stroke
Anna Seydell-Greenwald1; 1Georgetown University Medical Center
This talk will report on ongoing fMRI studies investigating whether emotional prosody comprehension evokes increased left-hemisphere activation after right-hemisphere stroke in two study populations: children and adolescents who had a large right-hemisphere stroke in the middle cerebral artery territory around the time of birth, and adults who had right-hemisphere strokes of varying sizes and locations in adulthood, more than a year prior to the study. Mirroring the finding of an increased reliance on right perisylvian cortex for sentence comprehension in survivors of large perinatal strokes to the left hemisphere, survivors of large right-hemisphere perinatal strokes show an increased reliance on left-hemisphere perisylvian cortex during prosody processing. Findings after right-hemisphere stroke in adulthood are more heterogeneous and suggest a modulation by lesion characteristics, mirroring findings of increased right-hemisphere activation in adults with aphasia subsequent to left-hemisphere stroke. Results will be discussed in terms of functional reorganization after stroke.
RHDBank and acquired language disorders after right hemisphere stroke
Jamila Minga1; 1Duke University School of Medicine
Nearly 50% of strokes in the United States result in right hemisphere damage, yet research on acquired language disorders after stroke focuses primarily on the aphasias that follow left hemisphere damage. Right hemisphere brain damage (RHD) can result in apragmatism, an acquired pragmatic language disorder affecting the ability to understand and produce contextually appropriate language, prosody, gestures, and facial expressions. Unlike survivors with aphasia, survivors with apragmatism can form sentences, produce words, and retain fluency of expression; their language is, however, distinctly abnormal pragmatically. A growing body of research has outlined RHD communication features using the RHDBank, an internationally shared database for the study of discourse-level language use after right hemisphere stroke. In this presentation, we highlight the importance of a diagnostic label and characterization of language production after right hemisphere stroke, using existing literature and video-recorded RHDBank data to stress the clinical relevance of RHD language disorders and the need for greater empirical and educational consideration.