How Minds Connect: Semantic Dimensions Drive Differential Patterns of Neural Synchronization
Poster Session C, Saturday, September 13, 11:00 am - 12:30 pm, Field House
Ximing Shao1, Chen Hong1, Gangyi Feng1; 1Chinese University of Hong Kong
Communication underpins human social interactions, allowing thoughts to flow between minds. Each utterance carries rich contextual semantic information across sensorimotor, emotional, intentional, and attitudinal dimensions. Sharing these representations is believed to be supported by inter-brain neural synchronization (NS), the temporal alignment of neural activity between speakers and listeners. We propose that effective communication relies on the exchange of multiple semantic dimensions, each influencing NS differently over space and time. Grounded in embodied-cognition theory and the social-brain hypothesis, we predicted that perceptual, emotional, and socially relevant semantics would strongly modulate NS, given their evolutionary importance. We conducted a pseudo-hyperscanning MEG experiment using a storytelling-listening paradigm involving a native Mandarin speaker and 22 listeners. Ten semantic dimensions (perception, motor, socialness, emotion, time, space, animacy, causality, drive, and attention) were rated for each content word (406 nouns and 360 verbs) by independent raters. We employed a novel speaker-listener predictive encoding modeling approach to estimate each dimension's contribution to NS while controlling for acoustic and phonetic features. Of the ten dimensions tested, four emerged as strong NS drivers: Perception, Emotion, Causality, and Attention. Their effects spanned language and association networks, including the auditory cortex, mid-temporal areas, inferior parietal lobule, inferior frontal gyrus, and pre-/post-central gyri. Two main patterns emerged: a rapid speaker-leading window (~-260 ms) across all region pairs, and a bidirectional pattern with speaker leads (-1000 ms, -250 ms) and listener leads (+200 ms) connecting frontocentral and temporoparietal networks.
Dimension-specific profiles were also evident: Emotion showed speaker leads confined to bilateral frontal region pairs, whereas Causality and Attention exhibited distinctive bidirectional speaker-listener patterns. Classification analyses confirmed that these shared speaker-listener semantic features were represented through partly segregated neural "communication channels". These findings support theories that ground human communication in embodied and socially adaptive representations, which leverage distinct pathways and timings to transfer information between minds.
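The speaker-listener predictive encoding logic described above can be illustrated with a minimal sketch: fit a regression model that predicts a listener's neural time course from the speaker's word-level feature time courses at a range of temporal lags, and take the lag with the best prediction as the speaker-lead/listener-lead estimate. Everything below is a simplified, simulated illustration, not the authors' pipeline: the data are synthetic, the model is a closed-form in-sample ridge regression rather than a cross-validated MEG encoding model, and names like `speaker_feats` and `lagged_encoding_accuracy` are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated data: speaker feature time courses (e.g., ratings
# for Perception, Emotion, Causality, Attention plus acoustic controls)
# and one listener MEG source time course.
n_times, n_feat = 500, 6
speaker_feats = rng.standard_normal((n_times, n_feat))

# Simulate a listener signal that tracks the first speaker feature with a
# speaker lead of 2 samples (standing in for the ~-260 ms window), plus noise.
lag_true = 2
listener = np.roll(speaker_feats[:, 0], lag_true) + 0.5 * rng.standard_normal(n_times)

def ridge_encode(X, y, alpha=1.0):
    """Closed-form ridge regression; returns in-sample predictions."""
    X = np.column_stack([np.ones(len(X)), X])  # add intercept column
    w = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)
    return X @ w

def lagged_encoding_accuracy(X, y, lags):
    """Correlate ridge predictions with the listener signal at each
    speaker-to-listener lag (positive lag = speaker leads)."""
    scores = {}
    for lag in lags:
        X_lag = np.roll(X, lag, axis=0)
        # Drop wrap-around samples introduced by np.roll.
        sl = slice(max(lag, 0) + 5, len(y) - abs(min(lag, 0)) - 5)
        y_hat = ridge_encode(X_lag[sl], y[sl])
        scores[lag] = np.corrcoef(y_hat, y[sl])[0, 1]
    return scores

scores = lagged_encoding_accuracy(speaker_feats, listener, lags=range(-4, 5))
best_lag = max(scores, key=scores.get)
print(best_lag)  # recovers the simulated speaker-lead lag
```

In the real analysis this lag profile would be computed per semantic dimension and per region pair, with acoustic/phonetic regressors partialled out, which is what yields the dimension-specific speaker-lead and listener-lead windows reported above.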
Topic Areas: Speech Perception, Computational Approaches