The influence of American Sign Language on neural dynamics and spatial cognition
Poster Session E, Sunday, September 14, 11:00 am - 12:30 pm, Field House
This poster is part of the Sandbox Series.
Melody Schwenk1, Lorna C. Quandt1; 1Gallaudet University
Through repeated practice, a visual-spatial language such as American Sign Language (ASL) may reshape non-linguistic spatial cognition and its neural substrates. Prior research shows fluent signers outperforming non-signers on spatial tasks such as mental rotation, perspective-taking, and spatial attention, suggesting experience-dependent neuroplasticity. Previous EEG studies have reported preliminary evidence linking ASL fluency to increased mu-rhythm (8-13 Hz) suppression over sensorimotor areas. Such findings point to differences in motor processing across fluency gradients, possibly reflecting the embodied motor processing inherent in sign language. However, the behavioral and neural mechanisms driving these changes remain underexplored, particularly in people with diverse linguistic experiences. We propose a 2.5-hour behavioral and EEG study to examine more clearly the role of ASL fluency, measured via the ASL Comprehension Test (ASLCT), in spatial cognition and its underlying neural mechanisms. Specifically, we hypothesize that: (1) greater ASL fluency will predict faster, more efficient performance on a 2D mental rotation task, reflecting a stronger ability to envision and navigate one's environment; (2) greater ASL proficiency will correlate with stronger mu suppression as a result of practice effects; (3) fluent signers will make fewer perspective-taking errors during a spatial navigation EEG paradigm than non-fluent signers and non-signers, owing to the automaticity of perspective switching in daily communication; and (4) greater ASL proficiency will be associated with increased frontal-parietal phase locking in the theta and alpha bands, suggesting stronger spatial network integration.
Sixty-five adults stratified by hearing status and ASL fluency (Deaf fluent, Deaf non-fluent, Hearing fluent, Hearing non-fluent, Hearing non-signers) will complete a computerized 2D mental rotation task (140 trials), the Virtual SILCton navigation task, and a 2×2 (Perspective × Difficulty) EEG navigation paradigm (135 trials across nine blocks). Planned analyses include mixed-effects ANOVAs, hierarchical regressions controlling for age and English proficiency, moderation by hearing status, and bootstrap mediation of ASL proficiency's effects on network-behavior relationships. Data collection will begin in Summer 2025, with preliminary results to be presented at the SNL conference. Findings will clarify how ASL's spatial grammar may scaffold mental transformations and reorganize cortical networks, informing linguistic relativity and neuroplasticity theories.
Topic Areas: Signed Language and Gesture, Multisensory or Sensorimotor Integration