Rethinking ERPs: Does sensory and linguistic experience shape ERP signatures in deaf signers?
Poster Session A, Friday, September 12, 11:00 am - 12:30 pm, Field House
Zed Sehyr (1), Erin E. Campbell (2); (1) Communication Sciences and Disorders, Chapman University; (2) Wheelock College of Education & Human Development, Boston University
To what extent is visual and cognitive processing shaped by the languages we use and the modality through which those languages are experienced? Deaf signers offer a valuable test case for this question. Research has shown that deaf individuals exhibit distinct visual attention allocation, enhanced peripheral motion detection, and altered face-processing strategies compared to hearing individuals (Alencar, Butler, & Lomber, 2019; Bavelier, Dye, & Hauser, 2006; Bottari et al., 2010). These differences suggest that core visual and cognitive processes may be fundamentally reorganized through experience with a visual-manual language and in the absence of auditory input. Canonical event-related potential (ERP) components, such as the P1, N170, N2pc, N400, P3, LRP, and ERN, have been used extensively to index key aspects of cognition, including early sensory encoding, attentional selection, semantic integration, and performance monitoring (e.g., Olivares et al., 2015; Kutas & Federmeier, 2011). However, little is known about how these components manifest in deaf signers, whose neural architecture is shaped by both deafness and lifelong use of American Sign Language (ASL). This study investigates whether the temporal dynamics and scalp distributions of these components are modulated by sensory and language experience. In a pre-registered, ongoing study, we employ the ERP CORE battery (Kappenman et al., 2021), a standardized set of paradigms designed to elicit well-validated ERP components with high signal quality. Participants complete six core tasks: face perception (N170), visual search (N2pc), visual oddball detection (P3), semantic word-pair judgment (N400), simple response selection (LRP), and flanker-induced error monitoring (ERN). We supplement the battery with a novel ASL sign-pair semantic judgment task designed to elicit the N400 response to semantically related and unrelated ASL signs. By comparing semantic processing across English and ASL within the same bimodal bilingual individuals, we examine how language modality shapes the neural correlates of meaning. Participants to date include 5 adult deaf signers and 13 hearing non-signers, with a target sample size of 40 per group. All participants completed standardized written language and cognitive measures, and deaf participants additionally completed ASL proficiency measures, allowing us to identify effects of group, language proficiency, and age of language exposure on the ERP signatures. Preliminary data reveal subtle but theoretically meaningful group differences. Deaf participants show a more bilaterally distributed N170 response to visual objects such as faces, in contrast to the right-lateralized N170 observed in hearing participants at parieto-occipital sites. For the N400, both groups demonstrate robust effects of semantic relatedness to written words; however, deaf participants exhibit a slightly delayed and more posteriorly distributed N400 compared to their hearing peers. These emerging patterns suggest that sensory and language experience shape the timing and topography of core ERP components, raising critical questions about the modality neutrality of neural markers commonly used to index basic visual and cognitive processes.
Topic Areas: Multisensory or Sensorimotor Integration, Signed Language and Gesture
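For readers unfamiliar with how relatedness effects such as the N400 described above are typically quantified, the sketch below shows one common approach: averaging epochs by condition, forming an unrelated-minus-related difference wave, and taking the mean amplitude over centro-parietal electrodes in a 300-500 ms window. This is an illustrative MNE-Python example, not the authors' pre-registered pipeline; the file name, event labels, electrode picks, and time window are assumptions for demonstration only.

    # Illustrative sketch: quantifying an N400 relatedness effect with MNE-Python.
    # NOT the authors' pipeline -- file name, event labels, channel picks, and the
    # 300-500 ms window are assumed here purely for demonstration.
    import mne

    # Load pre-cleaned, epoched EEG for the word-pair task (hypothetical file).
    epochs = mne.read_epochs("sub-01_wordpair-epo.fif")

    # Average trials within each relatedness condition (assumed event labels).
    related = epochs["related"].average()
    unrelated = epochs["unrelated"].average()

    # N400 effect as the unrelated-minus-related difference wave.
    diff = mne.combine_evoked([unrelated, related], weights=[1, -1])

    # Mean amplitude over centro-parietal sites in a typical N400 window.
    n400 = diff.copy().pick(["Cz", "CPz", "Pz"]).crop(tmin=0.3, tmax=0.5)
    print(f"N400 effect: {n400.data.mean() * 1e6:.2f} microvolts")

Group comparisons of timing and topography, like those reported in the abstract, would then contrast such window means (or peak latencies) across participant groups and electrode sites.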