Cognitive and Linguistic Outcomes in Children with Cochlear Implants: Perspectives on Neural and Environmental Mechanisms.
Saturday, September 13, 3:00 - 5:00 pm, Elstad Auditorium
Organizer: David Corina1; 1University of California, Davis
Presenters: Nabin Koirala, Claire Monroy, David Corina, Shengyun Gu
Congenitally deaf children who receive cochlear implants early in life often make great strides in language acquisition, yet the determinants of successful linguistic competence remain poorly understood, and wide variability in language and cognitive outcomes persists. The core question of what constitutes "best practices" for raising deaf children has a long, contentious history in the Deaf community, in education, and in clinical practice. Community stakeholders and professionals who live, work, and conduct research at this dynamic intersection are often polarized and hold strong, entrenched ideological beliefs. This symposium presents novel research exploring the roles of neural plasticity and language exposure as factors underlying observed outcomes. Against a backdrop of technologically driven audiological interventions, novel tools for structural and neurofunctional assessment of deaf children, and a broader understanding of the determinants of signed and spoken language acquisition (and associated behaviors, such as reading), this symposium aims to provide the audience with a balanced and thoughtful conversation.
Presentations
Neural characteristics influencing language and literacy outcomes in children with cochlear implants.
Nabin Koirala1; 1University of Connecticut, Storrs, USA.
Cochlear implantation in prelingually deafened children has been an effective intervention for language and reading development. However, a substantial subset of children with cochlear implants (CIs) continue to experience difficulties in these domains, necessitating a deeper understanding of the neural mechanisms underlying variability in outcomes. This talk will discuss findings from EEG and fNIRS data in 75 children, focusing on the neural substrates of language and reading abilities in children with CIs. Findings highlight a complex interplay between neural reorganization and language-reading outcomes in children with CIs. Crucially, early intervention and hearing age emerged as critical determinants of reading success, though early auditory deprivation still imposed lasting constraints.
Neural correlates of visual processing in deaf infants
Claire Monroy1; 1Keele University, Staffordshire, U.K.
Children with hearing loss demonstrate atypical performance on tests of nonverbal cognition, such as attention, working memory, and learning. These findings have led scientists to believe that hearing loss negatively affects cognitive development, in addition to causing potential delays in language development. However, the mechanisms underlying these differences remain unclear, and prior research has largely focused on older children rather than infants. In this talk, we will present work investigating early cognitive development in deaf and hearing infants, focusing on visual habituation—a well-established measure of information processing and learning. In a behavioral study, we found that deaf infants habituate more slowly to a visual stimulus than hearing infants, suggesting that differences in core cognitive abilities emerge in infancy. These findings also suggest that cognitive differences between deaf and hearing infants can explain some of the variability in language outcomes in children with hearing loss. However, simply demonstrating a difference in performance does not allow us to infer the underlying brain mechanisms that cause it. In this case, we still do not know whether the difference reflects a deficit or an adaptation by deaf infants to their unusual sensory environment. To address this, we are currently using electroencephalography (EEG) to test competing hypotheses about the mechanism driving the previously observed differences in visual habituation. We will present preliminary findings from this work and discuss how they may shed light on how hearing loss affects cognitive development in young infants.
Use of electrophysiology to assess spoken language processing in children using cochlear implants
David Corina1; 1University of California, Davis
The use of electrophysiological indices, such as the auditory brainstem response (ABR), in the diagnosis of hearing loss in newborns is routine. However, the use of electrophysiological methods to assess spoken language processing in children using cochlear implants (CIs) is rare. We present electrophysiological data from experiments documenting the time course of early auditory sensory processing and lexical recognition in early bilaterally implanted children (n = 70) and typically hearing (TH) controls (n = 66). The protocols include a novel passive speech processing task designed to elicit cortical auditory evoked potentials (CAEPs) and an audio-visual lexical-semantic priming paradigm. Significant between-group differences are observed in the amplitudes of the P1 and N2 CAEPs. While only TH children showed responsivity to phonological information as indexed by a P300 component, largely equivalent results are observed for the later-occurring N400 component, indicating similar lexical-semantic processing in CI-using and TH children. These data demonstrate the utility of ERP techniques in understanding similarities and differences in acoustic and spoken language processing in congenitally deaf CI-using children and may lead to more targeted interventions.
The Family ASL Project
Shengyun Gu1, Diane Lillo-Martin2; 1Occidental College, 2University of Connecticut
The Family ASL project investigates language development in preschool-age deaf and hard-of-hearing children whose hearing parents have chosen to learn American Sign Language and include it in their family's language plan. For those children with cochlear implants, this means that they are developing as bimodal bilinguals, using both spoken English and ASL. We are following their development in both languages and supporting their ASL through the services of an ASL specialist, who meets with them regularly online. There is variability across participants in how much they use ASL or English, but our assessments have revealed that, overall, they are all making progress in both languages. In this presentation, we focus on their phonological development in both ASL and English. For ASL, we use a self-generated elicitation task (ASL Phonological Elicitation Task), with scoring at the feature and parameter levels. For English, we use the Goldman-Fristoe Test of Articulation, with phonemic scoring. Results indicate age-appropriate growth on both measures.
Panel Discussion
Matthew Dye1, Matthew Hall2, Melissa Malzkuhn3; 1Rochester Institute of Technology, 2Temple University, 3Gallaudet University