How do iconicity and embodied cognition interact during sign processing?
Poster Session E, Sunday, September 14, 11:00 am - 12:30 pm, Field House
This poster is part of the Sandbox Series.
Meghan E. McGarry1, Lorna C. Quandt1; 1Gallaudet University
Most deaf/hard-of-hearing (DHH) children are born to hearing parents who do not know sign language, though these caregivers may wish to learn in order to provide their child with accessible communication. In fact, DHH children who receive accessible sign language input are protected from the harmful academic and health effects of linguistic deprivation. However, caregivers who wish to learn sign language require support to effectively and quickly acquire a new language (in a new modality). One difference between signed and spoken language is the use of the visual-motor modality, which facilitates a mapping between the form of a sign and its meaning. Called iconicity, this mapping allows visual elements of the world to be represented in a way that isn't possible in spoken language, and some research has suggested a facilitatory effect of iconicity on both first- and second-language vocabulary acquisition. One potential avenue for this facilitation is embodied cognition, as research shows that both child and adult language processing involves accessing stored knowledge from past sensory-motoric experiences. Motorically-iconic signs have an iconic mapping that depicts the signer manipulating the referent (e.g., the ASL sign HAMMER depicts the signer holding and swinging a hammer). This visual-motor mapping could facilitate signers' access to embodied cognition. Though caregivers of DHH children may not know how to sign, they already have these stored sensory-motoric experiences, which they could tap into during language learning and as they explore the visual-motor modality. In this study, we will explore whether signers engage their stored motor experiences when producing and comprehending motorically-iconic signs, using a production (picture-naming) task and a comprehension (sign-identifying) task. Behavioral and EEG results will be compared between a set of motorically-iconic and non-iconic signs.
EEG will be used for event-related potential (N400 component) and time-frequency analysis of the mu rhythm. The N400 is an ERP component that indexes semantic processing; if motoric iconicity facilitates sign retrieval, we expect smaller N400s for motorically-iconic signs, indicating reduced effort during sign retrieval. Mu rhythm desynchronization occurs during embodied engagement of the sensorimotor cortex. If signers access their past experiences during sign processing, we expect greater desynchronization for motorically-iconic signs. We do not expect to see effects for non-signers performing the same tasks in English, as there is no difference in iconicity between the English translations. Ultimately, we will conduct the same experiments with caregivers of DHH children. These caregivers will be taught the same ASL signs (but will otherwise have no knowledge of ASL). If we find results similar to those for the signers, these findings would suggest that the caregivers integrated iconicity and embodied cognition to support their learning, and that motorically-iconic signs may be a starting place when caregivers first acquire language in the visual-motor modality. As sign acquisition is the best protection from linguistic deprivation for DHH children, interventions that support caregiver learning ultimately improve communication with the DHH child, providing a foundation for language development and protection against the lifelong effects of linguistic deprivation.
Topic Areas: Signed Language and Gesture, Language Development/Acquisition