Tracking Learning: Eye-Gaze and Pupillary Responses Reveal How Language and Cognitive Control Influence Visual Statistical Learning
Poster Session D, Saturday, September 13, 5:00 - 6:30 pm, Field House
This poster is part of the Sandbox Series.
Jenna DiStefano1,2, Rachel Foster1,2, Yuko Munakata1,2, David Corina1,2, Katharine Graf Estes1,2; 1Department of Psychology, University of California, Davis, 2Center for Mind and Brain, University of California, Davis
Statistical learning (SL) is a domain-general, implicit learning strategy that children employ to track statistical dependencies in their environments, supporting language learning and understanding of the visual world (Fiser & Aslin, 2002; Saffran et al., 1996). Although SL is thought to be established in infancy and to remain stable, some work argues that access to sound is a prerequisite for SL, since deaf children have been reported to perform worse on SL tasks; this view is central to the auditory scaffolding hypothesis (Conway et al., 2009). Other results have shown that deaf children with native American Sign Language (ASL) experience do not show deficits in SL, providing evidence for the language scaffolding hypothesis (Hall et al., 2017). Eye-tracking, especially anticipatory eye-gaze, offers a promising way to examine this relationship. For example, Monroy et al. (2017) demonstrated that toddlers made predictive eye movements while viewing action sequences, reflecting learning of statistical dependencies. Yet little is known about how a visual language like ASL might influence such behaviors. Another factor potentially impacting SL is cognitive control. Studies have reported mixed findings, with some showing a negative association between cognitive control and SL (Park et al., 2020) and others suggesting a supportive link (Smalle et al., 2022). Pupil dilation response (PDR), a physiological marker of cognitive effort, may help clarify the role of cognitive control in SL (Zhang et al., 2018). This work asks two questions: 1) How does language experience, such as ASL use, shape looking behaviors during SL? 2) What can the PDR reveal about the relationship between SL and cognitive control in deaf children? This study is part of a larger project investigating auditory and visual SL in deaf and hearing children aged 4–7. Children completed four tasks: a visual SL task, an auditory SL task, a cognitive control task, and a receptive vocabulary test.
The present analysis focuses on two of these tasks. The novel visual SL task presented cartoon images in consistent triplet sequences and fixed screen locations. Participants then completed 24 two-alternative forced-choice (2AFC) test trials. Cognitive control was assessed using a child-friendly AX-CPT. We will measure anticipatory eye-gaze as the proportion of fixations to the correct location out of the six possible screen regions during the learning trials. For pupillometric data, we will calculate the PDR as the percentage change in pupil diameter from baseline (Lavin et al., 2014). Data collection is ongoing. Preliminary results show that hearing children (n = 70, mean age = 6.03 years) achieved 58% accuracy on the visual SL test, above chance (50%). Webcam-based eye-tracking data from 13 children revealed increasing anticipatory gazes over time (r = 0.43, p < .05). We predict: 1) Children with early native language experience, including ASL, will show more accurate anticipatory gazes and faster anticipatory gaze latencies in visual SL; 2) Cognitive effort during the AX-CPT, as indexed by increased PDR to the A probe, will be related to decreased learning during SL; 3) ASL users will show a faster decrease in PDR during visual SL, reflecting more efficient processing of spatio-temporal information.
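The two dependent measures described above can be sketched in code. This is a minimal illustration, not the study's analysis pipeline: the function names, the baseline-window convention, and the example values are hypothetical, and real pupillometric data would first require blink interpolation and filtering.

```python
import numpy as np

def pdr_percent_change(pupil_trace, baseline_samples):
    """PDR as percent change in pupil diameter from a pre-stimulus baseline.

    pupil_trace: 1-D array of pupil diameters sampled over one trial.
    baseline_samples: number of initial samples averaged to form the baseline.
    """
    baseline = np.mean(pupil_trace[:baseline_samples])
    return 100.0 * (pupil_trace - baseline) / baseline

def anticipatory_gaze_proportion(fixated_regions, correct_regions):
    """Proportion of learning trials on which the anticipatory fixation
    landed in the correct one of the six possible screen regions."""
    hits = sum(f == c for f, c in zip(fixated_regions, correct_regions))
    return hits / len(correct_regions)

# Hypothetical trial: pupil dilates from ~3.0 mm at baseline to 3.3 mm.
trace = np.array([3.0, 3.0, 3.0, 3.1, 3.2, 3.3])
pdr = pdr_percent_change(trace, baseline_samples=3)   # peak ~10% dilation

# Hypothetical anticipatory fixations over four learning trials.
prop = anticipatory_gaze_proportion([1, 2, 3, 4], [1, 2, 0, 4])  # 0.75
```

Expressing both measures relative to a per-trial baseline (for PDR) and as a proportion of trials (for gaze) keeps them comparable across children despite individual differences in absolute pupil size and trial counts.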
Topic Areas: Language Development/Acquisition, Control, Selection, and Executive Processes