Auditory experiences impact semantic processing
Poster Session D, Saturday, September 13, 5:00 - 6:30 pm, Field House
Benjamin Miller-Mills1, Sofronia M. Ringold1, Tom Williamson3, Lloyd May2, Caitlin Nguyen1, Nicole Caballa1, Grace Olado1, Assal Habibi1, Ray L. Goldsworthy1, Lisa Aziz-Zadeh1; 1University of Southern California, 2Stanford University, 3University of the West of England
Prior studies indicate that visual imagery ability interacts with semantic processing. However, similar studies considering auditory imagery are rare, as are studies focusing on auditory expertise (such as musical ability). Here we examine the relationship between individual differences in auditory imagery and musical experience and the ability to comprehend novel and conventional sound-related metaphors, as well as sound-word associations. Participants (N=76; age 18-40; 51 female) completed online surveys, including the Bucknell Auditory Imagery Scale (BAIS) and the Frequency of Musical Perception subscale from the Music-Related Quality of Life questionnaire (MuRQoL). Participants also completed a novel word-association scale, in which they rated the perceived loudness of single words presented in isolation (e.g., “banana”, “stomp”, “love”) on a 7-point Likert scale. Finally, participants completed a multiple-choice questionnaire in which they identified the meaning of novel and conventional sound-related metaphors, as well as control (color-related) metaphors. Statistical relationships were assessed using Pearson correlation coefficients and Student’s t-tests. Overall, loudness associations for emotion words (Mean=5.76±1.02) were rated significantly louder than those for color words (Mean=2.83±0.92, p<.001), and these emotion-word ratings did not differ significantly from those for prototypically loud sound-related words (e.g., “roar”, “thunder”; p=.56). As predicted, the perceived loudness of sound-related words was also related to individual differences in auditory imagery, as indicated by a significant positive correlation with Vividness scores from the BAIS: individuals with more vivid auditory imagery reported louder sound-related word associations (r(74)=.34, p=.003). Further, there was a significant positive correlation between accuracy on novel sound-related metaphors and the vividness of auditory imagery: individuals with more vivid auditory imagery were more accurate on novel sound-related metaphors (r(74)=.26, p=.02), but not on conventional sound-related metaphors. Similarly, Musical Perception Frequency scores correlated positively with novel sound-related metaphor accuracy (r(74)=.26, p=.02), but not with conventional sound-related metaphor accuracy (p=.39). These data indicate that individual differences in auditory imagery and auditory expertise (musical perception) influence one’s ability to process both novel auditory metaphors and individual sound-word associations. These data are consistent with the view that one’s sensory experiences (both lived and imagined) shape language processing. We are currently collecting data from hearing-impaired and musician populations to better understand the effects of experience on semantic processing. Going forward, this behavioral paradigm may aid in identifying cross-modal neural correlates involved in processing sound-related language by providing salient stimuli for future MRI and EEG studies.
Topic Areas: Multisensory or Sensorimotor Integration, Meaning: Lexical Semantics
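
For readers who want to see the shape of the analyses reported above, a minimal sketch follows. It is not the authors' actual pipeline: the column names, the use of pandas, and the choice of a paired t-test for the emotion-word vs. color-word comparison are all assumptions made for illustration only.

    # Minimal sketch of the reported analyses (assumed layout, not the authors' code).
    # Assumes per-participant scores live in a pandas DataFrame `df` with hypothetical
    # columns: 'bais_vividness', 'loudness_sound_words', 'loudness_emotion_words',
    # 'loudness_color_words', 'novel_metaphor_acc', 'murqol_perception_freq'.
    import pandas as pd
    from scipy import stats

    def run_analyses(df: pd.DataFrame) -> None:
        # Paired t-test: emotion-word vs. color-word loudness ratings
        # (assumes both ratings come from the same participants).
        t, p = stats.ttest_rel(df['loudness_emotion_words'],
                               df['loudness_color_words'])
        print(f"emotion vs. color loudness: t={t:.2f}, p={p:.3g}")

        # Pearson correlation: BAIS Vividness vs. sound-word loudness ratings.
        r, p = stats.pearsonr(df['bais_vividness'], df['loudness_sound_words'])
        print(f"BAIS vividness ~ sound-word loudness: r={r:.2f}, p={p:.3g}")

        # Pearson correlation: BAIS Vividness vs. novel-metaphor accuracy.
        r, p = stats.pearsonr(df['bais_vividness'], df['novel_metaphor_acc'])
        print(f"BAIS vividness ~ novel metaphor accuracy: r={r:.2f}, p={p:.3g}")

        # Pearson correlation: MuRQoL perception frequency vs. novel-metaphor accuracy.
        r, p = stats.pearsonr(df['murqol_perception_freq'], df['novel_metaphor_acc'])
        print(f"MuRQoL frequency ~ novel metaphor accuracy: r={r:.2f}, p={p:.3g}")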