Researchers at San Diego State are studying differences between deaf and hearing people, examining topics such as differences in general brain structure, sign language versus spoken language, and the brain areas involved in each.
Distinguished professor and laboratory director Karen Emmorey and her team of about 15 researchers are currently investigating these differences at the Laboratory for Language and Cognitive Neuroscience through the Clinical and Cognitive Neuroscience initiative.
“The main thing I am interested in is neuroplasticity,” Emmorey said. “How flexible the brain is when looking at sign versus spoken languages. What adaptations are there for sign language? What is invariant?”
She explained that within the left hemisphere of the brain, there are important regions for spoken language located near the auditory cortex, the part of the brain that processes sensory information as sound, as well as near the motor cortex that controls the mouth and tongue.
“But what is interesting is that those same areas are also involved in sign language production and comprehension,” Emmorey said.
Emmorey said the brain activity of deaf and hearing people differs greatly.
“In sign language, when you have to talk about spatial information, when you are talking about where something is, or where you have to find something, you use the space … and that’s really different than prepositions,” Emmorey said.
She said this is when different regions of the brain come into play.
“If you look at the auditory cortex, interestingly enough we are not seeing changes in the grey matter cells … the cortex is just as thick in auditory regions for deaf people as it is in hearing people,” Emmorey said. “We see differences in the white matter, the material that connects auditory cortex to other areas of the brain, the ‘communication highways of the brain,’ and we see a reduction of this material in deaf people compared to hearing people.”
Most of the research is conducted using functional MRI technology through a partnership SDSU has with the University of California, San Diego and its imaging facility. Functional MRI is used to gather data on neural activity during behaviors such as language comprehension. The team visits the facility about once or twice a month, depending on whether the researchers are scanning subjects or at the lab analyzing data.
The researchers are currently starting a few new projects for 2014 and hoping to secure funding.
One project analyzes how skilled deaf readers achieve their high reading abilities and examines their brain structure. Emmorey explained that reading levels differ greatly among deaf people.
“Reading is difficult when you can’t hear and there is a huge range of reading abilities in deaf people from really skilled, college level, to maybe only the fifth grade level in adults,” Emmorey said. “We want to understand that variability, in particular, it may be the case that a skilled deaf reader [brain] may not look like a skilled hearing reader even though they have the same reading level. We want to understand what the target end state is for deaf readers…both what the neural systems that underlie reading are and how they read.”
Another project investigates hearing people who are bilingual in American Sign Language and English. The researchers are interested in the differences between these bimodal bilinguals and spoken-language bilinguals, for example, speakers of Spanish and English. Spoken-language bilinguals use the vocal cords to produce both languages and perceive both auditorily; the same is not true for bimodal bilinguals.
“My hands are used for sign and my tongue for speech and speech is perceived auditorily but sign is perceived visually,” Emmorey said. “How does that change the nature of bilingualism? One of the things that’s interesting is that for spoken language bilinguals, you have to turn one language off in order to use the other one, but bimodal bilinguals don’t have to.”
The researchers have coined the term “code-blend” to describe this ability, in which “a word and a sign can be produced at the same time,” distinguishing it from “code-switch,” the term used when unimodal (spoken-language) bilinguals switch from one language to another, Emmorey said.
She emphasized that the most important aspect of the research is facilitating a broader understanding of sign languages.
“There are a lot of myths about sign languages: that there is a universal sign language, that they are simple pantomimes. Part of our research is simply dispelling those myths. You see that the same brain areas are involved in sign and spoken languages and that deaf people are able to do all kinds of things.”