News Release

Linguist tunes in to pitch processing in brain

Peer-Reviewed Publication

Purdue University

WEST LAFAYETTE, Ind. - More of the brain is busy processing pitch from language and other sounds than previously thought, according to a researcher in neurophonetics at Purdue University.

"By studying brain activity at different stages of processing pitch patterns in tonal languages, we have found that early activity in the brainstem is shaped by a person's language experience, even while the person is asleep, and consequently, we now believe it plays a much greater role in speech perception that we thought before," said linguistics professor Jackson T. Gandour.

Gandour is presenting information from several of his pitch processing studies at the Feb. 16 "Brain Basis of Speech" session during the American Association for the Advancement of Science's annual meeting.

"Everyone has a brainstem, but it's tuned differently depending on what sounds are behaviorally relevant to a person, for example, the sounds of his or her mother tongue," Gandour said.

The brainstem lies early along the auditory pathway, responding about 7-9 milliseconds after the auditory signal enters the ear. This is shortly after pitch processing begins in the cochlea and the auditory nerve, at about 0-2 milliseconds.

"We now know that there are regions of the brain involved in processing the sounds of language that we didn't know about before," he said. "We know even less about how pitch information is analyzed, transformed and represented at different levels of the brain in the translation from sound to meaning. A fuller understanding will give us a better idea what roles the brain regions are playing, and this information could help people with communication disorders or brain injuries."

Gandour collaborated with Purdue auditory electrophysiologist Ananthanarayan Ravi Krishnan on the brainstem studies, which compared electrical activity in young adult speakers of the tonal language Mandarin with that in speakers of English, a non-tonal language. The majority of the world's languages are tone languages, which use inflections of pitch on syllables to signal differences between words. For example, in Mandarin the sound "ma" with a level tone means "mother," a rising tone means "hemp," a falling-rising tone means "horse" and a falling tone means "scold."

"Never did I expect we would find that language experience would shape the way the brainstem works," Gandour said. "The idea is that this sensory signal undergoes a set of transformations that are far more complicated than we originally thought. I feel like we have broken new ground and that we have just begun to go down a new avenue of research."

Gandour also collaborated with Purdue biomedical engineer Thomas Talavage as well as colleagues at the Indiana University Medical Center to apply the functional brain imaging techniques positron emission tomography (PET) and functional magnetic resonance imaging (fMRI) to display blood flow activity at the level of the cerebral cortex.

These data reveal that the melody of speech is processed in neither a single region nor a single hemisphere; instead, it engages multiple areas that form large-scale networks involving both hemispheres.

"And moreover, we find that these networks are not circumscribed to language processes, but instead interact with more general sensory-motor and cognitive process in addition to those associated with language," he said.

Gandour and his colleagues have shown that when the melody of speech is processed, there is a dynamic interplay between the left and right hemispheres of the brain. The processing of pitch information engages neural mechanisms in the brain's right hemisphere, while left-hemisphere regions mediate the processing of linguistic information, he said.

Gandour compares the evolution of his research program on brain and language to that of someone trying to figure out the structure and function of different parts of a house. The view through the attic window shows theories about elements, rules and representations of language, minus the brain. Moving down to the second floor offers the first look at the neurobiology of language. Scientists on this floor assess deficits in patients' language abilities that result from damage to one or the other side of the brain to determine what areas are necessary for normal language functioning.

"While the windows on the second floor are important, it's only when we get to the first floor that we begin to see actual brain activity," Gandour said. "By using brain imaging techniques, we can view activity in both hemispheres simultaneously while subjects are performing language tasks, telling us what areas on either side of the brain participate in language functions in the normal human brain.

"That leaves the cellar, and what do you find in the cellar" In a house, it's fine wine. In a human, it's the midbrain. That's where we tune our fine tones. And just as fine wines take time, so too does it take time for our brain to construct fine tones."

Gandour's studies on neurophonetics span nearly three decades: brain lesion deficits, 1979-2000; functional neuroimaging, 1998-present; and auditory electrophysiology, 2004-present.

###

Note to Journalists: Jackson T. Gandour's talk is part of the "Brain Basis of Speech" session from 10:30 a.m. to noon on Feb. 16 at the annual American Association for the Advancement of Science (AAAS) meeting. This session is at the Hynes Convention Center, Third Level, Room 306. The title of his talk is "Tone Languages: Neural Basis of Pitch Processing from a Tone-Language Perspective." Journalists interested in the 2005 and 2006 journal articles mentioned in this news release can contact Amy Patterson Neubert, Purdue News Service, (765) 494-9723, apatterson@purdue.edu

His research is supported by the National Institutes of Health. He is a professor in the College of Liberal Arts' Department of Speech, Language and Hearing Sciences, and a faculty member in the Program in Linguistics and the Purdue University Interdisciplinary Life Sciences Program.

Related Web sites:
Department of Speech, Language and Hearing Sciences: http://www.cla.purdue.edu/slhs/
Jackson T. Gandour: http://myprofile.cos.com/gandourj59
AAAS: http://www.aaas.org/meetings/Annual_Meeting/

PRESENTATION ABSTRACT

Tone Languages: Neural Basis of Pitch Processing from a Tone-Language Perspective
Jackson T. Gandour

Tone languages represent the majority of spoken languages in the world. Such languages exploit variations in pitch at the syllable level to signal differences in word meaning. Importantly, pitch itself is a multidimensional auditory attribute. Though there is abundant evidence for a role of right hemisphere cortical networks in the processing of pitch information, the question arises as to how linguistically-relevant pitch patterns are processed not only at the cortical level but also at earlier subcortical levels along the auditory pathway. From a spatiotemporal perspective, we have carried out functional neuroimaging (PET, fMRI) and electrophysiological (MMN, FFR) experiments to investigate the influence of selected variables on pitch processing: language type (native, nonnative; tonal, nontonal); stimulus context (speech, nonspeech); cognitive domain (music, language); levels of linguistic representation (syntax: word, sentence; phonology: categories, features); and acoustic dimensions. Our findings show that language-, domain-, category-, feature-, or dimension-specificity of neural circuitry subserving pitch is experience-dependent relative to given stages of processing. For example, pitch analysis at early stages of processing in the brainstem is sensitive to linguistically-relevant dimensions, but not specific to speech contexts or domain of experience. At later stages of processing, the neural circuitry underlying linguistically-relevant pitch is highly sensitive to phonetic features or categorical representations that are differentially weighted depending on a listener's language experience, stimulus characteristics, and task demands. More broadly, these findings support the view that representations of language emerge over time from general sensory-motor and cognitive processes, in addition to those from language, within a framework involving a series of neural computations that apply to representations at different stages of processing.
