Using spatial illusion to learn how the brain processes sound

June 16, 1999

EDITORS: A color photograph of U-M scientist Li Xu in the U-M experimental sound chamber is available on request.
ANN ARBOR—Next time your pager starts beeping in a crowded room, try this little experiment in auditory perception. After a few beeps, notice how everyone starts looking around in all directions trying to hear where the noise is coming from. Try the same experiment in a room full of cats and you’ll see the feline version of aural confusion.
People and cats have no problem localizing natural sounds like a snapping twig or rustling leaves, which include a broad spectrum of sound frequencies, according to John C. Middlebrooks, Ph.D., an associate professor of otolaryngology in the University of Michigan Medical School. But our auditory system lacks the ability to pinpoint the location of narrow-band sounds with just a few frequencies, like a beeping pager.
Middlebrooks and his colleagues at the U-M Kresge Hearing Research Institute are taking advantage of this inability to localize narrow-band frequencies in research designed to learn how the brain processes and perceives sound.
“We know that sound is recorded in the firing pattern of neurons in the auditory cortex—the part of the brain that processes electrical signals generated in the inner ear,” Middlebrooks said. “We’re trying to break the code—to understand the rules the brain uses to translate this neural activity into what we hear as sound.”
In a paper published in the June 17 issue of Nature, Middlebrooks and U-M postdoctoral researchers Li Xu, Ph.D., and Shigeto Furukawa, Ph.D., describe how localization errors made by nerve cells in the brains of cats exposed to filtered sounds are consistent with errors made by humans in previous experiments.
In earlier experiments, human volunteers stood in a soundproof room surrounded by 14 loudspeakers and listened to a random series of broad-band and narrow-band tones, which sound something like quiet crickets. Volunteers turned toward each sound's origin, while sensors recorded the orientation of their heads as they did so. Consistently, volunteers listening to narrow-band sounds turned toward locations that differed in predictable ways from the actual loudspeaker positions.
For experiments described in the Nature paper, U-M scientists played the same sounds for anesthetized cats with miniature probes surgically implanted in their auditory cortex. Created at the U-M Center for Neural Communication Technology, these neural probes are the size of a grain of pepper and sensitive enough to record signals from a single nerve cell. Using the microelectrode probes, U-M researchers recorded electrical activity from individual neurons in each cat's auditory cortex as it heard the sounds.
“With the probes, we can record from the neuron directly,” said Xu. “In effect, the neuron tells us where the cat believes the sound is coming from.”
“The auditory systems in humans and cats appear to use the same spectral sound characteristics to determine sound locations,” Middlebrooks said. “We interpret these results as evidence that the firing pattern we see in cat neurons could be a model for brain processes that underlie spatial perception reported by humans exposed to the same sounds.”
If future research confirms that neurons in human brains respond in the same way as those in the brains of cats, Xu added, it will have immediate applications for the development of new implantable hearing devices designed to stimulate the auditory system directly.
The U-M research project is funded by the National Institute on Deafness and Other Communication Disorders of the National Institutes of Health. The U-M Center for Neural Communication Technology is supported by NIH's National Center for Research Resources.
