What you see is not always what you get. And that, researchers at The Rockefeller University have discovered, is a good thing.
"Every time you move your eye, the whole world moves on your retina," says Gaby Maimon, head of the Laboratory of Integrative Brain Function. "But you don't perceive an earthquake happening several times a second."
That's because the brain can tell if visual motion is self-generated, canceling out information that would otherwise make us feel--and act--as if the world were whirling around us. It's an astonishing bit of neural computation--one that Maimon and his team are attempting to decode in fruit flies. And the results of their most recent investigations, published in Cell on January 5, provide fresh insights into how the brain processes visual information to control behavior.
Each time you shift your gaze (and you do so several times a second), the brain sends a command to the eyes to move. But a copy of that command is issued internally to the brain's own visual system, as well.
This allows the brain to predict that it is about to receive a flood of visual information resulting from the body's own movement--and to compensate for it by suppressing or enhancing the activity of particular neurons.
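In computational terms, the principle resembles a simple predict-and-subtract step. The sketch below is only a loose illustration with made-up numbers and function names, not the actual neural mechanism the researchers describe: a copy of the eye-movement command is used to predict the retinal motion that movement will cause, and that prediction is subtracted from what the eye actually reports.

```python
# Toy illustration (hypothetical, not the fly's or the brain's real circuit):
# an "efference copy" of a motor command predicts the visual motion that
# command will cause, and the prediction is removed from the retinal signal.

def predicted_visual_motion(eye_command_deg):
    # Moving the eye 10 degrees rightward sweeps the image
    # 10 degrees leftward across the retina (sign flip).
    return -eye_command_deg

def perceived_motion(retinal_motion_deg, eye_command_deg):
    # Cancel the self-generated component; whatever remains is
    # attributed to motion in the outside world.
    return retinal_motion_deg - predicted_visual_motion(eye_command_deg)

# A 10-degree gaze shift with a stationary world: the retina reports
# -10 degrees of motion, but perception reports ~0 -- no "earthquake".
print(perceived_motion(retinal_motion_deg=-10.0, eye_command_deg=10.0))  # 0.0

# The same gaze shift while the world really moves +3 degrees:
# only the external motion survives the cancellation.
print(perceived_motion(retinal_motion_deg=-7.0, eye_command_deg=10.0))   # 3.0
```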
The human brain, however, contains approximately 80 billion neurons, complicating the task of determining precisely how it makes such predictions and alters our perception at the cellular level.
Fortunately, the common fruit fly performs the same kinds of rapid eye movements. The mere 100,000 neurons in its poppy-seed-sized brain must therefore handle the same problems of prediction and perception--but at a scale that Maimon and his colleagues, research associate Anmo Kim and postdoctoral fellow Lisa Fenk, can study in intimate detail.
There are differences between humans and flies, of course. For one thing, a fly's eyes are bolted to its head. To shift its gaze, it must therefore maneuver like a tiny airplane. And like an airplane, it can rotate around multiple axes, including yaw and roll.
Yet its brain still manages to distinguish between expected and unexpected visual motion.
When a gust of wind unexpectedly blows a fly off course, for example, a powerful reflex known as the optomotor response causes the insect's head to rotate in the opposite direction, snapping its eyes back toward their original target. The fly also stabilizes its flight path by using its wings to execute a counter-turn.
If a fly intentionally turns to shift its gaze, however, something different occurs. The urge to rotate its head and body back toward the original flight direction is somehow suppressed. Otherwise, it would never be able to shift its gaze at all.
But how does a brain with such limited horsepower finesse such a complex problem?
In a previous study, Kim and Maimon demonstrated that two groups of motion-sensitive neurons in the fly's visual system are suppressed during rapid intentional turns, inhibiting the insect's behavioral responses.
In the Cell study, Kim, Fenk and Maimon showed that one of these sets of neurons stabilizes the head during flight turns. And they determined how it does so by measuring the electrical activity in individual neurons and filming the motions of the flies' heads and wings as they turned on purpose--or were tricked into believing that they had turned by accident. (In some of the experiments, the flies were glued to a minuscule platform and shown images on an LED screen that deceived them into thinking that their gaze had shifted unintentionally.)
Each of the neurons in question could respond to visual motion around several axes. Some were more sensitive to yaw, however, and others to roll.
And that's where things got interesting.
During intentional turns, each neuron received a signal that was carefully calibrated to suppress sensitivity to visual motion along the yaw axis alone.
Neurons that were more sensitive to yaw got a stronger countervailing signal. Neurons that were less sensitive got a weaker one. Sensitivity to roll, meanwhile, was left unimpaired.
As Maimon explains, this makes sense because flies must first roll and then counter-roll to properly execute intentional turns. If they were to counter-yaw, however, they would never be able to head off in a new direction.
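To make the idea concrete, here is a toy numerical sketch of that selective cancellation. The sensitivities and numbers are invented for illustration, not taken from the team's recordings: each model neuron's response is a weighted mix of yaw and roll motion, and during an intentional turn a suppressive input scaled to that neuron's own yaw sensitivity removes only the yaw-driven part of its response.

```python
# Toy sketch (hypothetical values, not recorded data): suppression matched
# neuron-by-neuron to yaw sensitivity cancels the yaw component of the
# population response while leaving roll sensitivity untouched.
import numpy as np

rng = np.random.default_rng(0)
n_neurons = 5
yaw_sensitivity = rng.uniform(0.2, 1.0, n_neurons)   # gain to yaw motion
roll_sensitivity = rng.uniform(0.2, 1.0, n_neurons)  # gain to roll motion

def response(yaw, roll, turning):
    visual_drive = yaw_sensitivity * yaw + roll_sensitivity * roll
    if turning:
        # Strongly yaw-tuned neurons receive a stronger countervailing
        # signal; weakly yaw-tuned neurons receive a weaker one.
        visual_drive -= yaw_sensitivity * yaw
    return visual_drive

yaw, roll = 1.0, 0.5
print(response(yaw, roll, turning=False))  # mix of yaw- and roll-driven activity
print(response(yaw, roll, turning=True))   # only roll_sensitivity * roll remains
```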
The neural silencing process described by the researchers therefore left the flies selectively blind to visual information that would otherwise have interfered with their ability to turn--a feat of neural computation that Maimon likens to tuning out the sound of a single instrument in an entire band.
It's the first illustration of how brains can subtract just one component of a complex sensory signal carried by an entire population of neurons while leaving other signals in the same population untouched. And it provides a blueprint for understanding how the brains of larger creatures might manage the same kinds of problems.
For while the details of how the brain modulates visual perception might differ in animals whose skulls are packed with more neurons, says Maimon, "we would expect to see similar processes in mammalian brains--including our own."
###