News Release

Brainy cameras

Peer-Reviewed Publication

Office of Naval Research

In about half a second, the human brain (specifically the superior colliculus) analyzes its current environment and decides whether something is worth noticing. Exactly how the brain does this is still somewhat of a mystery, but we do know that the more sensory input it receives, the more likely it is to pay attention. (For example, in a crowd, if you wave at someone, he may or may not notice you; but if you wave and shout, chances are better that he'll pay attention.)
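The wave-and-shout effect can be sketched in a few lines of code. This is our own toy illustration of multisensory enhancement, not the researchers' model: each cue contributes evidence, and two weak cues together can cross an attention threshold that neither crosses alone. The numbers are invented for the example.

```python
# Toy model of multisensory enhancement: attention triggers when the
# combined evidence from sight and sound crosses a threshold, so a weak
# wave plus a weak shout succeeds where either alone fails.

def attend(visual: float, auditory: float, threshold: float = 1.0) -> bool:
    """Return True if combined sensory evidence exceeds the threshold."""
    return visual + auditory >= threshold

# A wave alone (0.6) is below threshold; a wave plus a shout (1.2) is not.
```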

Researchers are hard at work building computer programs that function the same way, and a camera now in development uses a computer simulation of this specific brain process and comes close to mimicking it.

Funded by the Office of Naval Research, researchers at the University of Illinois have built a movable video camera that is aimed at targets detected by a stationary video camera that watches for motion, and a microphone pair that listens for sound.

These in turn are linked to a standard desktop computer that has been programmed with a simulated neural network. This neural network mimics how the human brain's superior colliculus does its mental mapping, using sound and sight together to put it all in perspective.
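In broad strokes, the fusion step described above can be sketched as follows. This is a hedged guess at the idea, not the University of Illinois code: both sensors vote on a direction, the simulated map sums the votes, and the movable camera aims at the peak. The bin size, weights, and sensor readings are all illustrative assumptions.

```python
import numpy as np

N_BINS = 36  # azimuth map covering 360 degrees, 10 degrees per bin

def saliency_map(motion_azimuth_deg, sound_azimuth_deg,
                 w_motion=0.5, w_sound=0.5):
    """Combine visual-motion and sound-localization cues into one map."""
    sal = np.zeros(N_BINS)
    if motion_azimuth_deg is not None:
        sal[int(motion_azimuth_deg // 10) % N_BINS] += w_motion
    if sound_azimuth_deg is not None:
        sal[int(sound_azimuth_deg // 10) % N_BINS] += w_sound
    return sal

def aim_camera(sal):
    """Point the movable camera at the centre of the most salient bin."""
    return int(np.argmax(sal)) * 10 + 5

# Motion at ~121 degrees and sound at ~125 degrees land in the same bin,
# so the fused peak there is stronger than either cue alone.
```

The key design point, mirroring the superior colliculus, is that both modalities write into one shared spatial map rather than being handled separately.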

“Gathering sensory input, processing it, and deciding what to do on the basis of that processing is an important part of real brain function,” says researcher Tom Anastasio. “So is learning to make better decisions. Although it is extremely simple, the Self-Aiming Camera operates in a brain-like way.” Theoretically, the system continually ‘learns’ – adjusting the connection weights of its simulated neurons as it gathers more and more data.
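A single step of that kind of learning might look like the following. This is a generic delta-rule weight update, assumed for illustration rather than taken from the researchers' training procedure: after each attempt, the system nudges its trust in each modality toward whichever cues actually predicted the correct target.

```python
# One delta-rule learning step: w <- w + lr * error * input.
# Weights, inputs, learning rate, and error values here are invented.

def update_weights(weights, inputs, error, lr=0.1):
    """Nudge each weight in proportion to its input and the error signal."""
    return [w + lr * error * x for w, x in zip(weights, inputs)]

w = [0.5, 0.5]                                 # initial trust in vision, hearing
w = update_weights(w, [1.0, 0.0], error=0.4)   # vision was informative: boost it
```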

“This ‘learning’ provides the camera with several useful abilities, including discrimination – between a man and an automobile, for example, depending on whether it’s been programmed to look at men or at automobiles,” explains ONR Program Manager Dr. Joel Davis.

The same system could be trained to fuse input from sonar, radar, infrared, mechanical and other detectors, in addition to the audio and video currently in use.
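Extending the fusion to more detectors is conceptually straightforward. A hedged sketch, with modality names and weights invented here: each sensor reports a confidence per direction, a learned per-modality weight scales its vote, and the fused map's strongest direction wins.

```python
# Generic multi-sensor fusion: weighted sum of per-modality confidence
# reports keyed by direction. All sensor names and numbers are illustrative.

def fuse(reports, weights):
    """Return the direction with the highest weighted combined confidence."""
    fused = {}
    for modality, confidences in reports.items():
        w = weights.get(modality, 1.0)
        for direction, conf in confidences.items():
            fused[direction] = fused.get(direction, 0.0) + w * conf
    return max(fused, key=fused.get)

target = fuse(
    {"sonar": {"north": 0.3}, "infrared": {"north": 0.4, "east": 0.5}},
    {"sonar": 1.0, "infrared": 1.0},
)
# Two weak agreeing cues to the north outvote one stronger cue to the east.
```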

###

For more information on the Self-Aiming Camera, or to interview Davis or Anastasio if you are working media, please call Gail Cleere, 703-696-4987, or email cleereg@onr.navy.mil.

