A team of engineers and psychologists at the University of Texas at Austin has received $1.2 million from the National Science Foundation to develop a visual search system capable of finding objects in cluttered environments.
Dr. Alan C. Bovik, a professor of electrical engineering, and the project’s other principal investigators, Drs. Larry Cormack, Bill Geisler and Eyal Seidemann, all professors of psychology at The University of Texas at Austin, will study the strategies humans use when searching for objects in order to understand how the visual system directs the eye.
The group will then develop mathematical algorithms that allow computer vision systems to perform visual search much as humans do.
One part of their research will involve the development of a camera gaze control device. Attached to a camera, the device would ‘look around’ as a human does, mimicking the human visual system, in which the brain uses information gleaned from peripheral vision to decide where the eye should look next.
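The article does not describe the team’s algorithms, but the strategy it outlines, using a coarse assessment of the whole scene to choose the next point of fixation, can be illustrated with a minimal sketch. The sketch assumes a precomputed saliency map (a hypothetical per-pixel score of visual interest) and adds a simple ‘inhibition of return’ so the simulated gaze does not revisit a neighborhood it has just examined; none of these names come from the project itself.

```python
import numpy as np

def gaze_scan(saliency, n_fixations=5, radius=1):
    """Sequentially fixate the most salient locations in a saliency map.

    saliency    -- 2-D array of hypothetical per-pixel interest scores
    n_fixations -- how many gaze shifts to simulate
    radius      -- half-width of the neighborhood suppressed after a visit
    Returns the ordered list of (row, col) fixation points.
    """
    h, w = saliency.shape
    inhibited = np.zeros((h, w), dtype=bool)  # recently visited regions
    path = []
    for _ in range(n_fixations):
        # Mask out visited regions, then jump to the most salient point.
        scores = np.where(inhibited, -np.inf, saliency)
        y, x = np.unravel_index(np.argmax(scores), scores.shape)
        path.append((int(y), int(x)))
        # Inhibition of return: suppress the neighborhood just examined.
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        inhibited[y0:y1, x0:x1] = True
    return path
```

A real gaze controller would recompute the saliency map after each fixation, since peripheral (low-resolution) information changes as the camera moves; this sketch only captures the select-fixate-inhibit loop.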
Bovik sees the research having promising applications in medical diagnostics. He notes, for example, that most doctors miss about 10 percent of breast tumours appearing on a mammogram. However, a few physicians consistently identify the most elusive tumours.
By documenting the methods used by these elite few, he hopes to create a machine to serve as a ‘physician’s assistant’ that scans mammograms first and then presents its findings to the doctor as an additional cue.
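The ‘physician’s assistant’ workflow described above, in which a machine pre-scans an image and cues the doctor rather than diagnosing, can be sketched as follows. The sketch assumes a hypothetical per-pixel suspicion score produced by some upstream detector; the function name, threshold, and score map are all illustrative, not part of the project.

```python
import numpy as np

def flag_suspicious_regions(score_map, threshold=0.8):
    """Return locations whose suspicion score meets the threshold.

    score_map -- 2-D array of hypothetical per-pixel tumour-likelihood
                 scores in [0, 1] from an assumed upstream detector
    threshold -- minimum score at which a location is cued
    Returns (row, col) cues ordered from most to least suspicious, to be
    shown to the radiologist as prompts, not as diagnoses.
    """
    ys, xs = np.nonzero(score_map >= threshold)
    # Present the strongest cues first.
    order = np.argsort(-score_map[ys, xs])
    return [(int(y), int(x)) for y, x in zip(ys[order], xs[order])]
```

The design point is that the machine acts as a second reader: it never suppresses the image, it only marks regions the radiologist should examine, which matches the ‘additional cue’ role described in the article.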