Technique enables computers to find visually similar images

Researchers at Carnegie Mellon University believe computers can mimic the human ability to find visually similar images by using a technique that analyses image uniqueness.

The team — led by Alexei Efros, associate professor of computer science and robotics, and Abhinav Gupta, assistant research professor of robotics — found that its technique works on a number of visual tasks that normally confuse computers, such as matching sketches of cars with photographs of cars.

Most computerised methods for matching images focus on similarities in shapes, colours and composition. However, these methods can fail when photographs are taken in different seasons or under different lighting conditions. In addition, images from different domains, such as photographs, colour paintings and black-and-white sketches, can prove difficult to match with one another.

The uniqueness of each image was determined by comparing it with a very large data set of randomly selected images. The team described an image's unique features as those that best discriminate it from the rest of the random images.
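The article does not spell out the algorithm, but one way to realise "features that best discriminate one image from the rest" is to train a linear classifier with the query image as the sole positive example and the pool of random images as negatives, then use the learned weights to score candidate matches. The sketch below is a minimal illustration under that assumption; the feature representation, classifier choice, parameters and function names are illustrative, not taken from the researchers' work.

```python
import numpy as np
from sklearn.svm import LinearSVC

def learn_uniqueness_weights(query_feat, random_feats, C=0.1):
    """Learn a weight vector that emphasises whatever distinguishes the
    query image from a large pool of random images (one positive example
    versus many negatives)."""
    X = np.vstack([query_feat[None, :], random_feats])
    y = np.concatenate([[1], np.zeros(len(random_feats))])
    clf = LinearSVC(C=C, class_weight={1: 50.0, 0: 0.01}, max_iter=10000)
    clf.fit(X, y)
    return clf.coef_.ravel()

def rank_by_similarity(weights, candidate_feats):
    """Rank candidates by how strongly they express the query's
    discriminative features; earlier indices are more similar."""
    scores = candidate_feats @ weights
    return np.argsort(-scores)

# Toy example: random vectors stand in for real image descriptors
# (e.g. gradient-based features computed over each image).
rng = np.random.default_rng(0)
query = rng.normal(size=512)
random_pool = rng.normal(size=(5000, 512))
candidates = rng.normal(size=(100, 512))

w = learn_uniqueness_weights(query, random_pool)
print(rank_by_similarity(w, candidates)[:5])  # five closest candidates
```

Because the weights downplay features that are common across the random pool, elements shared by most images (such as a plain sky) contribute little to the similarity score.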

By using the technique, the team believes it may be possible in many cases to match historic photos with an existing online photo. Similarly, it thinks the method could be combined with large GPS-tagged photo collections to determine where a landmark painting was made.
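As a rough illustration of the geolocation idea, the sketch below scores a GPS-tagged photo collection with a uniqueness weight vector of the kind learned above and returns the coordinates of the best-matching photo. The data layout, field names and the `locate_painting` helper are hypothetical.

```python
import numpy as np

def locate_painting(weights, tagged_feats, tagged_gps):
    """Return the (lat, lon) of the GPS-tagged photo that scores highest
    under the uniqueness-weighted similarity."""
    scores = tagged_feats @ weights      # one similarity score per tagged photo
    best = int(np.argmax(scores))        # index of the closest visual match
    return tagged_gps[best]

# Toy data: a stand-in weight vector and random descriptors for 1,000 photos.
rng = np.random.default_rng(1)
weights = rng.normal(size=512)           # would come from the painting's uniqueness model
tagged_feats = rng.normal(size=(1000, 512))
tagged_gps = [(40.44 + i * 1e-4, -79.94) for i in range(1000)]

print(locate_painting(weights, tagged_feats, tagged_gps))
```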

‘The language of a painting is different than the language of a photograph,’ Efros said in a statement. ‘Most computer methods latch onto the language, not on what’s being said.’

One problem, Gupta said, is that many images have strong elements, such as a cloud-filled sky, that may have superficial similarities to other images but really only distract from what makes the image interesting to people. He and his collaborators hypothesised that it is instead the unique aspects of an image, relative to the other images being analysed, that set it apart, and that those elements should be used to match it with similar images.

On the pixel level, a photo of a garden statue in the summer or autumn will look very different to the same statue photographed in winter, said Abhinav Shrivastava, a master's degree student in robotics and first author of the paper describing the research. But the unique aspects of the statue will carry over from a summer image to a winter image, or from a colour photo to a sketch.