Games without frontiers

Researchers are adapting the technology behind computer games for applications in fields from artificial intelligence to healthcare. Jon Excell reports

While the glassy-eyed adolescent may remain the gaming industry's most iconic consumer, there are increasing signs that the technology behind an industry that is worth around $10bn (£6bn) a year globally has vast potential beyond the walls of darkened teenage bedrooms.



Now, advances in both graphics and hardware are fuelling technology development in areas ranging from healthcare to cutting-edge robotics research.



One person at the forefront of this technology transfer is Imperial College's Murray Shanahan, who is using graphics processing technology originally developed for the gaming industry to push out the boundaries of artificial-intelligence (AI) research.



Shanahan, a professor of cognitive robotics, explained that gamers' demands for ever more realistic and impressive graphics have driven a corresponding increase in the number-crunching ability of the graphics processing units that deliver the gaming experience. 'We've had dedicated graphics cards in PCs for a couple of decades — these cards have been getting more and more sophisticated. A lot of that is driven by the demand for ever more sophisticated games and the rendering of very realistic graphics on the screen, which is what gamers want,' he said.



Increasingly, added Shanahan, researchers in other areas are cottoning on to the potential of these units, and what began with a handful of people hacking into gaming systems has blossomed into a separate industry, with manufacturers such as NVIDIA now producing specialised units for applications outside the gaming industry.



'What they're built for is to do very fast processing of numbers in parallel,' he explained. 'So if you want to do the same kind of transformation on a large number of vertices in a picture, then that's what a graphics card is very good at doing. It will take the representation of all those points on a screen and will do a transformation that will enable you to move something from one place to another on your graphics space. That kind of computation is exactly the kind of computation that people want to do for many other kinds of application.'
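
It is worth seeing what that looks like in code. The short sketch below is purely illustrative rather than anything Shanahan's group runs: a CUDA kernel, with invented names such as translateVertices, that shifts a million two-dimensional vertices by the same offset, one GPU thread per point, which is exactly the 'same transformation on a large number of vertices' pattern he describes.

    // Illustrative only: move every vertex in a large buffer by the same offset.
    #include <cstdio>
    #include <cuda_runtime.h>

    struct Vertex { float x, y; };

    // Each thread translates one vertex by (dx, dy).
    __global__ void translateVertices(Vertex* v, int n, float dx, float dy)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) {
            v[i].x += dx;
            v[i].y += dy;
        }
    }

    int main()
    {
        const int n = 1 << 20;                       // one million vertices
        Vertex* d_verts;
        cudaMalloc(&d_verts, n * sizeof(Vertex));
        cudaMemset(d_verts, 0, n * sizeof(Vertex));  // start every point at the origin

        // One thread per vertex; every thread applies the identical shift.
        int threads = 256;
        int blocks = (n + threads - 1) / threads;
        translateVertices<<<blocks, threads>>>(d_verts, n, 5.0f, -2.0f);
        cudaDeviceSynchronize();

        Vertex first;
        cudaMemcpy(&first, d_verts, sizeof(Vertex), cudaMemcpyDeviceToHost);
        printf("first vertex now at (%.1f, %.1f)\n", first.x, first.y);

        cudaFree(d_verts);
        return 0;
    }

Because every vertex receives identical arithmetic, the thousands of cores on a modern graphics card can all work at once, and the same data-parallel pattern carries over to plenty of jobs that have nothing to do with rendering.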



In Shanahan's case, the technology is being used in the development of large-scale neural networks that replicate the way biological brains produce intelligence. 'We're interested in simulating large numbers of neurons,' he said. 'If we ever wanted to make a robot move around using something that was a simulation of a brain, we would need to make millions of neurons work in real time. If you've got a robot that needs to respond to something right in front of it, you want them to work as fast as real neurons.'
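
To make the idea concrete, the sketch below is again illustrative rather than the Imperial team's actual code: it uses a deliberately crude leaky integrate-and-fire model as a stand-in (names such as stepNeurons are invented) and advances 100,000 neurons in parallel, one GPU thread per neuron.

    // Illustrative only: step a large population of simple neurons in parallel.
    #include <cstdio>
    #include <cuda_runtime.h>

    // Advance each neuron's membrane potential by one time step; fire and reset
    // any neuron that crosses the threshold. One thread handles one neuron.
    __global__ void stepNeurons(float* v, const float* drive, int* spiked,
                                int n, float dt, float tau, float threshold)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;

        v[i] += dt * (-v[i] + drive[i]) / tau;   // leaky integration towards the input
        if (v[i] >= threshold) {                 // spike, then reset
            spiked[i] = 1;
            v[i] = 0.0f;
        } else {
            spiked[i] = 0;
        }
    }

    int main()
    {
        const int n = 100000;                    // roughly the scale quoted in the article
        float *d_v, *d_drive;
        int *d_spiked;
        cudaMalloc(&d_v, n * sizeof(float));
        cudaMalloc(&d_drive, n * sizeof(float));
        cudaMalloc(&d_spiked, n * sizeof(int));
        cudaMemset(d_v, 0, n * sizeof(float));

        // Give every neuron the same constant input, purely for illustration.
        float* drive = new float[n];
        for (int i = 0; i < n; ++i) drive[i] = 1.5f;
        cudaMemcpy(d_drive, drive, n * sizeof(float), cudaMemcpyHostToDevice);

        int threads = 256, blocks = (n + threads - 1) / threads;
        for (int step = 0; step < 1000; ++step)  // 1,000 steps of 1 ms: one simulated second
            stepNeurons<<<blocks, threads>>>(d_v, d_drive, d_spiked, n, 1.0f, 20.0f, 1.0f);
        cudaDeviceSynchronize();

        printf("stepped %d neurons through one simulated second\n", n);

        cudaFree(d_v); cudaFree(d_drive); cudaFree(d_spiked);
        delete[] drive;
        return 0;
    }

The neuron model itself is beside the point; what matters is that each update is independent of the others, so the card can advance the whole population together at every time step.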



The scale of the challenge is a sobering reminder of nature's achievements. While human brains have around 100 billion neurons, Shanahan's team is currently up to around 100,000, roughly the size of an ant's brain.



Nevertheless, he believes that biologically inspired AI will bring about the next big advances in machine intelligence.



'Using traditional AI techniques, we've pretty much reached a plateau of intelligence and it's hard to see how we're going to be able to move beyond that. That's the motivation for trying to go back to the way nature has done it and try to replicate the way brains do things,' said Shanahan.



Ultimately, he suggested that the lessons learned using technology borrowed from the graphics industry could find their way back into the world of gaming. 'The gaming industry itself is very interested in AI because it wants to give artificial players increasing intelligence,' said Shanahan. 'Already, it uses quite a lot of old-fashioned traditional AI techniques to do that, but, potentially, you can see that it would be interested in endowing them with more intelligence using the kind of technology that we're developing using the technology that it originally developed.'



Elsewhere, the flow of technology between the gaming industry and other sectors is more explicit.



One of Shanahan's Imperial colleagues, Dave Taylor, who heads up the Virtual Worlds and Medical Media programme, is investigating the application of virtual worlds such as Second Life to healthcare provision.



'Games technology and virtual worlds lend themselves to many applications in science and technology, but also in healthcare — specifically in simulation, visualisation and team training, where people can practise in a safe environment,' he said.



In one trial, Taylor's team developed a virtual ward for a group of staff nurses at St Mary's Hospital in West London to enable them to practise and test their reactions to unexpected situations. Many nurses would no doubt baulk at the prospect of spending their already stretched time playing computer games, but, according to Taylor, the trial has gone well. 'The research we've done indicates that they find the situations realistic and are going through thought processes that they would go through in a real ward with a real patient — we think this would be a useful adjunct to training that is normally given on the use of medicine.'



Although some commentators have accused the government of wasting taxpayers' money on 'fantasy worlds', he is adamant that the technology has a useful role to play in the health service. 'Virtual worlds can be used by clinical staff to practise situations that don't occur very often but, when they do, you need to know how to manage them,' explained Taylor. 'Generally speaking, people have realised that in playing a game people learn an awful lot — if you can develop a game to impart factual information, then people will pick up that factual information.'



While Taylor identifies the training potential of gaming, Shanahan prefers to see the gaming industry as a useful research-and-development seedbed — a fount of creativity that, because it is unconstrained by the critical tolerances found in, for instance, the automotive or healthcare industries, can afford to take risks. 'You don't need to make the leap all the way to something that is guaranteed,' he said. 'The technology can develop more quickly — developers can take greater risks.' Ultimately, he believes, this could benefit all of us.