Along with several other research centres around the world, the National Industrial Centre for Virtual Environments (NICVE) at the University of Salford is attempting to combine the disciplines of Virtual Reality (VR) with computer-aided design (CAD) and artificial intelligence (AI). The aim, according to Ruth Aylett of NICVE, is to create a fully-immersive experience in which objects can move and be moved in realistic ways. In such an environment, users can interact with other entities. These are known as ‘agents’ when they are computer-generated, and ‘avatars’ when they represent another human user. ‘We are trying to introduce behavioural realism, as well as physical realism,’ Aylett says.
The combination of CAD with VR is the key to creating ‘working models’ of devices and machines in cyberspace, says Aylett’s colleague, Terrance Fernando. Central to this is a technique known as collision detection, which stops separate objects from passing through each other and ensures that, when they meet, they behave like real objects. For example, a square object on a flat surface can only slide in two dimensions, and is constrained by friction. A square object in a square hole can move up and down; a cylinder in a round hole can pivot as well. These relationships between two solids are known as ‘kinematic pairs’, and it is by describing the components of a machine in these terms that realistic models can be built. ‘The system maintains the allowable relationships between the components,’ explains Fernando. The software controlling this, the ‘constraint manager’, is interfaced with the CAD system (which contains the technical information about the components and how they fit together) and the VR system (responsible for the display) to produce the finished product.
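The idea of a constraint manager filtering motion according to kinematic pairs can be sketched in a few lines of code. This is a hypothetical illustration, not NICVE’s actual software: the class names, the axis-set representation of degrees of freedom, and the two example pairs are all assumptions made for clarity.

```python
from dataclasses import dataclass

# Hypothetical sketch: a kinematic pair names the degrees of freedom
# the pairing of two solids leaves open, and the constraint manager
# discards any requested motion outside those freedoms.

@dataclass
class KinematicPair:
    name: str
    translate_axes: set  # axes along which translation is allowed
    rotate_axes: set     # axes about which rotation is allowed

# The two examples from the text: a square peg in a square hole can
# only slide along its axis; a cylinder in a round hole can also pivot.
SQUARE_IN_SQUARE_HOLE = KinematicPair("prismatic", {"z"}, set())
CYLINDER_IN_ROUND_HOLE = KinematicPair("cylindrical", {"z"}, {"z"})

class ConstraintManager:
    """Keeps components within the allowable relationships."""
    def __init__(self, pair: KinematicPair):
        self.pair = pair

    def allowed_motion(self, translation: dict, rotation: dict):
        # translation/rotation map axis -> requested magnitude (e.g.
        # from the VR input device); disallowed components are dropped.
        t = {a: v for a, v in translation.items()
             if a in self.pair.translate_axes}
        r = {a: v for a, v in rotation.items()
             if a in self.pair.rotate_axes}
        return t, r

mgr = ConstraintManager(CYLINDER_IN_ROUND_HOLE)
t, r = mgr.allowed_motion({"x": 1.0, "z": 0.5}, {"z": 0.2})
print(t, r)  # prints {'z': 0.5} {'z': 0.2}
```

In a real system the CAD model would supply the pair definitions and the VR system would render the constrained result; here both are reduced to dictionaries to keep the mechanism visible.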
This is in itself a useful tool for training engineers in the maintenance of machinery, but they still need an instructor. This, says Aylett, is where artificial intelligence comes in. It allows the introduction of autonomous agents into VR. ‘They aren’t like animations, which are entirely pre-scripted,’ she explains. ‘They have an agenda and a series of rules which they obey, but within that, they can respond to the demands of the trainee.’
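Aylett’s distinction between a pre-scripted animation and a rule-driven agent can be illustrated with a toy example. This is a speculative sketch, not the actual training software: the class, the maintenance steps, and the feedback rules are invented for illustration.

```python
# Hypothetical sketch of a rule-driven training agent: it holds an
# agenda (an ordered list of maintenance steps) and a few rules, but
# it reacts to what the trainee actually does rather than playing a
# fixed animation.

class TrainingAgent:
    def __init__(self, agenda):
        self.agenda = list(agenda)  # remaining steps, in order
        self.completed = []

    def observe(self, trainee_action: str) -> str:
        """Apply simple rules to a trainee's action and respond."""
        if not self.agenda:
            return "All steps complete."
        expected = self.agenda[0]
        if trainee_action == expected:
            self.completed.append(self.agenda.pop(0))
            follow_up = (f"Next: {self.agenda[0]}" if self.agenda
                         else "Procedure finished.")
            return f"Correct: {trainee_action}. {follow_up}"
        if trainee_action in self.agenda:
            return (f"Not yet - {trainee_action} comes later. "
                    f"First: {expected}")
        return f"{trainee_action} is not part of this procedure."

agent = TrainingAgent(["shut off fuel", "remove cover", "inspect blades"])
print(agent.observe("remove cover"))   # out of order: agent corrects
print(agent.observe("shut off fuel"))  # correct: agenda advances
```

The agent’s behaviour follows from its agenda and rules, yet each response depends on what the trainee does, which is the property that separates it from a canned animation.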
AGENTS IN ACTION
One example of this has been developed at the University of Southern California. It is currently being used by the US Navy to teach engineers the maintenance programme for ship-board gas turbines. Dubbed STEVE (for Soar Training Expert for Virtual Environments), the agent within the program is not quite human-looking – in fact, he’s a legless, floating blue torso the shape of a pepper, with a blond head on top and a disembodied hand (the picture here, with arms, is a version under development).