Magic touch

Advanced tactile interfaces that enable users to ‘feel’ the digital world could take our relationship with computers to a new level. Jon Excell reports

Gary Todd’s technology makes people feel sick. While this may sound like a strange reason for celebration, Todd has good reason to be pleased. His invention, a simulator for training medical workers in venepuncture — sticking a needle in a vein — is said to look and feel so much like the real thing that the NHS is poised to make it a key training tool. It has even induced nausea in some squeamish students.

Todd’s system, Virtual Veins, is just one example of a range of emerging technologies (from advanced touch screens to robot exoskeletons that enable wearers to become fully immersed in a virtual environment) that promise to bridge the gap between the digital world and reality, and regain a vital human touch in our dealings with computers.

The systems under the spotlight belong to the emerging field of haptic technology — from the Greek for touch — the science of applying touch and control to interaction with computers.

Most of us are familiar with a primitive version of the concept in the form of the tiny motors that make our mobile phones vibrate. But the internet is alive with rumours that everyone from Apple to RIM (the manufacturer of the BlackBerry) is poised to take the technology to the next level with advanced touch screens that fool the user into feeling a range of sensations.

Most recently, mobile phone giant Nokia demonstrated a haptic touch screen on one of its handheld internet browsers. Engineers at the company’s Helsinki research centre placed piezoelectric actuators behind the touch screen; these can be tuned and controlled to produce vibrations that mimic a range of tactile sensations, such as pushing a button or flicking a switch.
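
Nokia has not published the drive signals behind these effects, but the principle can be illustrated with a minimal sketch: a ‘button press’ is typically a short, sharp burst of vibration, while a ‘switch flick’ might use a longer, softer, decaying one. The sample rate, frequencies, amplitudes and decay rates below are illustrative assumptions rather than Nokia’s actual parameters.

    import numpy as np

    SAMPLE_RATE = 8000  # Hz; assumed output rate of the actuator drive electronics

    def haptic_burst(freq_hz, duration_s, amplitude, decay):
        """Synthesise a decaying sine burst to drive a piezoelectric actuator.

        freq_hz    -- vibration frequency felt at the fingertip
        duration_s -- length of the effect in seconds
        amplitude  -- peak drive level (0 to 1, scaled to voltage downstream)
        decay      -- exponential decay rate; higher values feel 'crisper'
        """
        t = np.arange(0, duration_s, 1.0 / SAMPLE_RATE)
        return amplitude * np.exp(-decay * t) * np.sin(2 * np.pi * freq_hz * t)

    # Illustrative effect library: a crisp 'button press' and a softer 'switch flick'
    button_press = haptic_burst(freq_hz=250, duration_s=0.02, amplitude=1.0, decay=150)
    switch_flick = haptic_burst(freq_hz=120, duration_s=0.08, amplitude=0.6, decay=40)

In a real handset, buffers like these would be handed to the actuator driver and triggered the moment the touch sensor registers a press, so the vibration arrives while the finger is still on the ‘key’.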

Roope Takala, who heads the Haptikos project, believes there is a compelling case for haptic touchscreens, but cites a two-to-five-year gap between research and development and a final product, and remains tight-lipped about when Nokia’s screen will make its commercial debut.

Another person convinced that tactile feedback is going to be huge is Christophe Ramstein, research chief at Californian haptics pioneer Immersion Corporation. ‘Haptics is stepping out of the laboratory,’ said Ramstein. ‘It’s much more than just prototypes and concepts. Today people are increasingly seeing haptics as something that adds value.’

Immersion, originally spun out of research at Stanford University, offers a range of technologies. An early system that creates basic sensations by optimising the vibrations of the tiny motor present in most mobile phones has been licensed to LG and Samsung. More advanced techniques which, like Nokia’s technology, place piezoelectric actuators behind the screen, are also poised to appear on a variety of products.

Ramstein believes the stage has been set with the sophisticated, multi-touch sensing screens on devices such as Apple’s iPhone. He says some form of tactile feedback is the logical next step.

While he would not comment on rumours that Apple is interested in licensing Immersion’s technology for future versions of the iPhone, Ramstein believes the iconic device could act as a catalyst for the use of haptic technology.

‘The iPhone has been a trigger in terms of showing the market that touch screens can be used in a very clever way. Apple has done a really good job of using sensing technology and combining that with a compelling user interface. But the one thing that users are losing compared to mechanical keyboards is the feel of the keys. Not having tactile feedback is a real problem,’ he said.

In the longer term, Ramstein thinks haptic touch screens will go way beyond simply replicating the feel of buttons. ‘In 10 years from now I predict that we are going to see many more effects. Mechanical switches are one thing, but we can begin to think about more sophisticated effects like adding vibrations to music as if you’re at a concert.’

And applications of tactile touch screens are not limited to handheld devices. In the UK, a team of engineers at the Warwick Manufacturing Group (WMG) is investigating the potential of tactile feedback devices within vehicles.

Nokia’s Haptikos touch screen handheld web browser demonstrator (above left) and Immersion’s technology, which mimics the ‘feel’ of real buttons (above right)

Mark Williams, whose group is integrating Immersion’s touch-screen technology into an advanced driving simulator, said haptics offers a solution to the growing problem of drivers becoming overwhelmed by onboard technology. ‘Consumers are demanding more and more functions and complexity but they want it simpler and more intuitive, so companies are really looking to these types of solutions to help manage that level of complexity.’

He claimed tactile interfaces could also help ensure automotive technology does not compromise road safety. ‘If you can feel that you’ve activated something, there’s no need for a visual check and you can therefore keep your eyes on the road.’

The main purpose of the WMG project, which Williams said involves a number of big OEMs and an unnamed large Indian auto manufacturer, is to evaluate what kinds of haptic feedback will be most effective.

‘Through varying amplitude and frequency we can develop a whole range of different types of feedback,’ he said. ‘We can replicate switches so that they feel like real switches, and we can create unpleasant vibrations that travel up the arm. We can adapt the simulation environment and change the type of feedback in order to assess which is the most appropriate. We can then assess driver performance alongside these different types of feedback.’
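
WMG has not published its evaluation protocol, but the kind of parameter sweep Williams describes, building a catalogue of candidate effects from different amplitudes and frequencies and logging driver performance against each, might look something like the sketch below. The parameter values, the performance metrics and the run_simulator_trial stub are purely illustrative assumptions.

    import csv
    import itertools
    import random

    # Candidate haptic parameters to evaluate (illustrative values, not WMG's)
    AMPLITUDES = [0.3, 0.6, 1.0]     # normalised drive level
    FREQUENCIES = [80, 150, 250]     # Hz

    def run_simulator_trial(amplitude, frequency_hz):
        """Stand-in for one driving-simulator run; returns made-up performance metrics."""
        reaction_time_s = round(random.uniform(0.4, 1.2), 2)
        eyes_off_road_s = round(random.uniform(0.0, 0.8), 2)
        return reaction_time_s, eyes_off_road_s

    conditions = list(itertools.product(AMPLITUDES, FREQUENCIES))
    random.shuffle(conditions)       # randomise presentation order for each participant

    with open("trial_log.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["amplitude", "frequency_hz", "reaction_time_s", "eyes_off_road_s"])
        for amplitude, frequency in conditions:
            writer.writerow([amplitude, frequency, *run_simulator_trial(amplitude, frequency)])

Randomising the order of conditions for each participant is a standard precaution in this kind of trial, so that learning effects do not systematically favour the feedback types presented last.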

Nokia’s Roope Takala agreed that automotive applications are promising and suggested they may be simpler to develop than handheld devices.

‘A car implementation would be much easier because of the high voltage in the car battery, and the room in the control panels means that miniaturisation is not so critical.

‘Technically it would be much easier to implement than mobile devices. The main challenge is that the car is vibrating and the displays are further from the user, so you would require stronger feedback.’

But fooling the fingertips is only half of the haptic story. A number of companies, including Immersion, are developing wearable ‘force-feedback’ systems that allow the muscles of the body to feel the virtual world.

These have a number of compelling applications, not least in the gaming industry, where the prospect of enabling gamers to fully participate in a virtual world remains something of a Holy Grail.

But the cutting-edge stuff is literally at the cutting edge, in the form of systems developed for surgical training, partly because the increasing popularity of minimally invasive surgery, such as laparoscopies and endoscopies, demands a new approach.

One company particularly active in this market is US firm SensAble Technologies. Spun out of tele-robotics research at the Massachusetts Institute of Technology (MIT), SensAble’s core technology is a robotic arm, typically mounted on a desk, that enables users to manipulate and feel virtual objects.

Users grasp a stylus, or end-effector, mounted on the arm and use it to interact with a virtual environment. As they do so, specially developed software causes actuators within the arm to generate the forces of the virtual world, making the user feel as if they are working with something tangible.
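
SensAble’s rendering software is proprietary, but the underlying idea is well documented: at roughly 1 kHz the software reads the stylus position, checks whether it has penetrated a virtual surface and, if it has, commands a restoring ‘spring’ force proportional to the penetration depth. The sphere geometry, stiffness value and force calculation below are a minimal sketch of that penalty-based approach, not SensAble’s own code.

    import numpy as np

    STIFFNESS = 800.0                   # N/m; illustrative virtual-surface stiffness
    CENTRE = np.array([0.0, 0.0, 0.0])  # centre of a virtual sphere the user can 'touch'
    RADIUS = 0.05                       # 5 cm radius

    def contact_force(stylus_pos):
        """Penalty (spring) force pushing the stylus back towards the sphere's surface."""
        offset = stylus_pos - CENTRE
        dist = np.linalg.norm(offset)
        if dist == 0.0:
            return np.zeros(3)          # degenerate case: stylus exactly at the centre
        penetration = RADIUS - dist
        if penetration <= 0.0:
            return np.zeros(3)          # stylus is outside the object: no force
        normal = offset / dist          # direction from the centre out through the stylus
        return STIFFNESS * penetration * normal

    # One step of the haptic loop: stylus tip 5 mm inside the virtual surface
    force = contact_force(np.array([0.0, 0.0, 0.045]))
    print(force)                        # this force vector would be sent to the arm's actuators

Keeping that loop fast is the hard part in practice: if the force update lags the hand, stiff virtual surfaces start to feel spongy or begin to buzz.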

David Chen, SensAble’s chief technology officer, said while the technology has applications in high-end surgical training — it is used by Toltech, a Colorado-based developer of a knee arthroscopy training simulator — it is also ideal for training in techniques more commonly performed in an emergency setting.

For instance, Northumbrian firm UK Haptics has incorporated SensAble’s technology into the Virtual Veins simulator.

Gary Todd, managing director of UK Haptics, said users sit in front of a PC wearing 3D goggles and use the end-effector on the SensAble device to interact with, for instance, a 3D model of part of the body.

A typical use is training health workers in needle placement for dialysis patients.

‘Before dialysis, patients have an operation that joins the vein to the artery with an artificial sump so that there’s a very big area and you know you’re going to hit it every time,’ said Todd.

‘The blood pressure in that part of the body is absolutely colossal because it’s right next to the heart. Our system’s designed so that if you don’t have a strong enough grip, the blood pressure will blow it out of the arm.’

Todd says the technology, which could soon be rolled out across the NHS, is frighteningly realistic. ‘It feels as if you’re pushing a needle through someone’s skin and you can feel it pop into the vein. I’ve had people feeling physically sick and almost fainting — it’s that realistic.’

Immersion is developing wearable technology that allows the muscles of, say, the hand to pick up and manipulate virtual objects (above left), while the Virtual Veins system (above right) is designed to train medical workers how to stick needles in veins

UK Haptics is developing a series of add-ons for the system for training in other procedures such as neonatal intubation, epidurals, lumbar punctures, and suturing. It is also modifying SensAble’s core technology to develop end-effectors that look and feel like real surgical instruments, a surprisingly tricky engineering challenge.

‘For instance, when you place an injection you need to be able to see the fluid going into the patient at the same rate as you press the end-effector,’ said Todd.

The company has sold about 40 of the machines and is in early-stage discussions with a Strategic Health Authority, which Todd is confident will mark the beginning of a nationwide deployment. ‘The NHS sees it as a standard platform on which it can build its clinical training,’ he said.

But humans are not the only patients set to benefit from haptic technology. Vet turned computer scientist Dr Sarah Baillie has adapted SensAble’s technology to create a haptic cow simulator for training veterinary students at the UK’s Royal Veterinary College.

Baillie said a haptic controller housed in the glass fibre rear half of a cow works alongside specially developed software that creates haptic objects representing features such as the cow’s reproductive tract.

‘As the hand moves around, the device resists the movement in such a way that it feels as if you are touching something,’ she said. An early version of the system augmented this feedback with a loud moo that would occur if the trainee vet pressed a bit too hard.

According to Baillie, the haptic training device has significant advantages over other methods and is great preparation for the real thing. ‘Teaching any procedure, particularly if it’s internal, is really difficult, because they can’t copy what I did and I can’t see what they’re doing, so how do you tell them what to do?’

Baillie is so pleased with the results that her team has also developed a haptic horse and a haptic cat — the latter used to train vets in external abdominal examinations.

Another interesting medical application of haptics is its use in the design of specialist medical components. For instance, engineers in Cardiff have been using SensAble’s technology to develop facial prostheses. As reported in The Engineer (14 July), the system enabled them to design and shape virtual ear prostheses as if they were working with a lump of wax.

Clearly, the ability to design custom components is not just attractive to the medical industry.

According to SensAble’s Chen, a number of customers from the clothing, jewellery and automotive industries are using the system to design products.

Looking further into the future, Immersion’s Ramstein believes that haptic design tools are set to play an increasingly prominent role.

‘Fifteen years from now imagine that you’re designing a car. Instead of having a touch screen or a 2D screen you would have a car rising from the desk — and the guys can touch it. Touching is important, as soon as you have a 3D image you want to interact with it.’

Beyond this, Ramstein believes that haptics will eventually completely reshape the way we interact with the digital world. ‘The vision is making the user interface, the frontier between the digital and the real world, less and less blurry,’ he said.

‘It is making this separation between digital and reality so small that when you interact with it, it almost becomes reality.’