Magic touch: haptics at the man-machine interface


Haptic technology is carving out a successful role in medicine and sport.

In the past 30 years progress in visual, imaging and screen-based technologies has been quite astonishing. We have gone from great hulking cathode ray tubes to mobile devices that capture and display high-definition video. But it could be argued that these technologies have reached a plateau. It is interesting to note that 3D cinema and television have been something of an anti-climax, not quite delivering on the immersive experience promised. By focusing solely on the visual senses for so long, have we neglected the subtle but important influence of other human perceptions — namely touch? Certainly a rapidly growing body of researchers worldwide and a number of companies believe that to be the case and are implementing ‘haptic technologies’ in a broad range of applications.

The MotivePro ‘haptic suit’ being tested at Birmingham City University on a top-ranking rhythmic gymnast. The prototype system features a network of low-profile flex sensors, processors and motors and provides tactile feedback.

Of course, the field is not a new one and the ‘touch revolution’ has been promised before. But previous attempts have tended to bolt on crude force feedback without a proper consideration of why exactly it is there and what function it should be aiding.

One of the drivers of recent advances is a greater understanding of the psycho-sensory mechanisms of touch in the context of specific problems (another is the availability of cheap, robust technology, such as actuators, sensors, processors and test rigs).

For this reason you tend to find that people working in the field these days have a very diverse range of backgrounds - including psychology, medicine, performing arts and sports.

‘Haptics researchers don’t tend to be the people developing the core fundamental technology - a brand-new actuator or brand-new sensor or new material - because that’s so removed from the application,’ said Katherine Kuchenbecker, director of the Haptics Group at the University of Pennsylvania and one of the leading worldwide experts in the field. ‘But we can take components that have been invented or developed in commercial or academic laboratories and put them together in clever ways and test them with human users. Frequently your intuition about what might work gets you halfway, but you really have to try it and there’s an art to getting it to work really well. The sense of touch is amazingly sensitive so there are often some peculiar artefacts.’

In common with many experimental technologies, haptics has found a worthy cause (and a viable market) in medicine. So-called minimally invasive surgical techniques have allowed surgeons to operate through small incisions, thereby reducing complications and improving recovery. But by doing so they relinquish the surgeon’s sense of touch. In the case of laparoscopic ‘keyhole’ surgery it is diminished considerably and with remote ‘master-slave’ robotic surgery it is entirely absent.

Get a grip: Bristol’s palpating gripper replicates the movements surgeons make when feeling tissue in clinical practice

This is a real loss for cancer surgeons, for example, who like to feel the tissue they are cutting out - an important means of double-checking where the tumour is and if it is malignant or benign. Scanning cannot always give all the information needed in advance.

Unlike previous versions, the most recent iteration of the market-leading Da Vinci robotic surgery system features a degree of haptic feedback in its operation. But it has been retrofitted and is kinesthetic rather than cutaneous (see box below).

A research team at Bristol Robotics Lab believes the next generation of surgical robots should be developed with haptics as a key design consideration from the outset.

‘Currently a lot of the tools are based on pliers, so you have a pivot point and gripper, and either one jaw stays fixed and the other moves, or they both move together,’ said Dr Adam Spiers of Bristol. ‘It was quickly obvious that this rigid pinch motion is not enough to get rich tactile information.’

“Currently a lot of the tools are based on pliers; it was quickly obvious that this rigid pinch motion is not enough to get rich tactile information”

Adam Spiers, Bristol Robotics Lab

Spiers and his team spoke with surgeons to find out the sort of palpating movements they use when feeling tissue in clinical practice. These included lateral rubbing (a similar motion to rolling a cigarette), enclosure, contour following and unsupported holding. The challenge was to replicate these complex movements in a tool while keeping it low profile and still suitable for keyhole incisions.

The gripper can be folded for insertion and retracted, to give a range of motions.

Their resulting ‘palpating gripper’ was presented at the IEEE Haptics Symposium in March. It can be folded for insertion then retracted to give a range of motions.

Although the device currently lacks any electronics for sensing and feedback, Spiers believes it provides an ideal launching pad for future surgical robots. ‘If you mounted a tactile sensor on the finger then you could use the thumb to kind of press the object against it and manipulate it around the sensor,’ Spiers said.

Haptics has been used for a long time in surgical training simulations - especially dental procedures - using basic desktop rigs. Recently though, there has been a step-change in complexity thanks to the availability of new hardware.

A team at Bournemouth University, headed by Prof Venky Dubey, has developed a simulator for trainee anaesthetists to perform epidurals. This is a particularly difficult procedure to replicate from a tactile perspective because the needle advances through numerous tissue types - skin, ligaments and between vertebrae to the epidural sac - each having their own telltale feedback.

The team modified a Novint Falcon master-slave desktop rig (see box) and coupled it to a 3D-modelled graphical simulation including 26 vertebrae that can be flexed and rotated. The simulator can even be adjusted to match the height, weight, age and body mass index of any patient, which can all influence haptic sensations.
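The layered-resistance idea behind such a simulator can be sketched in code. The following is a minimal illustration, not the Bournemouth team's actual model: the layer names, thicknesses and resistance values are all assumed for the example, and a real simulator would render smoothly varying forces rather than one step per layer.

```python
# Hypothetical sketch of a layered needle-insertion force model for an
# epidural simulator: each tissue layer offers its own resistance, and
# the sudden "loss of resistance" at the epidural space is the telltale
# cue trainees learn to feel. All values are illustrative, not clinical.

TISSUE_LAYERS = [            # (name, thickness in mm, resistance in N)
    ("skin",                 4.0,  6.0),
    ("subcutaneous fat",    10.0,  2.0),
    ("supraspinous lig.",    5.0,  8.0),
    ("interspinous lig.",   20.0,  7.0),
    ("ligamentum flavum",    5.0, 12.0),
    ("epidural space",       4.0,  0.5),  # sudden loss of resistance
]

def needle_force(depth_mm: float) -> float:
    """Return the resistive force (N) felt at a given insertion depth."""
    boundary = 0.0
    for _name, thickness, resistance in TISSUE_LAYERS:
        boundary += thickness
        if depth_mm < boundary:
            return resistance
    return 0.0  # past the epidural space (over-insertion)
```

A haptic rig would call `needle_force` in its control loop and command the rendered force accordingly, so the trainee feels the tough ligamentum flavum give way to the near-zero resistance of the epidural space.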

One area where the visual medium has long been used but has perhaps reached its useful limit is sports science. Super-slow-motion video analysis and motion capture are frequently used in the coaching of elite athletes. But they can only show athletes their movements retrospectively. It would be highly desirable to have a technology that can facilitate the modification of movements in real time as the athlete is performing. This is what Gregory Sporton, professor of digital creativity at Birmingham City University and former professional dancer, has been working towards.

‘It’s a straightforward sport-psych thing. If you want to improve somebody’s golf swing there’s no point in showing them an animation of the exact trajectory of their swinging and saying “you’ve just got to correct that a bit”. It doesn’t work,’ he said.

Sporton had previously been working with motion capture and exoskeletons, but found them ‘clunky, expensive and slightly misleading’. ‘I then started looking at how to improve motor skill learning. One of the things that came up in the psychology literature regularly was the difference between using different modes of feedback. The mode that seemed to have the most impact was tactile feedback.’
The challenge was to put this basic principle into practice in a robust, cost-effective way in sports and dance. Noting the increased availability of a range of low-profile flex sensors, processors and motors, Sporton looked at integrating them into a ‘haptic suit’.

The idea was that the sensor networks would ascertain where the limbs and torso are in three-dimensional space and provide either a tactile warning vibration when the athlete or dancer strays out of a predefined good-form range, or a progressive increase in vibration as they approach a ‘sweet spot’ position.
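The two feedback modes described reduce to simple mappings from a measured joint angle to a motor intensity. The sketch below is illustrative only, not MotivePro's actual logic; the angle bands and the linear ramp are assumptions.

```python
# Two hypothetical feedback mappings for a haptic suit: a binary warning
# buzz outside a "good form" band, and a vibration that ramps up as the
# joint angle nears a target "sweet spot". Intensities are in [0, 1].

def warning_mode(angle: float, low: float, high: float) -> float:
    """Full-strength vibration only when outside the good-form band."""
    return 0.0 if low <= angle <= high else 1.0

def sweet_spot_mode(angle: float, target: float, span: float) -> float:
    """Vibration grows linearly from 0 to 1 as the angle nears the target,
    reaching full strength exactly at the sweet spot."""
    error = abs(angle - target)
    return max(0.0, 1.0 - error / span)
```

The binary mode suits correction (buzz means "fix it"), while the ramp suits guidance (stronger means "nearly there"); which works better for a given skill is exactly the kind of question the suit lets researchers test.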

The resulting MotivePro prototype system is highly customisable, but Sporton prefers to keep it simple and focus on the position of a single joint at a key point in a long sequence. He has been testing the suit with one of the UK’s top-ranking rhythmic gymnasts.

Steve Wanless of Birmingham City University helps to strap the haptic suit onto a test subject

Ultimately the ambition is to work with group gymnastics and choreographed dance to make sure each performer is precisely synchronised. The system could also be of use for blind athletes and in stroke rehabilitation.

While much of the research in haptics is actually carried out by creative individuals using existing off-the-shelf components, there is some ongoing work by engineers to develop the underlying technology - mostly in the US. ‘In my group, we’ve really focused on the high-frequency vibrations because they let you know the tool has started contacting an object, or has slipped,’ Kuchenbecker said.

She has developed a ‘haptic camera’ that is similar in appearance to a paint brush. When drawn over an object, the device can record these high-frequency vibrations to produce a recorded ‘haptograph’ of the surface texture. This can be uploaded and filed, then later experienced retrospectively by using a stylus device. A particularly interesting potential application would be in museums to allow visitors to ‘touch’ precious archaeological objects and artefacts.
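The record-and-replay pipeline can be sketched very simply: capture a vibration signal as a time series, store it, and later scale it for a playback actuator. This is a toy illustration of the concept, not Kuchenbecker's implementation; the sample rate and the synthetic sinusoidal "texture" stand in for a real accelerometer recording.

```python
# Minimal sketch of the "haptograph" idea: record a high-frequency
# vibration signal from a surface, then replay it through an actuator.
# The synthetic signal and sample rate are assumptions for illustration.

import math

SAMPLE_RATE = 10_000  # Hz; tool-contact vibrations are high-frequency

def record_texture(duration_s: float, bump_hz: float) -> list[float]:
    """Synthesise a periodic texture signal -- a stand-in for the real
    accelerometer data a haptic camera would capture when dragged."""
    n = int(duration_s * SAMPLE_RATE)
    return [math.sin(2 * math.pi * bump_hz * t / SAMPLE_RATE)
            for t in range(n)]

def play_haptograph(samples: list[float], gain: float = 1.0) -> list[float]:
    """Scale the stored signal for the playback actuator (e.g. a voice
    coil in a stylus), preserving the recorded vibration pattern."""
    return [gain * s for s in samples]
```

Stored as a file, such a signal is what would let a museum visitor later "feel" an artefact's surface through a vibrating stylus without touching the object itself.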

“We’ve focused on high-frequency vibrations because they let you know that the tool has contacted an object, or has slipped”

Katherine Kuchenbecker, University of Pennsylvania

Over at Northwestern University meanwhile, Prof Edward Colgate is looking at advanced haptic feedback for near-ubiquitous touchscreens in mobile phones and tablet computers.

A long-promised feature (always seemingly just around the corner) is to deliver the tactile experience of raised buttons on a keyboard, which can be called upon when required. Despite numerous research laboratories claiming to have the technology to do just this, it has yet to materialise in commercially available devices.

Nevertheless, Colgate is confident that from a cost and practicality point of view his group is heading in the right direction. His team favours perceptual tricks with the relatively mature technology of piezoceramics rather than creating actual miniature landscapes with exotic electroactive polymers as others have tried.

Still, his piezos are no slouches and can vibrate 20,000 times a second with an amplitude of less than a micron - which is key to their collective action.

‘If you have a finger moving across a flat surface but you apply to it the same pattern of forces it would experience if it was going over a bump - well, it’ll feel very much as if it’s going over a bump. That’s a good example of a haptic illusion. Some of the cues you would naturally encounter are there and that’s enough to convince the mind,’ he said.
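A common way to render this illusion is to apply a lateral force proportional to the slope of a virtual bump: the surface pushes back as the finger "climbs" and pushes forward as it "descends". The sketch below is one illustrative model, not Colgate's implementation; the Gaussian bump shape and the force scaling are assumptions.

```python
# Sketch of the flat-surface bump illusion: compute the lateral force a
# finger would feel crossing a virtual bump, so a flat actuated plate
# can reproduce it. Bump profile and gains are illustrative assumptions.

import math

def bump_height(x: float, centre: float = 0.0, width: float = 1.0,
                amplitude: float = 1.0) -> float:
    """Virtual bump profile (a Gaussian), in arbitrary units."""
    return amplitude * math.exp(-((x - centre) / width) ** 2)

def lateral_force(x: float, normal_force: float = 1.0,
                  dx: float = 1e-4) -> float:
    """Lateral force ~ -N * slope: opposes the finger going 'uphill'
    and assists it going 'downhill', mimicking a real bump."""
    slope = (bump_height(x + dx) - bump_height(x - dx)) / (2 * dx)
    return -normal_force * slope
```

Moving a finger in the positive direction toward the bump's centre meets a resisting (negative) force, and past the centre a pushing (positive) one; with no actual height change, that force pattern alone is enough to convince the mind.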

Configured in the right way, this would be the basis for the ‘on-demand’ keyboard; but Colgate has greater ambitions.

‘What I’ve been working on is controlling the shear forces between a finger and a surface. I want to develop a technology that can push your finger in any direction along the surface and do so with varying levels of force. With this you can create a virtual environment that you can interact with in a physically realistic way. So if you’re playing a game such as Angry Birds and you want to stretch that slingshot, as you pull it back you feel that stretch.’
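The slingshot example reduces to rendering a spring in shear: the further the finger drags, the harder the surface pushes it back toward the anchor, up to whatever force the actuators can deliver. The stiffness and force limit below are assumed values for illustration, not figures from Colgate's group.

```python
# Hypothetical shear-force rendering for the slingshot interaction:
# a restoring force proportional to the drag distance, clipped to the
# maximum force the surface actuators could plausibly exert.

def slingshot_force(drag_mm: float, stiffness: float = 0.05,
                    max_force: float = 2.0) -> float:
    """Restoring shear force (N) opposing the drag, clipped to the
    actuator's limit. Positive drag produces a negative (pull-back) force."""
    force = -stiffness * drag_mm
    return max(-max_force, min(max_force, force))
```

Run in the touchscreen's control loop, the growing pull-back is what would make the virtual elastic feel like it is genuinely stretching under the finger.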


Touching on the fundamentals of haptic technology

Haptics is the study of human touch and interaction with the environment through touch. Haptic technologies therefore seek to exploit, enhance or replicate aspects of that interaction for the purposes of entertainment, ergonomics or functionality.

‘Reach your hand into your pocket, pull out your key and unlock your car, but do it without looking. It’s a trivial task right? And yet that’s entirely haptic. When you think about it, it’s an incredibly difficult task - we can’t program robots to do anything like that,’ said Prof Ed Colgate of Northwestern University.

Information from the human sense of touch can be classified into two categories: cutaneous and kinesthetic. Cutaneous information is provided via the mechanoreceptive nerve endings in the skin of the human hand. It is primarily a means of relaying information regarding small-scale details in the form of skin stretch, compression and vibration.

Kinesthetic sensing encompasses larger-scale details, such as basic object shape and mechanical properties - compliance, for example.

There are four different types of mechanoreceptive cells in the fingertips, sensitive to pressure, lateral stretching and vibration, while separate thermoreceptors detect temperature. All of these sensations together give the impression of what you are touching.

Sector analysis

Haptic technology developments span a range of different areas, including automotive and safety

Humans can discern haptic surface properties such as stiffness and texture through an intermediate tool - an ability that stems partly from the evolutionary phenomenon of distal attribution, in which a handheld tool comes to feel like an extension of one’s own body. This wealth of information guides both exploratory and dexterous manipulation.

For simulating or remotely performing tasks where you want to recreate the tactile experience of the real thing, research groups have often used desktop manipulators hooked up to a computer. These provide a relatively cheap, accessible and customisable research tool to address a variety of questions.

One of the first commercially available systems was Sensable’s Phantom, which essentially comprises a stylus pen mounted on a pivoting jointed arm. It was used in the first attempts to simulate simple scalpel-based surgery for training purposes. This has been largely superseded in recent years by the Novint Falcon, which has three degrees of freedom and offers more sophisticated feedback.

One to watch is the forthcoming Magnetic Levitation (MagLev) haptic interface from Pittsburgh start-up Butterfly Haptics that, with just one moving part, can apparently provide incredibly fluid and realistic feedback.

For embedded and wearable haptics, sensors, processors and actuators have enabled a number of interesting applications. Carnegie Mellon University has a haptic steering wheel that, when linked to a GPS navigator, provides tactile direction prompts (noting that visual cues can be ambiguous). Sheffield University meanwhile has developed a haptic helmet for use by fire-fighters entering smoke-filled buildings. It uses ultrasound radar to sense the environment and provide graded vibration as the wearer gets nearer to obstacles.
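The graded-proximity feedback described for the helmet amounts to mapping a range reading onto a vibration intensity. The sketch below is an assumed linear mapping with made-up range limits, not the Sheffield design.

```python
# Illustrative distance-to-vibration mapping for proximity feedback:
# silent beyond a maximum range, full vibration inside a minimum range,
# and a linear ramp in between. All range values are assumptions.

def proximity_vibration(distance_m: float, max_range_m: float = 3.0,
                        min_range_m: float = 0.3) -> float:
    """Map an ultrasound range reading to a vibration intensity in [0, 1]."""
    if distance_m >= max_range_m:
        return 0.0          # nothing nearby: stay silent
    if distance_m <= min_range_m:
        return 1.0          # obstacle very close: full vibration
    return (max_range_m - distance_m) / (max_range_m - min_range_m)
```

With one such mapping per sensor, a wearer in zero visibility gets a continuously graded sense of how close the nearest obstacle is in each direction.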

Yet there is certainly a need for more discreet, lower-profile devices and recent advances in plastic and textile-based electronics could deliver this. Disney Research is apparently developing haptic clothing that can quite convincingly reproduce the feel of subtle tactile phenomena, such as water trickling across the skin, for film-goers and gamers.