Tuesday, 16 September 2014

Should we be worried about advanced ID technology?

We’ve been slowly losing our privacy for years (or giving it up, depending on your view). It’s virtually impossible to walk up the high street, drive down a motorway or enter a public or commercial building without being caught on camera. But CCTV’s usefulness in solving crime, the lack of a centralised monitoring system and the technology’s limited ability to actually identify individuals mean that most people are prepared to put up, begrudgingly, with being filmed on a daily basis.

How will people feel, however, once a computer can pick them out of a crowd and instantly bring up reams of their personal information? Over the next few years we’re facing a revolution in identification and monitoring technology that is likely to make scenes from the film Minority Report – where businesses and authorities use iris scans to ID people at every turn – much more realistic. And soon we may find ourselves identified not just by our faces or voices but also by the way we walk, our heartbeats, brainwaves or even our breath.

Researchers at Southampton University are among those developing gait recognition systems. CCTV footage can already be analysed to identify people by the way they walk, but the Southampton team have built a camera-filled tunnel that captures a far more detailed picture of an individual’s movements. The researchers believe the system could be deployed at airports or other secure buildings: never mind keys, passports or facial scans, it could decide whether or not to let you in before you’ve even reached the door.
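
To make the idea concrete, here is a minimal, purely illustrative sketch of the sort of matching step such a system ultimately relies on: compare whatever gait signature is extracted from a walk-through against the signatures of people already enrolled, and open the door only if someone matches closely enough. The names, feature vectors and threshold below are invented, and nothing here reflects the Southampton team’s actual algorithms.

```python
import numpy as np

# Hypothetical gait "signatures": fixed-length feature vectors that a system
# might extract from a walk through the tunnel (the feature extraction itself
# is far beyond this sketch).
enrolled = {
    "alice": np.array([0.82, 1.10, 0.33, 0.57]),
    "bob":   np.array([0.45, 0.98, 0.71, 0.12]),
}

def admit(candidate_signature: np.ndarray, max_distance: float = 0.2):
    """Return (name, True) if the walker is close enough to an enrolled
    signature to be let in, otherwise (None, False). Threshold is illustrative."""
    name, dist = min(
        ((person, np.linalg.norm(candidate_signature - sig))
         for person, sig in enrolled.items()),
        key=lambda pair: pair[1],
    )
    return (name, True) if dist <= max_distance else (None, False)

print(admit(np.array([0.80, 1.08, 0.35, 0.55])))  # close to Alice -> admitted
print(admit(np.array([0.10, 0.20, 0.90, 0.90])))  # matches nobody -> refused
```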

A team at Wolverhampton University, meanwhile, are developing a password system based on how the brain reacts to different images. When a subject sees the picture they chose as their password, the unique electrical signals produced by their brain could be detected and used to authenticate their identity.
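
In outline, such a scheme amounts to comparing a freshly recorded brain response against a template recorded when the subject first chose their picture, and accepting the subject only if the two are similar enough. The following sketch, with simulated signals and an arbitrary threshold, illustrates that comparison; it is an assumption-laden toy, not the Wolverhampton team’s method.

```python
import numpy as np

def correlation_score(enrolled: np.ndarray, candidate: np.ndarray) -> float:
    """Pearson correlation between an enrolled brain response and a new recording."""
    enrolled = (enrolled - enrolled.mean()) / enrolled.std()
    candidate = (candidate - candidate.mean()) / candidate.std()
    return float(np.mean(enrolled * candidate))

def authenticate(template: np.ndarray, recording: np.ndarray,
                 threshold: float = 0.8) -> bool:
    """Accept the subject if their response to the chosen picture is close
    enough to the enrolment template. The 0.8 threshold is arbitrary here."""
    return correlation_score(template, recording) >= threshold

# Toy usage: a real template would be averaged over many enrolment trials.
rng = np.random.default_rng(0)
template = rng.standard_normal(256)                     # enrolled response (simulated)
genuine = template + 0.2 * rng.standard_normal(256)     # same person, noisy re-recording
impostor = rng.standard_normal(256)                     # someone else's response

print(authenticate(template, genuine))   # likely True
print(authenticate(template, impostor))  # likely False
```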

In the US, engineers at Draper Laboratory are experimenting with remote ways of detecting individuals’ heartbeat patterns, which appear to be unique and to retain their shape even as the heart rate rises or falls. This could allow paramedics to call up a patient’s medical records even if they are unconscious and unable to identify themselves.
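
The claim that the pattern keeps its shape as the heart speeds up or slows down suggests one simple way of matching: resample each beat to a common length before comparing it with a stored signature, so the rate drops out of the comparison. The sketch below shows that idea with a made-up beat shape; it is only an illustration of the principle, not Draper Laboratory’s technique.

```python
import numpy as np

def normalise_beat(beat: np.ndarray, length: int = 100) -> np.ndarray:
    """Resample one heartbeat to a fixed length so a faster or slower heart
    rate does not affect the comparison, then zero-mean and unit-scale it."""
    x_old = np.linspace(0.0, 1.0, beat.size)
    x_new = np.linspace(0.0, 1.0, length)
    resampled = np.interp(x_new, x_old, beat)
    resampled -= resampled.mean()
    return resampled / (np.linalg.norm(resampled) + 1e-12)

def beat_similarity(stored: np.ndarray, observed: np.ndarray) -> float:
    """Cosine similarity between a stored heartbeat signature and a new beat."""
    return float(np.dot(normalise_beat(stored), normalise_beat(observed)))

def beat_shape(t: np.ndarray) -> np.ndarray:
    """A made-up beat waveform used only for this demonstration."""
    return np.exp(-((t - 0.3) / 0.05) ** 2) - 0.3 * np.exp(-((t - 0.5) / 0.1) ** 2)

# The same underlying shape sampled at two different heart rates still matches.
slow_beat = beat_shape(np.linspace(0, 1, 120))  # slower beat -> more samples
fast_beat = beat_shape(np.linspace(0, 1, 80))   # faster beat -> fewer samples
print(beat_similarity(slow_beat, fast_beat))    # close to 1.0 for the same person
```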

Similarly, scientists at the Swiss Federal Institute of Technology (ETH) in Zurich are investigating whether mass spectrometer readings taken from breath samples can reveal a unique signature for each person as well as providing information about their health.

All these technologies have potentially very useful and beneficial applications: securing identities, data and buildings while providing an easy access route for those who have authorisation. But they also have the potential to violate privacy because they could be used to identify people without their permission or even knowledge. And that’s where people are likely to get uneasy.

Google’s Project Glass, a head-mounted computer that looks like a pair of glasses and can upload film of your surroundings to the internet whenever you want, has been heralded as the beginning of a new era of wearable electronics (check out The Engineer’s digital magazine next week for an in-depth look at this very topic).

But before the devices have even been made available to the public, a backlash against the technology has started, most notably via a campaign group called “Stop the Cyborgs”. Campaigners warn that Project Glass could enable thousands of people to record everything going on around them, making public the actions and conversations people thought were private and uploading the data to a central database (Google’s). There’s a fear that, as recognition technology improves, your every move and thought could become catalogued and searchable by other people, by businesses and by the state.

It’s a scary idea and the use and regulation of this kind of technology deserves serious consideration, both by us as the public and by those developing such equipment. But the recent trial of recognition technology at Heathrow airport gives us an interesting example to chew on. A fingerprinting system was withdrawn because of its associations with criminality, but facial recognition technology that can more easily identify us and link to other information about us (e.g. photos on the internet) is proving more palatable.

So perhaps we’ll be comfortable with advanced identification systems as long as they are presented in the right way and don’t inconvenience us. The danger then becomes that, as often happens with CCTV, we forget these systems are there and stop asking who has our data and why.


Readers' comments (3)

  • Perhaps even more scary is the thought that you might be incorrectly identified as someone else (or vice versa), and of course, as we all know, if it’s done by computer it has to be true!

  • And your identity file gets switched with someone else’s and you end up in a GITMO for being you, and of course the Computer is never wrong.

  • Our Roman ancestors knew a thing or two about the control of the masses.
    ‘Bread and circuses’ were offered by the Patricians to the plebs (that’s us?) to keep them happy.
    But to be fair they also realised the importance of checking the checkers!
    "Quis custodiet ipsos custodes?" (it’s 56 years since I last used a Latin verb in anger, so apologies if that’s not quite correct), literally ‘who checks up on the checkers?’

    At great ceremonies, like the installation of a new Emperor or the return of a triumphant General and his army, they had a slave standing behind the dignitary whose sole task was to call out ‘Remember you are but a man’. This was to remind the leader of his mortality!
    I know a few politicians and captains of industry, indeed some military persons, to whom this ought to be done. Daily.
