Anti-terror technology tool uses human logic

System bridges gap between mathematical and intuitive thinking
The system could play an assistive role in detective work

An ‘intelligent’ decision-support system that applies the logic used by detectives to identify suspicious behaviour could become the latest tool in countering terrorist activity.

The technology has received more than £2m of government funding and over the next two years it will be the subject of a joint research project being undertaken by 10 British universities.

Researchers believe the work could lead to interactive, image-based software that can be used on touch-screen table-top displays and other large-screen systems to better manage the huge amounts of data collected in connection with alleged terrorist plots.

Prof Chris Hankin of Imperial College London, who is leading the research, said that the speed of analysing and distributing data relating to suspicious activities remains one of the key challenges in the UK’s anti-terrorism strategy.

Newer forms of communication, he claims, have both aided investigative work and added to the problem, with operators increasingly dealing with large amounts of fragmentary information from a wide variety of media platforms.

To solve the issue, Hankin is working with a multi-disciplinary team of academics to develop a system that can make assumptions about missing information in much the same way as a human investigator.

He said: ‘If we can find out how analysts think and draw their conclusions, we can input the same reasoning into a software system that will be able to quickly highlight patterns and link events from partial and contradictory information.’

The team hopes that interviews with detectives, conducted to understand their reasoning and to flag up what they consider to be suspicious indicators, will lead to an advanced machine-learning system that bridges the gap between mathematical and intuitive thinking.

Dr Eric Atwell, who is contributing his textual analytics expertise to the project, said that semantically extracting meaningful relationships between different datasets could allow intelligence agencies to uncover associations that would have otherwise been missed.

He added: ‘They say it’s a bit like looking for a needle in a haystack. I prefer the analogy of looking for threads in a haystack — things connecting things that we might not have seen before.

‘For example, if in several of the phone calls a particular word occurs repeatedly at a frequency that is comparatively higher than its average, it would be flagged up by the system and could lead to the identification of linking information in other documents such as CCTV or e-mails.’
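The frequency test Atwell describes can be sketched in a few lines of Python. This is a minimal, illustrative reconstruction of the idea, not the project's actual method: it flags a word in a document when the word appears more than once and its relative frequency there is well above its corpus-wide average. The function name, the threshold value, and the minimum-count filter are all assumptions introduced here for illustration.

```python
from collections import Counter

def flag_anomalous_words(documents, threshold=2.0):
    """Return, for each document, the set of words whose relative frequency
    there exceeds `threshold` times the word's average frequency across the
    whole corpus. Names and parameters are illustrative assumptions."""
    corpus_counts = Counter()
    total_tokens = 0
    tokenised = []
    for doc in documents:
        tokens = doc.lower().split()
        tokenised.append(tokens)
        corpus_counts.update(tokens)
        total_tokens += len(tokens)

    flagged = []
    for tokens in tokenised:
        counts = Counter(tokens)
        hits = set()
        for word, count in counts.items():
            doc_freq = count / len(tokens)            # frequency in this document
            avg_freq = corpus_counts[word] / total_tokens  # corpus-wide average
            # require at least two occurrences so one-off rare words
            # are not flagged spuriously
            if count >= 2 and doc_freq > threshold * avg_freq:
                hits.add(word)
        flagged.append(hits)
    return flagged
```

In a real system the flagged words would then be used to search for linking information in other sources, as the quote suggests; here they are simply returned per document.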

Until now, research on semi-autonomously tracking terrorist activity has been restricted, because of privacy rules, to information already in the public domain. This work is expected to be the first in the UK to track ‘real’ data for terrorist information.

Atwell said: ‘Identifying trends in newspaper reports is easier. When a journalist writes a story, they know what they are trying to get across and deliberately highlight that information. A terrorist, when talking to a fellow terrorist, is unlikely to have that aim in mind.’

Given the vast quantity of this type of data, one difficulty the team expects to face is the risk of an investigator missing information that has been summarised by the system. However, Hankin was keen to note that technology could only ever play an assistive role in detective work.

He said: ‘The role of the human investigator is, and will remain, crucial.’