In an increasingly connected world, the safety and resilience of devices are more important than ever. A new report is highlighting the challenges of being smart on security. Stuart Nathan reports.
When EM Forster exhorted his readers to “Only connect!” at the end of his novel Howards End, he couldn’t have imagined how connected we would all become barely a century later. Not only does ubiquitous internet mean that we are plugged into media services constantly, but it is becoming increasingly difficult to buy technology that is not ‘smart’ in some way. For most of these devices, that smartness derives from connectivity to the internet and to other devices and systems.
The average UK house already contains around 15 connected devices, some obvious such as phones, laptops, tablets, televisions and smart meters, and some much less so, such as kettles, coffee makers, thermostats and switches. This number will only grow in the coming years.
The workplace, and particularly the manufacturing workplace, is even more connected. The rise of Industry 4.0 and the Internet of Things is seeing sensors and other monitoring equipment proliferate on almost every piece of equipment. Some of these are hardwired into IT systems that oversee and fine-tune the operation of machinery in the factory, while others – such as tools used on production lines in the automotive and aerospace industries – use wireless technologies to receive instructions to alter their operating parameters and to send data back to the factory IT systems.
All this poses risks that have not been seen before in either the domestic or the workplace environment. From the earliest intimations of connected systems, the risk of incursion by digital criminals was foreseen by science fiction writers and other futurists, and cyber attacks are indeed a constant threat, with instances occurring in healthcare systems, critical infrastructure and the entertainment industry.
But criminal activity is not the only threat. When everything is connected, knock-on effects from non-malicious IT outages can be severe, as was seen when recent floods affected electricity substations in Lancaster, knocking out landline and mobile telecommunications, and when a power failure at British Airways threw schedules and reservations into chaos.
Against this background, the Royal Academy of Engineering recently launched a report into the safety and resilience of cyber systems, working alongside PETRAS, the EPSRC-funded Cyber Security of the Internet of Things (IoT) Research Hub.
Bringing together specialists from the universities of Warwick, Oxford, Surrey, Lancaster, Southampton and Cardiff with UCL and Imperial College London, PETRAS (privacy, ethics, trust, reliability, acceptability and security) was set up in 2016. It is co-funded by industry and involves more than 120 academic, industrial and public sector partners, examining both social and technological issues and aiming to reconcile the potentially conflicting interests of government, industry and academia.
Safety and resilience are distinct issues, explained Nick Jennings, vice provost and professor of artificial intelligence at Imperial College, one of the report’s authors, at its launch. “Safety for us is what happens to a system when it’s operating normally. You want it to do particular things and not do other things under normal operation, and that’s how you construct any digital system. Resilience is where the system might be under stresses and strains and you want it to be able to recover from events, whether malicious or otherwise, and not end up in a bad state.”
Connected devices pose a dual risk in cybersecurity terms. The device itself can be penetrated and its data accessed, which can be a serious problem, sometimes in unexpected ways. For example, in the domestic environment, a robot vacuum cleaner could provide hackers with a floor plan of the home, and a thermostat could tell intruders whether a home was occupied. But this is not the only risk. If the device is connected to other systems – for example, a smart meter might have a link to the financial records of the house owner because of its use in billing – the hack could be used as a back door for “lateral movement” and financial cybercrime.
There are already examples of such lateral access being achieved in penetration tests. For example, a wireless tyre-pressure gauge manufactured by Bosch was used in a simulated hacking exercise to gain access to the engine management system and other software in a Mercedes car. The gauge is intended to give drivers useful information about the status of their vehicle, but this vulnerability was unforeseen.
In another automotive exercise, hackers found that the hard part of breaking into a Jeep’s systems wasn’t the hack itself, but confining the penetration to a single vehicle rather than a large number of them.
Cyber vulnerabilities have also been found in CT and MRI scanners, and in insulin pumps. In the TV thriller Homeland, one storyline saw a US politician assassinated by a hacker using his pacemaker to induce a heart attack, a plotline reputed to have so alarmed then US vice-president Dick Cheney that he asked the Secret Service to assess the vulnerability of the software embedded in his own pacemaker. Pacemakers and insulin pumps send data wirelessly so that clinicians can monitor the health of their patients, but this could have the unintended consequence of rendering them vulnerable to attack.
Another of the report’s authors, Paul Taylor, UK lead partner for cyber security at KPMG, recalled a case where smart lightbulbs proved themselves to be a liability. Their original software – which has now been changed – allowed the first lightbulb plugged into a house to access the household’s Web ID and network passwords, which were then shared with all subsequent lightbulbs plugged into the circuit. “This was the system trying to be helpful,” he said. “It meant that you didn’t have to re-enter the details every time you put a new lightbulb in, but somebody figured out that you could pretend to be a lightbulb and get access to all these details. When that became clear, the software was replaced.”
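The essence of the flaw Taylor describes can be sketched in a few lines of code. This is a hypothetical, highly simplified illustration of the pattern, not the lightbulbs’ actual software: the first device stores the household credentials and hands them to anything that merely claims to be another bulb, with no verification of the requester. All names and values here are invented for illustration.

```python
# Hypothetical sketch of the credential-sharing flaw: convenience
# without authentication. Not the real product's code.

class SmartBulb:
    def __init__(self):
        self.credentials = None  # network details, learned at first setup

    def provision(self, ssid, password):
        """First-time setup: the bulb learns the household credentials."""
        self.credentials = {"ssid": ssid, "password": password}

    def share_credentials(self, claimed_device_type):
        """Flawed 'helpful' behaviour: hand credentials to anything
        that *claims* to be a lightbulb. The claim is never verified."""
        if claimed_device_type == "lightbulb":
            return self.credentials
        return None

first_bulb = SmartBulb()
first_bulb.provision("HomeWiFi", "s3cret")

# A genuine new bulb gets the credentials automatically -- convenient...
new_bulb_creds = first_bulb.share_credentials("lightbulb")

# ...but an attacker who merely claims to be a lightbulb gets the
# exact same credentials, because nothing checks the claim.
attacker_creds = first_bulb.share_credentials("lightbulb")
```

The fix, as the report implies, is to authenticate the requesting device (for example, with a per-device key or a user-confirmed pairing step) before releasing any secret, rather than trusting a self-declared device type.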
Rachel Cooper, professor of design management and policy at Lancaster University, and theme lead at the PETRAS IoT Research Hub, highlighted another safety problem with connected systems. “We’ve done some research with a smart kettle, and we noticed it leaked data to a random server in Iceland,” she said. “You can download apps on your mobile phone so you can activate all these items in your home. So I’ve got the smart kettle app, and I can turn it on with an app I downloaded, but I can also turn it on from my sofa when it’s empty.”
If these vulnerabilities exist in domestic appliances, it’s easy to be alarmed by the potential for risk in the far more connected environment of an Industry 4.0-enabled modern factory. Paul Taylor, in conversation with The Engineer, said that cars are a good model for this because they are engineered to meet exacting safety standards.
The RAEng/PETRAS report, which aims to begin setting out guidelines to minimise such risks, states that configuration of connected devices is currently too difficult, and that manufacturers and distributors should be responsible for ensuring that devices are much more secure when they come out of the box. “If you’re buying a connected device, you ought to know what standard of safety it has been produced to meet, and it should be fit for purpose,” Taylor said. “We believe that government and regulation have a role to play here. Government has a great convening power both internationally and nationally to bring people together to start setting standards, whether those are going to be through the BSI (British Standards Institution), based on US NIST (National Institute of Standards and Technology) regulations, or through the Consumers’ Association. From there, it’s a short hop and skip to say we can start developing some common standards. We are actually quite good at that in the UK, so why shouldn’t we take the lead?”
Nick Coleman, chair of the IET information technology policy panel and former national reviewer of cybersecurity for the UK government, added another warning. “All of these standards have to be global, really, because we run global infrastructure with global supply chains and so you need to have clear interoperability between certifications,” he said. “Preferably we need as global an approach as we can.”
So if the manufacturer is responsible for standards, what is the responsibility of the user of connected equipment, whether that is in the home or in industry? A shift in thinking may be needed here. Previously, equipment has been regarded as ‘dumb’. But in the case of modern appliances, devices and equipment, many more things need to be thought of as computing devices, and these need the same sort of care as any other computer.
Specifically, security updates and fresh iterations of operating systems have to be downloaded and installed as they become available, and this can only be the responsibility of their owner. “If you say to someone, I’ve designed a smart kettle and have given it a label or a kitemark, the expectation is that you don’t go back and patch your kettle every couple of weeks even though you do on a computing device,” said Coleman.
Taylor went into more detail. “At the point of sale it should be quite clear to what level of security standards it has been built, if any, and what the process is for updating those security standards in the future, again if any. So ideally we’d like to see manufacturers being quite clear what security standards they use, when the standards exist, and how they can be updated. And then from that point, responsibility will pass across to the user to make sure that updates are uploaded and installed in a timely fashion to make sure that their equipment is as secure as it can be at that point in time, in the same way that you would with your own computer. You buy your laptop or tablet in a certain state and it’s your responsibility thereafter that you apply the patches and update the antivirus.”
Updating is particularly important for industrial systems, Coleman stressed, because unlike domestic appliances and personal devices, these tend to have a longer lifetime. “We need to remember that systems tend to last longer than you think they will, particularly in critical infrastructure; you might think ‘I will replace my smart phone every couple of years,’ but legacy technology in some of these systems lasts about 20 years, so we’re talking about building it now and getting a design right for the resilience as well as usability and other factors to consider, but we are also talking about its life-cycle: how does it get maintained, how does it get backed up.”
It’s also important to remember that industrial systems operate in a particular, and sometimes challenging, environment. “Think about it like an oil rig. It’s not sitting in a shiny office, these are systems that have to operate in the environment in which they are built, so you have to build resilience for that.”
It’s a challenging world, and it’s not going to get less complex. But it’s important not to lose sight of the advantages that connected technologies bring. “The first thing is to recognise putting digital and physical systems together in an integrated way is a good thing to do,” said Nick Jennings. “There’s lots of benefit that one can get from that in terms of monitoring, in terms of measuring, in terms of constantly interacting with systems, so that’s a good thing.”
Moreover, raising these concerns is not an attempt to scare people off, Taylor stressed. Factory owners are often nervous about installing new technologies like Industry 4.0 or IoT-enabled devices, and Taylor is anxious not to increase that anxiety. “I think it is the opposite: by raising concerns like this or the thinking we put in the reports of raising the issue of standards and of getting a common set of standards, that will help Industry 4.0 to become more easily adopted and more successful. The idea is to make helpful contributions towards increasing confidence in security and safety rather than running around with our hair on fire and saying don’t do it. That isn’t the idea at all.”