Editor
Following a number of recent incidents, security researchers are growing increasingly concerned about the security of our ever more connected vehicles.
The prospect of your family car suddenly taking on a mind of its own and propelling you helplessly into oncoming traffic is a terrifying one.
And while it might seem a somewhat irrational fear, security researchers are in fact becoming increasingly concerned that the rise of the connected vehicle and driverless technology could leave the next generation of vehicles vulnerable to malicious attacks by hackers and viruses.
This concern has been fuelled by a number of cases involving existing vehicles.
Just last week Jaguar Land Rover announced that it was recalling 65,000 Range Rover vehicles after a software glitch caused keyless vehicles to spontaneously unlock themselves. One driver even reported that the door flew open while the car was moving.
The company, which has issued a software patch to address the problem, reportedly feared that sophisticated thieves armed with simple hacking devices could exploit the glitch.
A similar issue was uncovered earlier this year on BMW, Mini and Rolls-Royce vehicles equipped with so-called ConnectedDrive technology, a web-enabled driver assistance system developed by BMW. During tests, researchers were able to hack into vehicles operating this system and open doors and windows.
Even more dramatically, Chrysler was recently prompted to issue a security patch following a report in Wired describing how researchers remotely hacked into a moving Jeep Cherokee, activated its windscreen wipers, tinkered with its air conditioning, and even shut down the transmission.
It’s unsettling stuff, and won’t do much to reassure the many drivers who are wary about the growing use of driverless systems. What’s more, with connected car technology still, relatively speaking, in its infancy, it’s an issue that’s only likely to become more of a priority for automotive manufacturers and transport legislators.
Indeed, the US government is already in the process of rolling out legislation that will compel manufacturers to work harder to protect cars from hackers, whilst here in the UK the Department for Transport (DfT) is also said to be drafting new regulations.
We shouldn’t perhaps be too alarmed. The fact that these widely reported glitches have been uncovered by security experts rather than criminals is perhaps a sign that industry is on top of the issue. And it’s worth emphasising that the inherent usefulness of connected vehicle technology far outweighs the risks. Indeed, deployed correctly, these systems will help to remove human error and make driving a safer and more efficient process (if a little dull, once the frisson of new technology has worn off).
But recent reports are a reminder that the more connected we become – on the roads, and in our homes and factories – and the more dependent we become on back-end systems, the more vulnerable we are to malicious attacks.
And this is an issue that all areas of industry, not just the automotive sector, will need to take increasingly seriously in the months and years ahead.
If “driverless” systems are or become available, they should first be used to minimise the left-turning HGV problem that is killing so many cyclists on city streets. Solve that problem and the population might begin to believe in the technology.
According to a friend in automotive software, we can expect more than 10,000 software glitches under the bonnet of a car now. The thought of, for example, threatware taking over my display panel and telling me what it can do to me while in the outside lane of the M1 is disconcerting, however unlikely it may be. Recent experience with a hire car on which the engine would not turn off only reinforces my suspicion that work is needed to give me confidence.
For those who doubt or mock, read, say, IEC 61508, where one learns that software is validated not by testing but by checking the process used to develop it (for the lowest possible price) [my parentheses].
Time to market.
More important than safety and security. If you can sell a car you earn more money than if you can make a car safe.
If you voice objections, you “can leave the company if you don’t like it here”.
You want more training? Do it in your spare time. But make sure you are ready for business trips at any time, and answer all e-mails, even in your spare time and on holidays.
And no, we neither pay for your training nor if you miss a lecture while on business trips.
And no, no overtime payment for e-mail answering.
…
I ride a bicycle now.
Unfortunately there are some real idiots in the car (and aviation) industry who really shouldn’t be there. Anyone who thinks connecting an aircraft or car safety-critical system (make that any safety-critical system) to the internet or to a wireless connection is a good idea is an idiot and needs to be fired immediately.
Having isolated on-board systems is fine – the problem comes when you leave your system open to external attack.
The comment that ‘reported glitches have been uncovered by security experts rather than criminals is perhaps a sign that industry is on top of the issue’ is perhaps more worrying in its naivety. If the so-called security experts have discovered a problem, then you can bet that the hackers will also be on to it. It’s just a question of whether there is any financial gain to make it worth their while (at the moment).
However, I am looking forward to car hacking as an excuse for my next speeding ticket:
‘Wasn’t me, gov, it was a hacker in China!’
We need legally enforced standards for development, testing, system audit by independent inspectors, and support for problems for 10 years after a model is last sold.
For example, search for the report by Philip Koopman on Toyota’s “Unintended Acceleration” problem that killed 89 people. Poorly designed hardware (single point of failure) and software (numerous flaws compared to, say, the MISRA guidelines) for something that was safety critical.
We need some minimum enforced standards like the aviation industry. Yes it will cost more, but maybe it will prevent frivolous and risky “features” being added by marketing men.
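By way of illustration, and this is a simplified sketch of my own rather than anything taken from the Koopman report or from Toyota’s actual code, the first function below shows the kind of single point of failure at issue, while the second shows the sort of redundant, plausibility-checked read that a proper safety process would demand:

```c
/* Illustrative sketch only: a hypothetical throttle-pedal read,
 * not code from any real vehicle or from the Koopman report. */
#include <stdio.h>
#include <stdlib.h>

/* Single point of failure: one sensor, trusted blindly.
 * If this ADC channel sticks at full scale, the software has
 * no way of knowing the value is wrong. */
static int read_throttle_single(int adc_value)
{
    return adc_value;
}

/* Redundant read: two independent sensors plus a plausibility
 * check, falling back to a safe idle demand if they disagree. */
static int read_throttle_redundant(int adc_a, int adc_b)
{
    if (abs(adc_a - adc_b) > 20) {
        return 0;               /* sensors disagree: fail safe to idle */
    }
    return (adc_a + adc_b) / 2; /* agreement: use the average */
}

int main(void)
{
    /* Sensor A stuck at full scale, sensor B reading near idle. */
    printf("single sensor : %d\n", read_throttle_single(1023));
    printf("redundant pair: %d\n", read_throttle_redundant(1023, 40));
    return 0;
}
```

The detail of the check matters less than the principle: a single sensor trusted blindly leaves the software with no way to detect its own bad input.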
I think we should be extremely worried. Although electronics have become more and more reliable, they are not infallible as some seem to think. I think the latest spate of collisions involving driverless cars has shown that. If a CME can shut a city down then a tiny vehicle has no chance. EMC, anyone? I also have concerns about the security aspects, which are just too numerous to mention!
Legislation to enforce more rigorous testing/conformance to standards will never be able to keep up with the rate that software systems can evolve, nor with the novel methods hackers will come up with for breaking systems for fun or financial gain.
It will be more effective to simplify the means by which vehicle users can impose very serious financial consequences on manufacturers if their vehicles are hacked.
Daily Telegraph 22 July: ‘Hackers seize control of Jeep Cherokee using laptop plus phone’. Radio came on, wipers wiped, and picture of two hackers (*) appeared on digital display! Brakes cut (disabled) and vehicle went into a ditch.
*’Hackers’ were security researchers, and working with the driver who was ‘in’ on the hack. But what if he wasn’t?
Give me the old-fashioned lock & mechanics any day. You can’t hack that with a computer!
We should be worried by the ‘hackable’ anything: but only, of course, if it is being disturbed by ‘them’. It’s quite all right for ‘us’ to do what we want with technology, democracy, capitalism, market forces, political shenanigans… because ‘we’ are always right. Well, aren’t we?
Mike B
I posted some time ago, when autonomous cars were first touted, that they would be subject to hackers, and lo and behold, the first step is already happening. The current hackers are merely the first step and it is easy to predict that there is worse to come.
It’s comforting to know that the glitches were uncovered by security experts, until you remember that many of the top security experts are reformed hackers.
“reported to be drafting new legislation…”
What did I read? That a bunch of ‘word-smiths’ are trying to create a law (in words) that will stop software designers demonstrating their skills? Waste of time and paper.
It used to be said (by smart lawyers) that “if you pay us enough, we can drive a coach and four horses through any Act of Parliament.” And they regularly did!
Pass as many laws as you like, and if the spirit of the ‘hackers’ [remember that farce – the biggest culprits were the meja and lawyers seeking secrets/commercial advantages in conducting cases] is such, they are now in a position not only to disrupt a potential opponent but to cover their tracks in doing so. Come to think of it, that is what the meja and lawyers do already!
On the day that details emerge of the third most senior person in the State spending a quite improper sum to travel a few miles to give a lecture entitled “bringing some integrity and sense to expenses…” (I paraphrase), pull the other one. It has both bells and whistles.
Mike B
GTA 20.2 will connect real-life cars to video gamers all over the world. Then the dude in China will race against some American in real life in England on the M40.
I think the solution to the left-turning HGV problem is for cyclists to apply a bit of common sense and not undertake a 40-tonne moving vehicle with limited visibility.
No hacking involved there.
No, we should not be “a little worried” about such dumb choices being made. WE SHOULD BE INTENSELY CONCERNED about vehicles being subject to both hacking and just plain failure. The first defect, already in use in a few models, is that big red button that can only send a “please shut off” request to the engine control computer. But if the vehicle is already going 80mph because of a stuck throttle servo, the computer will not switch off the engine because it decides that keeping the power steering on is more important. Engine stop should be a hardware-only control function, although also allowing a computer to switch off the engine would be OK.
As for the other, very real, potential problems posted, all we need to do is look at the example of microsopht, and its many failures, to understand the vast realm of disasters available. So rather than legislating reliability into vehicle software, how about clearly defining the liabilities for that software’s faults and failures? That should be a simpler task, perhaps.
Allan Rhodes is missing the point. Electronics, meaning hardware, generally are now so reliable that the military use them ‘off the shelf’. In times past only ‘Mil-spec’ Electronic stuff was allowed (which meant games consoles were always way ahead of the military). It’s not the hardware that is the problem here. It’s the software.
I worked for a computer company that employed many tens of developers and, to ensure failsafe software, just two software quality engineers to check their work for code errors. Needless to say, what happened was that having a QA department meant the developers were less careful in self-checking, and having only two QA folk meant the overall quality of the software fell dramatically. It also meant they sold the software well before it was ‘safe’.
So… to answer the question ‘Should we be worried?’ Oh yes, very much so, and it seems to me that this is an ‘arms race’ the hackers will always win.