“A large cyber-terrorist attack targeting the operating systems of many self-driving vehicles simultaneously could cause mass casualties” – that is the alarming scenario presented by MPs after their investigation into autonomous cars on British roads.
After a 15-month inquiry, the transport select committee has issued a hard-hitting report highlighting the hazards self-driving vehicles could create.
The committee concludes: “Self-driving vehicles pose cybersecurity risks, broadly because of their connected rather than automated capabilities.
“Cyberattacks may prove more dangerous if a vehicle is self-driven.”
Ashley Feldman, Transport and Smart Cities programme and policy manager at the trade body Tech UK, set out several “significant” risks to the committee, saying: “Your steering, your braking, your acceleration and even the operation of the airbags … could be taken over by a malicious actor.”
Jesse Norman, minister for Decarbonisation and Technology, told the MPs: “In some respects it looks like it is inevitable that there will be [cyberattacks] and that some of them will be successful.”
A year ago the government said: “By 2025, the UK will begin to see deployments of self-driving vehicles, improving ways in which people and goods are moved around the nation and creating an early commercial market for the technologies.
“This market will be enabled by a comprehensive regulatory, legislative and safety framework, served by a strong British supply chain and skills base, and used confidently by businesses and the public alike.”
The Highway Code already addresses the behaviour of drivers of autonomous vehicles, saying: “While a self-driving vehicle is driving itself in a valid situation, you are not responsible for how it drives. You may turn your attention away from the road and you may also view content through the vehicle’s built-in infotainment apparatus.”
At present self-driving vehicles are largely limited to a bus shuttle across the Forth Road Bridge in Scotland and the Heathrow Pod, which connects Terminal 5 with a car park along a dedicated track.
In April 2023 the government gave permission for Ford to activate its BlueCruise self-driving system on 2,300 miles of UK motorway – though the technology is currently available only on the 2023 Ford Mustang Mach-E, a pure electric vehicle costing upwards of £50,000.
The government told the inquiry that “by 2035, 40 per cent of new cars in the UK could have self-driving capabilities”.
Ahead of the wider deployment of autonomous vehicles, MPs on the committee investigated the opportunities and risks.
The conclusion: “It is a crucial time for self-driving vehicles. Their potential to revolutionise transport is obvious, but, as the technology matures and real-world uses become less hypothetical, many challenges remain.
“There is a broad range of possible uses for self-driving vehicles from HGVs and buses to taxis and private cars. We believe they have the potential to improve transport connectivity with significant safety, productivity, and mobility benefits.”
But the report says: “Over the last decade, progress in this technology has failed to meet many of its promoters’ predictions, and this has bred understandable cynicism. Safety must remain the government’s overriding priority as self-driving vehicles encounter real-world complexity.”
The MPs warn of the risk of erosion of driver skills, saying: “Greater automation will reduce time spent driving. Over time drivers may become less practised and therefore less skilled. Conversely, the demands on drivers will grow as they will be called upon to retake control of vehicles in challenging circumstances with little notice.
“The government should set out a strategy for the future of human driving in a world of self-driving vehicles. This should include possible changes to driving tests and a plan to ensure that all drivers fully understand self-driving vehicles and both acquire and maintain the necessary skills for taking control of a vehicle in all circumstances.”
Becky Guy, road safety manager for England for the Royal Society for the Prevention of Accidents (RoSPA), told MPs on the committee: “The role of the driver effectively moves from operating the vehicle to becoming a system supervisor. The real challenge is keeping that person engaged and in the loop of the vehicle.”
The government believes self-driving vehicles could make the roads safer, arguing in a recent publication: “Self-driving vehicles won’t get tired or distracted. They won’t worry about the children in the back seat, stress about their next meeting or be anxious to get home for dinner. They are likely to react more quickly than a human, remaining consistently able to assess how to drive safely in a fraction of a second.”
But Professor Nick Reed from Reed Mobility told the committee: “The risk is that the driver is not ready to resume control and that control is handed back. I think we will need monitoring of the driver, or the occupant of the driving seat, to show that they are indeed ready and able to take over, and also that the vehicle is capable of managing the situation should the driver not be ready or able.”
Steve Gooding, chief executive of the RAC Foundation, summed up the different degrees of driver involvement in an autonomous vehicle as “hands off, eyes off, nod off”.
“Hands off means that the vehicle will drive itself, but you need to keep alert,” he said. “Eyes off means that the vehicle will look after itself entirely. Nod off means that you can go to sleep and the vehicle will take you where you want to go.”
Alex Kendall, co-founder and CEO of self-driving AI technology company Wayve, said: “We welcome today’s report which sets out the urgent need for legislation so the UK can reap the benefits of self-driving vehicles and remain a leader in this rapidly growing sector.
“At Wayve, we’re developing technology which powers self-driving vehicles using embodied AI. Self-driving is an inspiring example of frontier AI in action, and we’re proud to be helping to develop technology that will cement the UK’s global leadership in AI.”
What the Highway Code says about self-driving vehicles
A self-driving vehicle’s ability to drive itself may be limited to certain situations or parts of a journey. Things like the type of road, time of day, weather, location and speed may affect this. You should follow the manufacturer’s instructions about when and how to use the self-driving function safely.
While a self-driving vehicle is driving itself in a valid situation, you are not responsible for how it drives. You may turn your attention away from the road and you may also view content through the vehicle’s built-in infotainment apparatus, if available.
But you must still follow all relevant laws. You must be fit to drive (for example, you must be within the drink-drive legal limits and not be under the influence of drugs).
The vehicle must be road legal (for example, it must have an MOT certificate, if applicable, and it must be taxed and insured). The vehicle must be roadworthy. You will also still be responsible for your passengers and anything else you are carrying.
If a self-driving vehicle needs to hand control back to the driver, it will give you enough warning to do this safely. You must always be able and ready to take control, and do it when the vehicle prompts you. For example, you should stay in the driving seat and stay awake. When you have taken back control or turned off the self-driving function, you are responsible for all aspects of driving.