Could your driverless car decide to kill you - if it would save other lives?

It sounds like a rather far-fetched Hollywood sci-fi plot – but there’s a chance that your driverless car might one day decide to kill you.

Your car won’t murder you out of the blue, of course: it would only do it if, by killing you, it could save the lives of several pedestrians.

The technology is evolving – as are the laws around it – but most experts believe the cars will be programmed to save lives.

But whose lives will be saved? Yahoo spoke to industry experts to find out.

Self-driving cars promise to be much safer than cars driven by people: RoSPA research shows that 95% of accidents involve human error, and that in 76% of cases a human is solely to blame.

But the probability of an accident will never drop to zero – and in some cases, software will have to decide who lives and who dies.

Bart Jacobsz Rosier of app-enabled electric scooter start-up Bolt says this is liable to become a very real question very soon: he believes that within a decade, 90% of production cars will have self-driving capabilities.

Lesser of two evils

FiveAI wants to launch a trial programme in 2019.

Rosier says, ‘There are scenarios where it will need to choose between the lesser of two evils. That choice will be defined by the value attached to a certain outcome.

‘That will in essence mean that (in a completely balanced scenario) you will be able to programme a car to have a preference towards people outside or inside the vehicle.’
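To make that idea concrete, here is a minimal, purely illustrative sketch of how a ‘value attached to a certain outcome’ might look in software. Every name and number in it – the Outcome class, the weights, the example scenarios – is a hypothetical toy for this article, not anything drawn from a real manufacturer’s system.

    # Toy "lesser of two evils" chooser - illustrative only, all values hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Outcome:
        """One possible result of an unavoidable collision."""
        description: str
        occupants_at_risk: int    # people inside the vehicle
        pedestrians_at_risk: int  # people outside the vehicle

    def outcome_cost(outcome: Outcome, occupant_weight: float = 1.0,
                     pedestrian_weight: float = 1.0) -> float:
        """Score an outcome: lower is better.

        The two weights express the 'preference towards people outside or
        inside the vehicle' that Rosier describes. Equal weights make a
        completely balanced scenario effectively a coin toss.
        """
        return (occupant_weight * outcome.occupants_at_risk
                + pedestrian_weight * outcome.pedestrians_at_risk)

    def choose(outcomes: list[Outcome], **weights) -> Outcome:
        """Pick the outcome with the lowest cost under the given weights."""
        return min(outcomes, key=lambda o: outcome_cost(o, **weights))

    if __name__ == "__main__":
        scenarios = [
            Outcome("swerve into barrier", occupants_at_risk=1, pedestrians_at_risk=0),
            Outcome("stay on course", occupants_at_risk=0, pedestrians_at_risk=10),
        ]
        # Neutral weights: the car sacrifices its passenger to save the many.
        print(choose(scenarios).description)
        # Heavily owner-biased weights: the car protects the passenger at all costs.
        print(choose(scenarios, pedestrian_weight=0.05).description)

The point of the sketch is simply that the ethical choice collapses into a pair of tunable numbers – which is exactly why who gets to set them matters.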

Previous MIT research shows that most people agree a driverless car should sacrifice its passenger to save, say, 10 pedestrians, with 76% agreeing that cars should be programmed in this way.

But when people were asked if they would buy a car programmed like this, most said they would prefer to buy one programmed to protect the passenger.

Dr Iyad Rahwan, from the Massachusetts Institute of Technology (MIT) Media Lab, said: ‘Most people want to live in a world where cars will minimise casualties, but everybody wants their own car to protect them at all costs.’

Would you buy a car that would kill you?

Rosier agrees, saying that any car owner – if given the choice – would buy a vehicle which prioritises their own life.

‘Ethically speaking you’d almost need it to be a coin toss,’ he says, discussing scenarios where a car might ‘choose’ who dies.

Rosier says, ‘When ownership comes into play – e.g. you owning the car – it would be weird if the car preferred someone outside the car to live rather than someone inside.’

Can you truly own a self-driving car?

But what might change is the whole idea of car ownership, Rosier suggests.

Experts have suggested that when self-driving cars become more popular, the idea of ‘owning’ a car may change.

So, for instance, a person might summon a self-driving car to work, then summon one on the way home, much like one does an Uber today – saving on parking fees and other expenses of having a car ‘waiting’ 24 hours a day.

Rosier says, ‘I think [what happens in an accident scenario] will depend on the degree to which ownership is shared.

‘If this is the case, then you can build the most ethical preference into the system. If not, you have to choose a side.’

Danger

Having cars which are programmed to protect their owners at all costs may actually introduce danger on the roads, Rosier says.

Rosier says, ‘A sense of loyalty, or bias towards the vehicle owner’s life could create dangerous situations and cause more harm than good. Again, it comes down to the way that machines will measure and be able to quantify the risks on the road, as they happen.’

Legal issues

If driverless cars are adopted widely, it’s almost certain that the laws around driving will change – and that could include laws governing what their software can (and can’t) do.

So while people might want a car which protects them at all costs – regardless of how many lives are lost – it might end up being impossible to buy one, as no manufacturer would want the legal responsibility.

Nick Rogers, head of motor and partner at law firm BLM, says, ‘Legislators globally will be wary of trying to cover every eventuality that the algorithms controlling a fully automated vehicle might encounter.

‘Instead legislation might require the architects of the algorithms to meet certain globally agreed standards, promoted by the UN ECE, for the autonomous operation of vehicles within defined parameters (such as motorway driving) where activity is more predictable.

‘The vehicle manufacturers and their software programmers are in turn likely to seek statutory protections from “unforeseeable” combinations of events in less well defined environments. Vehicles may therefore become “self-limiting” in their fully autonomous operation.’