Confusion over self-driving technology may increase car crashes

Drivers could be confused about their car's ability to avoid an accident


More self-driving technology is a good thing, according to the UK’s car insurers – but greater clarity is needed to ensure safety for all road users.

They want a clearly defined and regulated distinction between ‘assisted’ and ‘automated’ driving systems, arguing that the current blurring of the two is creating potentially dangerous confusion among drivers and could actually lead to an increase in accidents.

The Automated Driving Insurer Group (ADIG), which includes virtually all of the UK’s car insurers, has released ‘Regulating Automated Driving’, the results of studies carried out with leading UK automotive test organisation Thatcham Research.

The paper states that there is a real danger of ‘autonomous ambiguity’ over the widely varying levels of driverless technology available on the latest cars going on sale. This could lead to drivers thinking their car is better able to avoid an accident than it really is, which would increase rather than decrease the likelihood of a crash.

Insurers support the growth of assisted-driving and self-driving technology, as they believe it will significantly reduce road accidents. They are highly supportive of driver assistance systems, such as those that act in the brief moments before a collision to aid the driver’s reaction by applying emergency braking. Fully automated vehicles, which require no driver intervention at all, also gain their support, though these are still very much in development.

Autonomous technology is advancing fast, leading to confusion.

Taking back control

The issue comes with cars that can carry out most manoeuvres unaided by the driver, but will expect them to react and intervene (potentially at very short notice) in an emergency situation. This, say the insurers, raises significant concerns about public confusion and safety, particularly as different types of systems could be available on similar vehicles at the same time.

According to the research, vehicle manufacturers argue that such systems can be safe, provided that drivers use them ‘as intended’. But it is not clear how different drivers will understand and use these types of systems.

The development of autonomous technology is moving fast and the insurers are calling on international regulators to ensure that new design standards for such vehicles clearly distinguish between Assisted and Automated systems.

According to the paper, a car should only be described as Automated when the driver knows they can leave it to control itself, and that it will be able to cope with virtually any road situation that might arise. If the car encounters something it can’t handle, it should be able to bring itself to a safe stop without the driver intervening.

The autonomous system should also be able to avoid all conceivable crash types and carry on working even if some of its systems fail. And after an accident, manufacturers and insurers should be able to access data showing clearly whether the driver or the vehicle is liable.

Nissan Leaf autonomous self-driving car

What’s in a name?

Drivers also need to know what they are buying, say the insurers, so the names manufacturers give to their autonomous systems need to make clear what they can – and can’t – do. Drivers must not be led into believing a system can take full control of a car when it can’t, and ‘hybrid’ systems, sitting in the grey area between assisted and automated, should be avoided.

Tesla, in particular, has come in for considerable criticism in the USA over its use of the name Autopilot to describe the semi-autonomous driving system on its cars. Consumer groups say that the name encourages drivers to put too much trust in the car’s ability to drive itself, when in reality the driver may need to intervene urgently to avoid an accident in common driving situations.

Car manufacturers have also long been guilty of applying their own names to universal technology, which further causes confusion among car buyers and drivers. Since the 1990s, there have been a variety of different names and acronyms to describe electronic stability control (ESC), such as ESP, DSC, VDC, VSC, DSTC, ICCS and so on. Currently, a number of different names are used to describe autonomous emergency braking systems. It’s no wonder customers get confused.

According to Thatcham Research CEO Peter Shaw, the risk of ‘autonomous ambiguity’ could actually result in a short-term increase in crashes involving such vehicles. “Vehicles with intermediate systems that offer assisted driving still require immediate driver intervention if the car cannot deal with a situation,” he says.

“Systems like these are fast emerging and unless clearly regulated, could convince drivers that their car is more capable than it actually is.”

Cars with suites of cameras reading the road will become commonplace in coming years.

Andrew Charman
Andrew is a road test editor for The Car Expert. He is a member of the Guild of Motoring Writers, and has been testing and writing about new cars for more than 20 years. Today he is well known to senior personnel at the major car manufacturers and attends many new model launches each year.