Self-driving car users shouldn't be accountable for accidents, report suggests
UK's governments to consider legal changes to define what is and isn't a 'self-driving' vehicle
Drivers should be reclassified when operating a fully autonomous vehicle and not be held legally accountable for any road accidents, a new report has proposed.
Rather, the company behind the driving system should be responsible, according to the law commissions for England, Wales and Scotland.
The law commissions were asked to look into self-driving cars back in 2018, with the ultimate goal of publishing a series of reports proposing a regulatory framework for autonomous vehicles and their use on public roads.
While driverless car technology has been deployed in various settings, the technology is still not ready to be fully implemented on the UK's roads.
The report, which will be laid before UK, Scottish and Welsh governments for consideration, would potentially force wholesale changes for both car makers and road safety laws.
It recommends that a clear definition of 'autonomous cars' is legally stipulated to avoid future confusion. This, it states, should create a "clear bright line" between systems that require attention and those that do not. This would help tackle the "problem of passivity", according to the report, which cites human behaviour research showing that people find it far more difficult to monitor a task passively than when they are fully engaged with it.
"Once their eyes and minds wander away from the road, they have limited ability to respond appropriately to events," the report said. "They should not be held accountable for failing to notice problems."
In 2018, an Uber test vehicle hit and killed Elaine Herzberg as she attempted to cross a road with her bicycle. Reports have since suggested the car had software faults, specifically that the system couldn't distinguish Herzberg from the bike. However, there was also a safety driver in the car, who was reportedly watching a TV show on a phone at the time of the crash.
It's incidents such as this that the report is looking to address. It states that any system that still requires "passive" attention is flawed, and recommends that a new "authorisation" scheme is needed to decide whether an autonomous vehicle is or is not 'self-driving' as a matter of law.
This would mean that laws would need to change so that the person in the driving seat is no longer classified as a 'driver'. Instead, they would become the 'user-in-charge' and have immunity from a range of offences related to the way the vehicle drives, including forms of dangerous driving and exceeding the speed limit.
Other makers of cars with autonomous features, such as Tesla, have already seen a number of fatalities involving their self-driving systems. Most recently, two men died when their Tesla veered into a tree and caught fire. An investigation into the crash is still ongoing, with law enforcement and Tesla boss Elon Musk disputing whether the car's autonomous features were engaged and, as such, at fault for the crash.
This accident is also relevant to the law commissions' report because Tesla allegedly failed to hand over the car's data to the police. Under the proposed laws, such data must be made accessible, and there would also be sanctions for car makers that fail to reveal how their systems work.