Eleven more crash deaths are linked to automated-tech vehicles including Teslas

18 October 2022, 20:14

Cars wait at a red light during rush hour on the Las Vegas Strip. Picture: PA

Ten of the deaths involved vehicles made by Tesla.

Eleven additional people were killed in US crashes involving vehicles using automated driving systems during a four-month period earlier this year, according to newly released government data, part of an alarming pattern of incidents linked to the technology.

Ten of the deaths involved vehicles made by Tesla, though it is unclear from the National Highway Traffic Safety Administration (NHTSA)’s data whether the technology itself was at fault or whether driver error might have been responsible.

The 11th death involved a Ford pickup truck.

Four of the deaths came in crashes involving motorcycles during the spring and summer – two in Florida and one each in California and Utah.

Safety advocates note that the deaths of motorcyclists in crashes involving Tesla vehicles using automated driver-assist systems such as Autopilot have been increasing.

The new fatal crashes are documented in a database that NHTSA is building in an effort to broadly assess the safety of automated driving systems, which, led by Tesla, have been growing in use.

Tesla alone has more than 830,000 vehicles on US roads with the systems. The agency is requiring car and tech companies to report all crashes involving self-driving vehicles as well as cars with driver assist systems that can take over some driving tasks from people.

The 11 new fatal crashes, reported from mid-May until September, were included in statistics that the agency released on Monday.

In June, the agency released data it had collected from July of last year until May 15.

The figures released in June showed six people died in crashes involving the automated systems and five were seriously hurt.

Of the deaths, five occurred in Teslas and one in a Ford.

In each case, the database says that advanced driver assist systems were in use at the time of the crash.

Michael Brooks, executive director of the nonprofit Center for Auto Safety, said he is baffled by NHTSA’s continued investigations and by what he called a general lack of action since problems with Autopilot began surfacing back in 2016.

“I think there’s a pretty clear pattern of bad behaviour on the part of Tesla when it comes to obeying the edicts of the (federal) safety act and NHTSA is just sitting there,” he said.

“How many more deaths do we need to see of motorcyclists?”

Mr Brooks said the Tesla crashes are victimising more people who are not in the Tesla vehicles.

“You’re seeing innocent people who had no choice in the matter being killed or injured,” he said.

A message was left on Tuesday seeking a response from NHTSA.

Tesla’s crash number may appear elevated because it uses telematics to monitor its vehicles and obtain real-time crash reports.

Other carmakers lack such capability, so their crash reports may emerge more slowly or may not be reported at all, NHTSA has said.

NHTSA has been investigating Autopilot since last August, following a string of crashes dating to 2018 in which Teslas collided with emergency vehicles parked along roadways with their flashing lights on.

That investigation moved a step closer to a recall in June, when it was upgraded to what is called an engineering analysis.

In documents, the agency raised questions about the system, finding that the technology was being used in areas where its capabilities are limited and that many drivers were not taking steps to avoid crashes despite warnings from the vehicle.

NHTSA also reported that it has documented 16 crashes in which vehicles with automated systems in use hit emergency vehicles and trucks that were displaying warning signs, causing 15 injuries and one death.

Tesla and SpaceX CEO Elon Musk. Picture: Hannibal Hanschke/Pool Photo/AP

The National Transportation Safety Board (NTSB), which also has investigated some of the Tesla crashes dating to 2016, has recommended that NHTSA and Tesla limit Autopilot’s use to areas where it can safely operate.

The NTSB also recommended that NHTSA require Tesla to improve its systems to ensure that drivers are paying attention. NHTSA has yet to act on the recommendations. The NTSB can make only recommendations to other federal agencies.

Messages were left on Tuesday seeking comment from Tesla. At the company’s artificial intelligence day in September, CEO Elon Musk asserted that, based on the rate of crashes and total miles driven, Tesla’s automated systems were safer than human drivers — a notion that some safety experts dispute.

“At the point of which you believe that adding autonomy reduces injury and death, I think you have a moral obligation to deploy it,” Musk said.

“Even though you’re going to get sued and blamed by a lot of people. Because the people whose lives you saved don’t know that their lives were saved. And the people who do occasionally die or get injured, they definitely know, or their estate does, that it was, whatever, there was a problem with Autopilot.”

Tesla has more than three million vehicles with automated systems on the road, Musk said.

“That’s a lot of miles driven every day. And it’s not going to be perfect. But what matters is that it is very clearly safer than not deploying it.”

In addition to Autopilot, Tesla sells “Full Self-Driving” systems, though it says the vehicles cannot drive themselves and that motorists must be ready to intervene at all times.

The number of deaths involving automated vehicles is small compared with the overall number of traffic deaths in the US.

Nearly 43,000 people were killed on US roads last year, the highest number in 16 years, after Americans returned to the roads as the pandemic eased.

Authorities blamed reckless behaviour such as speeding and driving while impaired by drugs or alcohol for much of the increase.

By Press Association
