Eleven more crash deaths are linked to automated-tech vehicles including Teslas

18 October 2022, 20:14

Cars wait at a red light during rush hour on the Las Vegas Strip (Picture: PA)

Ten of the deaths involved vehicles made by Tesla.

Eleven additional people were killed in US crashes involving vehicles using automated driving systems during a four-month period earlier this year, according to newly released government data, part of an alarming pattern of incidents linked to the technology.

Ten of the deaths involved vehicles made by Tesla, though it is unclear from data released by the National Highway Traffic Safety Administration (NHTSA) whether the technology itself was at fault or whether driver error might have been responsible.

The 11th death involved a Ford pickup truck.

Four of the deaths involved motorcycle crashes that occurred during the spring and summer: two in Florida and one each in California and Utah.

Safety advocates note that the deaths of motorcyclists in crashes involving Tesla vehicles using automated driver-assist systems such as Autopilot have been increasing.

The new fatal crashes are documented in a database that NHTSA is building in an effort to broadly assess the safety of automated driving systems, which, led by Tesla, have been growing in use.

Tesla alone has more than 830,000 vehicles on US roads with the systems. The agency is requiring car and tech companies to report all crashes involving self-driving vehicles as well as cars with driver assist systems that can take over some driving tasks from people.

The 11 new fatal crashes, reported from mid-May until September, were included in statistics that the agency released on Monday.

In June, the agency released data it had collected from July of last year until May 15.

The figures released in June showed six people died in crashes involving the automated systems and five were seriously hurt.

Of the deaths, five occurred in Teslas and one in a Ford.

In each case, the database says that advanced driver assist systems were in use at the time of the crash.

Michael Brooks, executive director of the nonprofit Center for Auto Safety, said he is baffled by NHTSA’s continued investigations and by what he called a general lack of action since problems with Autopilot began surfacing back in 2016.

“I think there’s a pretty clear pattern of bad behaviour on the part of Tesla when it comes to obeying the edicts of the (federal) safety act and NHTSA is just sitting there,” he said.

“How many more deaths do we need to see of motorcyclists?”

Mr Brooks said the Tesla crashes are victimising more people who are not in the Tesla vehicles.

“You’re seeing innocent people who had no choice in the matter being killed or injured,” he said.

A message was left on Tuesday seeking a response from NHTSA.

Tesla’s crash number may appear elevated because it uses telematics to monitor its vehicles and obtain real-time crash reports.

Other carmakers lack such capability, so their crash reports may emerge more slowly or may not be reported at all, NHTSA has said.

NHTSA has been investigating Autopilot since last August, prompted by a string of crashes dating to 2018 in which Teslas collided with emergency vehicles parked along roadways with flashing lights on.

That investigation moved a step closer to a recall in June, when it was upgraded to what is called an engineering analysis.

In documents, the agency raised questions about the system, finding that the technology was being used in areas where its capabilities are limited and that many drivers were not taking steps to avoid crashes despite warnings from the vehicle.

NHTSA also reported that it has documented 16 crashes in which vehicles with automated systems in use hit emergency vehicles and trucks that were displaying warning signs, causing 15 injuries and one death.

Tesla and SpaceX CEO Elon Musk (Hannibal Hanschke/Pool Photo/AP)

The National Transportation Safety Board (NTSB), which also has investigated some of the Tesla crashes dating to 2016, has recommended that NHTSA and Tesla limit Autopilot’s use to areas where it can safely operate.

The NTSB also recommended that NHTSA require Tesla to improve its systems to ensure that drivers are paying attention. NHTSA has yet to act on the recommendations. The NTSB can make only recommendations to other federal agencies.

Messages were left on Tuesday seeking comment from Tesla. At the company’s artificial intelligence day in September, CEO Elon Musk asserted that, based on the rate of crashes and total miles driven, Tesla’s automated systems were safer than human drivers — a notion that some safety experts dispute.

“At the point of which you believe that adding autonomy reduces injury and death, I think you have a moral obligation to deploy it,” Musk said.

“Even though you’re going to get sued and blamed by a lot of people. Because the people whose lives you saved don’t know that their lives were saved. And the people who do occasionally die or get injured, they definitely know, or their state does, that it was, whatever, there was a problem with Autopilot.”

More than three million Teslas with automated systems are on the road, Musk said.

“That’s a lot of miles driven every day. And it’s not going to be perfect. But what matters is that it is very clearly safer than not deploying it.”

In addition to Autopilot, Tesla sells “Full Self-Driving” systems, though it says the vehicles cannot drive themselves and that motorists must be ready to intervene at all times.

The number of deaths involving automated vehicles is small compared with the overall number of traffic deaths in the US.

Nearly 43,000 people were killed on US roads last year, the highest number in 16 years, after Americans returned to the roads as the pandemic eased.

Authorities blamed reckless behaviour such as speeding and driving while impaired by drugs or alcohol for much of the increase.

By Press Association
