
Who Pays When a Driverless Vehicle Crashes?

New car technology is making crash liability more complicated. Regulators and courts must determine if humans, software, or manufacturers are at fault.

Written by Doug Bailey, Senior Content Writer

Doug Bailey is a senior content writer at Insurify. Doug is an experienced business writer who worked for more than a decade as a reporter and business editor at the Boston Globe, covering financial services and the insurance industry. Most recently, Doug was a regular contributor to InsuranceNewsNet, a news and information service for the insurance and financial industry.

Doug is a native New Englander hailing from Maine and works in Insurify’s Cambridge office.

Edited by Chris Schafer, Deputy Managing Editor, News and Marketing Content
  • 15+ years in content creation

  • 7+ years in business and financial services content

Chris is a seasoned writer and editor with experience across myriad industries, including insurance, SaaS, finance, Medicare, logistics, and marketing/advertising.

Reviewed by John Leach, Licensed P&C Agent, Chief Copy Editor
  • Licensed property and casualty insurance agent

  • 10+ years editing experience

  • NPN: 20461358

John is Insurify’s Chief Copy Editor, helping ensure the accuracy and readability of Insurify’s content. He’s a licensed agent specializing in home and car insurance topics.



Advertiser Disclosure

At Insurify, our goal is to help customers compare insurance products and find the best policy for them. We strive to provide open, honest, and unbiased information about the insurance products and services we review. Our hard-working team of data analysts, insurance experts, insurance agents, editors, and writers has put in thousands of hours of research to create the content found on our site.

We do receive compensation when a sale or referral occurs from many of the insurance providers and marketing partners on our site. That may impact which products we display and where they appear on our site. But it does not influence our meticulously researched editorial content, what we write about, or any reviews or recommendations we may make. We do not guarantee favorable reviews or any coverage at all in exchange for compensation.

Why you can trust Insurify: Comparing accurate insurance quotes should never put you at risk of spam. We earn an agent commission only if you buy a policy based on our quotes. Our editorial team follows a rigorous set of editorial standards and operates independently from our insurance partners. Learn more.


When a crash occurs, auto insurance usually follows a clear process: determine who caused the accident and pay damages.

But things get complicated when driver-assistance systems like Tesla Autopilot, Ford BlueCruise, and GM Super Cruise, or driverless cars like Waymo's, are involved.

Was the driver the human, the software, or both? The answer affects who pays — the driver’s insurer, the automaker’s insurer, a supplier, or a combination of them.

“Technology in modern vehicles is advancing faster than most laws can adapt,” writes attorney Alex Stalvey in his blog for the law firm Bannister, Wyatt & Stalvey LLC. “These systems blur the line between driver and machine. This has become especially prevalent in the courtroom and at the negotiating table in recent months.”

Vehicle driver-assist and self-driving technologies are classified using the SAE (Society of Automotive Engineers) Levels of Driving Automation, which run from Level 0 (no automation) to Level 5 (full automation, which doesn't yet exist on the road).

Level 2 driver-assist systems are common in the market

Many cars today use Level 2 driver-assist systems: partially or fully "hands-free" features that the driver must supervise. These cars can steer, brake, and accelerate in certain situations.

The National Highway Traffic Safety Administration requires manufacturers and operators to report certain crashes involving automated driving systems (ADS) and Level 2 ADAS. This helps the agency find patterns and investigate possible defects.

But this reporting policy has fueled debate. Because these systems are labeled "assist," the human driver remains legally responsible. Yet more plaintiffs now argue that the technology encourages overreliance. They contend that the marketing of these systems implies the vehicle exercises some control, and that its maker should therefore share responsibility in accidents.

“We are planning toward a day when the lawsuits will be directed against hardware and technical builders,” says Rami Sneineh, vice president and licensed insurance producer at Insurance Navy in Chicago. “The physical operation of the car is being transferred to the data center. Liability is no longer on who struck the brakes, but who composed the code.”

SAE Levels of Driving Automation

  • Level 0: No automation. Includes basic safety warnings, such as blind-spot and lane-departure alerts, but doesn't take control.

  • Level 1: Driver assistance. Systems that help with a single task, such as adaptive cruise control or lane-keeping assist.

  • Level 2: Partial automation (includes most "hands-free" systems). The car simultaneously controls steering and speed, but drivers must stay attentive and be prepared to take over instantly.

  • Level 3: Conditional automation (very limited availability). The vehicle drives itself in specific conditions. The human driver can look away from the road but must take over when prompted.

  • Level 4: High automation. The car drives without human backup within a geofenced area or other defined locations and conditions, such as mapped cities or specific zones.

  • Level 5: Full automation (no consumer product yet). The car drives itself everywhere, all the time, with no steering wheel or pedals.

A jury tags Tesla with punitive damages in an Autopilot crash

An example of this shift followed a 2019 crash in Key Largo, Florida, involving Tesla's Autopilot. One vehicle occupant died, and a second was badly injured.

A jury found Tesla partly responsible for the accident, even though the vehicle had only a Level 2 system. The jury awarded more than $240 million in damages, including $200 million in punitive damages, to the victims and their families.

The Tesla driver, who caused the accident, admitted he was distracted by his cell phone, but the jury still decided Tesla shared responsibility. The plaintiffs argued that Autopilot’s design and marketing played a role in the crash.

Punitive damages are rare in product cases like this. But courts and juries may be considering whether the technology can make drivers too relaxed. A 2024 study by the Insurance Institute for Highway Safety called on automakers to improve safeguards so that drivers don’t overly rely on automation for safety.

Ford BlueCruise and ‘stationary vehicle at night’ crashes

Federal regulators have focused on two fatal crashes involving Ford's BlueCruise driver-assist system. Both occurred at night, and in each case a Ford Mustang Mach-E hit a stopped vehicle at highway speed.

Regulators say they’re examining whether the system properly detects stationary vehicles and if its driver-monitoring safeguards are adequate. The investigation is ongoing.

Regulatory investigations aren’t the same as courtroom liability, but they can guide future cases. If investigators find a defect or a known problem that wasn’t fixed, it can lead to lawsuits and, in some cases, recalls.

Waymo and the ‘no driver to blame’ problem

In San Francisco, a cyclist sued after a passenger exiting a Waymo vehicle opened a door into a bike lane, striking her. The suit contends that Waymo's Level 4 "Safe Exit" system failed to prevent the incident.

The case is ongoing, with arguments revolving around whether Waymo or the passenger bears primary liability for the incident.

NHTSA also closed a separate investigation into Waymo after reports of unexpected behavior and minor crashes. The agency noted that Waymo took corrective steps, including recalls and software fixes, illustrating how "liability" can involve regulators and product safety processes, not just insurance claims.

“Shared responsibility is common in accidents, and when you’ve got something like self-driving cars in a crash, you can expect that it could involve all parties, and that will definitely complicate things,” Jae E. Lee, managing partner at Jae Lee Law in New Jersey, told Insurify.

“Today’s automated systems available to the general public don’t fully automate the process, so the driver is still required to pay attention and be ready to take control,” she says. “When they don’t, they’re at fault.”

If a manufacturer’s defect or a software issue causes an accident, the manufacturer is likely responsible, Lee says. “Our role in the process is using investigation and evidence to establish causation, which can determine if one or all of them are to blame.”

What this means for car insurance claims

While insurers are still the first to pay in most claims, experts say driver-assist technology can affect what happens after the initial claim:

  • Disputes over fault are becoming more technical, with greater demand for vehicle data, system status, camera logs, event data recorders, and phone records. The key questions are: Was the system on? Did it give a warning? Did the driver react?

  • If your insurer pays your claim but believes an automation system defect played a role, it may seek to recover costs from a manufacturer or supplier. This process can take months or years and may not change your immediate payout, but it can affect long-term costs.

  • Premiums and underwriting questions are under more pressure. As lawsuits and claims costs go up, insurers may ask for more details, such as what system you used, if your software was current, and if you followed the system’s limits.

“This is becoming a product liability issue that could involve layered responsibility,” Ryan Perdue, partner and trial attorney at Simon Perdue Law in California, told Insurify. “The data points make this case more complex because autonomous and semi-autonomous vehicles generate enormous amounts of information, from sensor logs to driver monitoring data, all of which becomes central evidence.”

“Litigation often becomes a battle between engineers and software experts rather than eyewitness testimony,” he says. “The questions become ‘What did the car know, and how did it decide?’ That shift has major implications for how juries understand fault and causation.”

Perdue says driverless systems could gradually push the insurance industry to a product-centered liability model. If the vehicle is operating autonomously, insurers may pursue manufacturers under product defect theories rather than focusing on the policyholder’s conduct.

“We might also start seeing hybrid frameworks where vehicles require human supervision but still require automated decision-making,” he says. “Until full autonomy becomes the norm, the outcome will likely be shared liability arguments with each side pointing to the other’s role in the chain of events.”

What’s next? Practical takeaways for drivers

Experts say drivers operating vehicles that use driver assistance should:

  • Think of it as an advanced cruise control, not a self-driving system. Regulators and courts still consider humans responsible for Level 2 systems.

  • After a crash, document whether the system was on, what prompts it displayed, and the road conditions.

  • Don’t assume saying “the car was driving” will remove responsibility. Cases thus far have trended toward shared fault. The Tesla verdict shows that courts can hold manufacturers accountable, but drivers are still usually responsible as well.

The bigger picture is that the U.S. is slowly creating new liability rules as cases happen, through jury verdicts, NHTSA investigations, and more crash reports. Until the law becomes clearer, this debate will keep showing up in insurance claims and, more often, in courtrooms.
