“Cruise”ing for “Waymo” Lawsuits: Liability in Autonomous Vehicle Crashes

By Caroline Kropka

On October 2, 2023, a driverless vehicle traveled down a San Francisco street.[1] The taxi was one of the roughly 950 autonomous vehicles that Cruise (a robotaxi service owned by General Motors) operated across the United States as of that month.[2]

Ahead, a driver-operated car struck a pedestrian, throwing her into the Cruise’s path. The Cruise braked and, though it could not avoid hitting the pedestrian, came to a complete stop. But the vehicle then suddenly pulled out of traffic, dragging the pedestrian twenty feet and eventually pinning her beneath its tire.[3]

Shortly afterwards, the California Department of Motor Vehicles suspended Cruise’s permits to operate its driverless service in the state, concluding the robotaxis posed an “unreasonable risk to public safety.”[4] In mid-November of 2023, Cruise halted its entire fleet of autonomous vehicles (AVs) amid a safety investigation.[5]

Other complaints lodged against Cruise’s AVs include that they’ve blocked emergency vehicles; collided with fire trucks; gotten stuck in wet concrete; stalled in traffic en masse; run into potholes; and failed to properly recognize pedestrians, particularly children.[6]

But these problems are not unique to AVs like those used by Cruise or Waymo (Alphabet’s robotaxi service). Human drivers do everything listed above and, the robotaxi companies argue, they do so at far higher rates than autonomous vehicles.[7] AVs are even immune to fatigue, intoxication, and the number one cause of traffic accidents: distracted driving.[8]

But regardless of their safety compared to human drivers, autonomous vehicles pose a unique issue for someone injured by one. If there is no driver, who is liable? And to what extent is liability affected by the lack of a human driver?

The locus of liability has begun to settle on the manufacturers and designers of autonomous vehicles.[9] This is unsurprising, as driverless cars wholly unattended by a human driver are now legally allowed on the road in many states.[10] When a human driver is no longer at the wheel to make mistakes, the fault may rest with the programmers of the vehicle’s software.[11] For example, in one of the earliest widely reported AV crashes, a software bug led the vehicle to classify pedestrian Elaine Herzberg as a “false positive,” so it did not slow down for her; she was struck and killed.[12] Herzberg’s family sued Uber, the owner of the vehicle, but whether Uber or its programmers would have been found liable in civil court isn’t clear: the parties reached a confidential settlement.[13] After a police investigation, Uber was found not criminally liable for Herzberg’s death.[14] But the vehicle’s backup safety driver was held criminally liable (she had apparently been streaming The Voice instead of monitoring the road) and was sentenced to three years of probation.[15]

Despite this, recent cases addressing liability in AV crashes are few and far between. One recent California case involved a lawsuit against the manufacturer after a Tesla in Autopilot mode (with a dozing driver) struck and killed a motorcyclist on the side of the road.[16] However, the case was dismissed for forum non conveniens (“an inconvenient forum”).[17] Even though Tesla was headquartered in California, the court ruled Japan to be the proper forum because the plaintiffs (the motorcyclist’s family) were Japanese citizens, the accident occurred in Japan between Japanese citizens, and the Tesla AV was bought in Japan.[18] Because Japan was the more appropriate forum, the court declined to exercise jurisdiction and dismissed the case.

One possible way to predict how liability will come to be applied in crashes involving AVs is to look at how other autonomous technologies are treated.[19] For example, with surgical robots (which partially automate the process of surgery), litigation has extended to attending surgeons, the robots’ manufacturers, and hospitals.[20] In those cases, courts considered the manufacturer’s duty to warn, damage mitigation by the attending surgeon, superseding negligence by the surgeon, and the availability of expert testimony about the surgeon’s duty of care.[21]

Following this framework, liability might extend to a manufacturer that fails to adequately warn of the potential dangers of its AVs, or that overstates their capabilities. Tesla, for example, has been accused by the California DMV of exaggerating the capabilities of its Autopilot feature and downplaying the amount of active driver supervision required.[22]

Judges, at least, appear to treat AV-involved crashes more harshly than crashes involving only human drivers.[23] A recent study asked 531 judges to evaluate a personal injury suit in which, in half the hypotheticals, a pedestrian had been struck by a human driver blinded by sunlight; in the other half, the vehicle was an autonomous robotaxi with a human supervisor.[24] The judges assigned a statistically significantly higher share of fault to the supervisor of the self-driving taxi than to the driver of the human-operated car, and they allowed the pedestrian to recover for her injuries in a significantly higher percentage of the self-driving taxi cases than in the human-operated car cases.[25]

In another window into the minds of judges when it comes to AVs, the Connecticut Supreme Court opined in a footnote: “This may explain why our automakers are experiencing such difficulty perfecting a self-driving car; bad human judgment causes accidents, but the right kind of human judgment seems essential to good driving.”[26]

In other words: the idea of “good driving” carries with it the image of a human actor. When an AV drives “well,” it is simply operating according to its software; when it drives “poorly,” it is because it lacks the capacity to exercise complex judgment.

When we think of the Cruise taxi suddenly pulling to the side, dragging the pedestrian along with it, it’s easy to see how a good human driver would have avoided this situation.[27] The hypothetical good human driver could weigh whether it would be more dangerous to stop in the middle of the roadway or to pull over, and make that choice. In fact, the quote suggests, we might even be more sympathetic if the human driver exercised her judgment and made a poor choice: in a complex situation such as this, “good judgment” may have been within her reach, and she probably believed she was exercising it when she chose how to proceed. The AV never had the possibility of attaining “good judgment” in the first place, and presumably some other actor gave up their own human judgment when allowing it to drive itself.

This all suggests that, whether because of skepticism, a misunderstanding (and fear) of technology, the philosophical importance we place on human judgment, or a desire to hold users of new technology more accountable, the judicial system will treat the operator of a self-driving vehicle less favorably than an identically behaving human driver. A plaintiff injured by an AV may therefore actually be in a better position to recover damages.

Only time (and further litigation) will truly tell how liability will apply to ventures like Cruise. But as Cruise pulls its entire fleet off the roadways, seeks to hire new safety officers, and even retains an outside firm to perform a third-party analysis of its safety systems, one thing is clear: AV companies feel exposed not just to an unfavorable verdict in a court of law, but to a poor verdict in the court of public opinion as well.[28]

After all, the rate of pedestrian deaths from cars is currently the highest it has been since 1981.[29] Yet, in contrast to its response to Cruise’s single documented (non-fatal) collision, the California DMV is not decrying sport utility vehicles, whose design is one of the major contributors to increased fatalities, as an “unreasonable risk to public safety.”[30] More than anything, our perception of driverless cars might remain the biggest influence on their liability in crashes.


[1] Tom Krishner, General Motors Recalls All Cruise Autonomous Vehicles After Dragging a Pedestrian, Detroit Free Press, https://www.freep.com/story/money/cars/general-motors/2023/11/08/gm-general-motors-recall-cruise-autonomous-vehicle-driverless-vehicles/71500603007/ (Nov. 10, 2023, 8:54 AM).

[2] Id.

[3] Id.

[4] Michael Liedtke, California Regulators Suspend Recently Approved San Francisco Robotaxi Service for Safety Reasons, Associated Press, https://apnews.com/article/driverless-cars-cruise-california-robotaxis-8aa872f6b87bbff59e9c86471e87b0e7 (Oct. 24, 2023, 7:06 PM).

[5] Katyanna Quach, Cruise Parks Entire U.S. Fleet Over Safety Fears, The Reg. (Nov. 15, 2023, 8:45 PM), https://www.theregister.com/2023/11/15/cruise_parks_fleet/.

[6] See Aja Seldon, Waymo Says Its Driverless Cars Safer Than Those Driven By Humans, FOX Television Stations (Sept. 8, 2023), https://www.ktvu.com/news/waymo-says-its-driverless-cars-safer-than-those-driven-by-humans (blocking ambulances; striking a fire truck); Yiwen Lu & Cade Metz, Cruise’s Driverless Taxi Service in San Francisco Is Suspended, N.Y. Times (Oct. 24, 2023), https://www.nytimes.com/2023/10/24/technology/cruise-driverless-san-francisco-suspended.html (driving through wet concrete; stalling); Sam Biddle, Cruise Knew Its Self-Driving Cars Had Problems Recognizing Children—and Kept Them On the Streets, The Intercept (Nov. 6, 2023), https://theintercept.com/2023/11/06/cruise-self-driving-cars-children/ (failing to avoid potholes; being unable to recognize children).

[7] See Louise Zhang, Human Ridehail Crash Rate Benchmark, Cruise (Sept. 27, 2023), https://getcruise.com/news/blog/2023/human-ridehail-crash-rate-benchmark/ (presenting data showing Cruise AVs had 65% fewer collisions per million miles than human-operated ridehail vehicles); First Million Rider-Only Miles: How the Waymo Driver Is Improving Road Safety, Waymo (Feb. 28, 2023), https://waymo.com/blog/2023/02/first-million-rider-only-miles-how.html (stating that in its first million driverless miles, Waymo AVs experienced 18 minor-contact crashes, with human driver error present in all 18 of those events).

[8] See Distracted Driving 2020, Nat’l Highway Traffic Safety Admin. (2022), at 1.

[9] Jeffrey J. Rachlinski & Andrew J. Wistrich, Judging Autonomous Vehicles, 24 Yale J.L. & Tech. 706, 718–19 (2022).

[10] These states include Florida, Georgia, Nevada, North Carolina, North Dakota, Utah, West Virginia, Texas, and Tennessee (also allowing an unlicensed driver to ride alone in an AV); New Hampshire (requiring a human driver only if the vehicle is in its testing phase); Arkansas, Alabama, Louisiana (allowing only commercial vehicles to operate autonomously). See Justin Banner, Are Self-Driving Vehicles Legal In My State?, MotorTrend (Jan. 6, 2023), https://www.motortrend.com/features/state-laws-autonomous-self-driving-driverless-cars-vehicles-legal/.

[11] Michael L. Rustad, Products Liability for Software Defects in Driverless Cars, 32 S. Cal. Interdis. L.J. 171, 174 (2022).

[12] Timothy B. Lee, Report: Software Bug Led to Death in Uber’s Self-Driving Crash, Ars Technica (May 7, 2018, 6:12 PM), https://arstechnica.com/tech-policy/2018/05/report-software-bug-led-to-death-in-ubers-self-driving-crash/. See also Timothy B. Lee, Autopilot Was Active When a Tesla Crashed Into a Truck, Killing Driver, Ars Technica (May 16, 2019, 1:10 PM), https://arstechnica.com/cars/2019/05/feds-autopilot-was-active-during-deadly-march-tesla-crash/ (non-autonomous Tesla using Autopilot failed to recognize and stop for truck, killing driver); Tesla Driver in First Self-Drive Fatal Crash, Sky News (July 1, 2016, 3:44 PM), https://news.sky.com/story/tesla-driver-in-first-self-drive-fatal-crash-10330121 (non-autonomous Tesla using Autopilot mistook side of truck for the sky, killing driver).

[13] Scott Neuman, Uber Reaches Settlement With Family of Arizona Woman Killed by Driverless Car, NPR (Mar. 29, 2018, 3:23 AM), https://www.npr.org/sections/thetwo-way/2018/03/29/597850303/uber-reaches-settlement-with-family-of-arizona-woman-killed-by-driverless-car.

[14] Uriel J. Garcia, No Criminal Charges for Uber in Tempe Death; Police Asked to Further Investigate Operator, AZCentral, https://www.azcentral.com/story/news/local/tempe/2019/03/05/no-criminal-charges-uber-fatal-tempe-crash-tempe-police-further-investigate-driver/3071369002/ (Mar. 6, 2019, 10:52 AM).

[15] The driver was charged with negligent homicide, but pled guilty to the reduced charge of endangerment. David Shepardson, Backup Driver in 2018 Uber Self-Driving Crash Pleads Guilty, Reuters (July 28, 2023, 6:12 PM), https://www.reuters.com/business/autos-transportation/backup-driver-2018-uber-self-driving-crash-pleads-guilty-2023-07-28/.

[16] See Umeda v. Tesla, Inc., 2020 U.S. Dist. LEXIS 175286, at *2–3 (N.D. Cal. 2020).

[17] Id. at *2.

[18] Id. at *22–23. However, the court also imposed conditions to ensure that the plaintiffs could file suit in Japan without losing access to relevant evidence. Id. at *23–24.

[19] Madeline Roe, Who’s Driving That Car?: An Analysis of Regulatory and Potential Liability Framework for Driverless Cars, 60 B.C. L. Rev. 317, 320 (2019).

[20] Id. at 328.

[21] See id. at 329–31; see also Taylor v. Intuitive Surgical, Inc., 389 P.3d 517, 520 (Wash. 2017) (stating that a manufacturer of a robotic surgical device had a duty to warn hospitals and physicians, and extending strict liability to the manufacturer despite the surgeon’s failure to follow guidelines for use).

[22] See Steven Musil, Tesla Accused of Falsely Advertising Autopilot and Self-Driving Features, CNET (Aug. 7, 2022),  https://www.cnet.com/roadshow/news/tesla-accused-of-falsely-advertising-autopilot-and-self-driving-features/.

[23] See Rachlinski & Wistrich, Judging Autonomous Vehicles, supra note 9, at 749–50.

[24] Id. at 745–46.

[25] Id. at 748. The judges assigned an average of 52% fault to the self-driving car versus 43% to the human-operated car. In 67% of the self-driving car responses, the pedestrian would have been able to recover damages; in the human driver responses, she would have only been able to recover in 51% of the responses.

[26] Borelli (Estate of Giordano) v. Renaldi, 336 Conn. 1, 124 n.54 (2020).

[27] Such a thought process is obviously appealing; after all, almost three quarters of drivers consider themselves to be above-average drivers. See Ellen Edmonds, More Americans Willing to Ride in Fully Self-Driven Cars, AAA Newsroom (Jan. 24, 2018), https://newsroom.aaa.com/2018/01/americans-willing-ride-fully-self-driving-cars/. Judges, who aren’t fully automated (yet), are presumably just as susceptible to this line of thinking as the rest of us human beings.

[28] See Jamie L. LaReau, Under Fire, GM’s Cruise Suspends Supervised Car Trips, Expands Safety Investigation, Detroit Free Press, https://www.freep.com/story/money/cars/general-motors/2023/11/15/gm-general-motors-cruise/71586950007/ (Nov. 15, 2023, 4:16 PM).

[29] Juliana Kim, U.S. Pedestrian Deaths Reach a 40-Year High, NPR (June 26, 2023), https://www.npr.org/2023/06/26/1184034017/us-pedestrian-deaths-high-traffic-car. In 2022, over 7,500 pedestrians were killed by cars.

[30] See Joel Rose, Taller Cars and Trucks Are More Dangerous for Pedestrians, According to Crash Data, NPR (Nov. 14, 2023, 5:00 AM), https://www.npr.org/2023/11/14/1212737005/cars-trucks-pedestrian-deaths-increase-crash-data (“Vehicles with higher front ends and blunt profiles are 45% more likely to cause fatalities in crashes with pedestrians than smaller cars and trucks”).

 

Image Source: https://commons.wikimedia.org/wiki/File:Waymo_Chrysler_Pacifica_in_Los_Altos,_2017.jpg