By: Will MacIlwaine
Over the past few years, auto manufacturers have been experimenting with autopilot features that, in certain situations, essentially allow a vehicle to drive itself. One such vehicle is the Tesla Model S. A recent software update for the Model S allows it to “use its unique combination of cameras, radar, ultrasonic sensors and data to automatically steer down the highway, change lanes, and adjust speed in response to traffic.”[1] Further, this Tesla model has the ability to search for a parking space once the driver has arrived at his or her destination, and will even parallel park the vehicle on its own.[2]
The Tesla must obtain certain data before it can enter autopilot mode.[3] Among other things, there must be clear lane lines and a consistent travel speed, and the car must be able to sense other vehicles around it.[4] Tesla points out on its website that, although the vehicle does most of the driving for the consumer, drivers must still keep their hands on the steering wheel.[5] Even so, there have been reports of Tesla drivers taking pictures of themselves with their hands off the steering wheel, drinking coffee, reading the paper, or even riding on the roof while the car does the driving.[6]
Tesla claims that the autopilot feature can make dealing with traffic easier, safer, and more pleasant.[7] Some drivers of Tesla vehicles with autopilot features, and of other similar vehicles, would beg to differ.
This past week, reports surfaced of an accident involving a Tesla vehicle that occurred in January of 2016.[8] The accident took place in China, when the Tesla, thought to be in autopilot mode, failed to brake and slammed into a road sweeper, killing the driver.[9]
Later, in May of this year, a man was killed when the autopilot feature failed to recognize the white side of a tractor-trailer against the bright sky, and the brakes were not applied.[10]
Legally, what’s at stake for Tesla in introducing this innovative feature? The National Highway Traffic Safety Administration (“NHTSA”) classifies car automation by levels ranging from one to four.[11] One is akin to a standard car, while four denotes a fully capable self-driving vehicle.[12] According to attorney Gabriel Weiner, Tesla’s autopilot feature is similar to a level two classification.[13] Drivers could be fooled by the term “autopilot” and falsely believe that they do not have to be fully alert when the feature is being used.[14] If this is the case, and Tesla fails to warn customers to always remain alert, the company could be liable if the autopilot feature caused an accident. On the other hand, Tesla’s owner’s manuals state that the driver is still responsible for controlling the vehicle, and a message reminding the user to keep his or her hands on the wheel and to be prepared to take over at any time is displayed on the vehicle’s center screen when the feature is in use.[15]
If a user sees these messages and decides to ignore them, Tesla may well escape liability, as the user’s conduct could be seen as an implied assumption of risk. Under that theory, if the vehicle user knows and understands the danger that the autopilot feature presents and still voluntarily chooses to use it, there would likely be no liability on the part of Tesla.
Against a claim that Tesla acted negligently in selling a car with the autopilot feature, Tesla could also make a contributory negligence argument. By failing to keep hands on the steering wheel, or by not paying attention to the road, the driver could be contributorily negligent if an accident were to occur.
There are certainly other questions surrounding the autopilot feature. For one, who is legally responsible for a car crash if the car is driving itself?[16] Tesla? The owner of the car? What implications might autopilot malfunctions have on an owner’s driver’s license? Will an owner get points on his or her license, or worse, lose it, if the autopilot feature causes a crash?
The technological breakthrough that the autopilot feature offers is obviously not perfect, and it may take years to fully refine this advancement in the automotive industry. That being said, the question remains: will consumers continue to utilize this compelling feature, potentially sacrificing safety for convenience?
[1] Model S Software Version 7.0, https://www.tesla.com/presskit/autopilot (last visited Sept. 17, 2016).
[2] See id.
[3] See Ryan Bradley, Tesla Autopilot, MIT Technology Review, https://www.technologyreview.com/s/600772/10-breakthrough-technologies-2016-tesla-autopilot/ (last visited Sept. 17, 2016).
[4] See id.
[5] See Model S Software Version 7.0, supra note 1.
[6] See Bradley, supra note 3.
[7] See Model S Software Version 7.0, supra note 1.
[8] See Neal E. Boudette, Autopilot Cited in Death of Chinese Tesla Driver, N.Y. Times (Sept. 14, 2016), http://www.nytimes.com/2016/09/15/business/fatal-tesla-crash-in-china-involved-autopilot-government-tv-says.html.
[9] See id.
[10] See Bill Vlasic & Neal E. Boudette, Self-Driving Tesla Was Involved in Fatal Crash, U.S. Says, N.Y. Times (June 30, 2016), http://www.nytimes.com/2016/07/01/business/self-driving-tesla-fatal-crash-investigation.html.
[11] See William Turton, Tesla’s Autopilot Driving Mode is a Legal Nightmare, Gizmodo (July 23, 2016), http://gizmodo.com/teslas-autopilot-driving-mode-is-a-legal-nightmare-1783280289.
[12] See id.
[13] See id.
[14] See id.
[15] See id.
[16] See id.
Photo Source:
https://fortunedotcom.files.wordpress.com/2015/12/gettyimages-492682174.jpg?quality=80