In May 2016, we witnessed a historic event: the first fatal crash involving an autonomous car. On an overcast spring day, Joshua Brown was driving in autopilot mode when his Tesla Model S collided with a tractor hauling a white trailer. As the rig unexpectedly turned in front of Brown, neither he nor the autopilot managed to apply the brakes in time. The Model S crashed into the truck at 65 mph, then smashed through a fence before a pole finally stopped the car. The National Highway Traffic Safety Administration officially opened its investigation of the crash this month to determine whether the autonomous software is at fault.

Shortly after the crash, Tesla released a statement claiming that autopilot-equipped cars are still the safest option on the market for drivers. Autonomous cars, the argument goes, eliminate the common human errors associated with fatigue and distraction that cause 93% of accidents each year. Autopilot had been activated for over 130 million miles before this fatality, said Tesla, noting that this far surpasses the average distance driven in the United States between fatal car accidents: roughly 94 million miles. Over 30,000 people die each year in the United States from car accidents, and Tesla is adamant that its technology is the key to saving these lives.

Tesla’s autopilot software is still very new. The Model S was put on the market with autopilot in a beta phase, meaning the software has not been fully tested. Some have criticized this move, arguing it amounts to using drivers as guinea pigs. Tesla responds that it recognizes the software is not perfect, but it promises that “as more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing.”

Tesla tries to account for the risks of using autopilot by having the software remind drivers that autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times,” and that “you need to maintain control and responsibility for your vehicle” while it is in use. The car also continuously monitors whether the driver’s hands are on the steering wheel. If no hands are detected, the vehicle will repeatedly administer reminders to place hands on the wheel and will gradually decrease its speed until the driver complies.

Depending on its outcome, the federal investigation may lead to future litigation. One critical question in any lawsuit might be whether the autopilot’s failure was a foreseeable consequence of an anticipated use, namely encountering a white truck against an overcast sky. A suit might allege that proper testing of this foreseeable scenario, prior to marketing the vehicles, could have uncovered the flaw and allowed Tesla to prevent crashes in this situation. The key will be whether the investigation uncovers evidence showing that Tesla was negligent in releasing technology to consumers that was not thoroughly tested.

If a large number of vehicles are affected by this defect, the cars’ inability to avoid fatal collisions in certain situations might be grounds for a class action suit. Presumably, the cars, along with their pricey autopilot upgrades, are worth less than they otherwise would be. This diminution in value could serve as the basis for classwide damages, even for purchasers whose cars have never been involved in a collision.

Tesla’s defense will be that the software is advertised and designed as a tool that assists the driver: driver participation is required. The driver must remain prepared to take control of the vehicle at all times, and Tesla has fully disclosed these risks to owners and drivers. By activating the autopilot, drivers manifest their assumption of the risk. A driver who lets his attention slip while autopilot is on is just as liable as a driver who falls asleep at the wheel.

In this respect, Tesla’s approach differs from that of other autonomous car manufacturers, such as Google, which does not provide the driver with a steering wheel at all. In Google’s case, it is difficult to see how blame could be pinned on the vehicle’s occupants.

Whether any lawsuit will be filed remains unclear. Since the fatal accident in May, however, at least two other serious accidents have been reported involving Tesla vehicles with autopilot engaged. If you have been injured in a car accident, whether autopilot was engaged or not, you may have a claim. Contact Keane Law LLC for a free consultation.