The Law Office Of David W. Holub

Who Do You Sue When Autopilot Causes Injuries?

Transcript:

Hi, I’m Indiana personal injury attorney David Holub.

This video is inspired by a call in which the caller wanted to know: if you buy a car with autopilot technology and that technology causes a crash, who is legally responsible?

The caller did not specifically mention Tesla, but Tesla is back in the news because its cars have reportedly been involved in crashes into emergency first-responder vehicles.

Federal safety regulators are investigating at least 11 accidents involving Tesla automobiles using a feature called “Autopilot” that put the car in autonomous mode (meaning the car was driving itself).

The National Highway Traffic Safety Administration released information regarding these accidents. It stated that seven of the accidents resulted in 17 injuries and one death.

The accidents occurred over a 3-1/2 year period from January 2018 to July 2021, and all happened after nightfall. The accidents occurred as the Tesla owners approached post-accident scenes involving first-responder vehicles, flashing lights, and road cones.

According to Tesla’s website…

“Autopilot is an advanced driver assistance system that assists your car with steering, accelerating, and braking for other vehicles and pedestrians within its lane. They assist with the most burdensome parts of driving and work alongside features like emergency braking, collision warning, and blind-spot monitoring.”

Perhaps the computers controlling Autopilot could not adjust to the flashing lights surrounding a nighttime accident scene. And since this appears to be a recurring problem that has the attention of the NHTSA, you would think Tesla would make adjustments to this feature.

Tesla’s attorneys likely will argue that use of the Autopilot feature is the sole responsibility of the vehicle’s driver. But will warnings and disclaimers protect a carmaker that profits by selling a product it claims can drive itself?

Robot cars, or autonomous driving vehicles, are supposed to be very common by 2025. In fact, many manufacturers expect that 5G wireless networks will make self-driving cars the primary means of transportation.

For now, we are just seeing occasional headlines about self-driving cars. Soon we’ll either own one or know people who do.

But will a self-driving car be able to avoid collisions without human intervention and do so safely and securely without risking injury to passengers in the event of a crash?

Do you want to risk injury because of a crash caused by an algorithm that does not correctly recognize the edge of a cliff? Or a self-driving car that rear-ends you because you forgot to turn on your headlights (and taillights) when you left the grocery store, and its sensors could not discern your car on the highway? Or one that misreads a red traffic signal as green?

When a crash happens, how do you hold a machine responsible for injuring a person?

My educated guess, and that is all it is, is that the law will hold vehicle owners legally responsible for driving mistakes made by the vehicle, and will hold manufacturers accountable for manufacturing defects that contributed to causing the crash.

Right now, when an airbag malfunctions, the carmaker and the component supplier (for example, Takata in the case of Takata airbags) can be held legally responsible.

I hope you found this information helpful. If you are a victim of someone’s carelessness, please call (219) 736-9700 with your questions. You can also learn more about us by visiting our website at DavidHolubLaw.com. We also invite you to subscribe to our weekly podcast: Personal Injury Primer, where we break down the law into simple terms, provide legal tips, and discuss personal injury law topics.