Who Will Be at Fault for a Self-Driving Vehicle Crash?
“All of your tires are flat. Pull over now.” … That’s the message I got from my new car when the temperature plunged to below zero. Of course, the tires were not flat. All that had happened was that the pressure sensors had detected a pressure change in the tires due to a decrease in the outdoor temperature. Nevertheless, the car expressed extreme concern and seemed genuinely frightened, insisting that I pull over to avoid a crash.
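The physics behind that false alarm is simple: in a sealed tire, pressure scales with absolute temperature. A rough sketch (my own illustration, not from the article — the specific numbers and the warning threshold are assumptions) of why a cold snap alone can trip a low-pressure warning:

```python
# Gay-Lussac's law: at (approximately) constant volume, absolute pressure
# scales with absolute temperature, so P2 = P1 * (T2 / T1).
# Gauge pressure = absolute pressure minus atmospheric pressure.

ATM_PSI = 14.7  # approximate atmospheric pressure at sea level, psi


def gauge_pressure_after_temp_change(gauge_psi, temp_c_before, temp_c_after):
    """Return the new tire gauge pressure (psi) after a temperature change."""
    t1_kelvin = temp_c_before + 273.15
    t2_kelvin = temp_c_after + 273.15
    abs_before = gauge_psi + ATM_PSI
    abs_after = abs_before * (t2_kelvin / t1_kelvin)
    return abs_after - ATM_PSI


# Hypothetical scenario: a tire filled to 35 psi in a 20 C (68 F) garage,
# then parked outside overnight at -18 C (about 0 F).
new_psi = gauge_pressure_after_temp_change(35.0, 20.0, -18.0)
print(round(new_psi, 1))  # roughly 28.6 psi
```

A drop from 35 psi to roughly 28.6 psi is in the neighborhood of the significant-underinflation threshold many tire pressure monitoring systems warn at, so the sensor is reporting real data; it is the car's interpretation ("all of your tires are flat") that is wrong.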
It got me thinking. Do people really think that reliable cars that drive themselves can be created? Certainly, some people do. But I’m not one of them. Sure, tire sensors could be carefully programmed to report correctly to an onboard computer system that can weigh multiple variables. But a self-driving car will be one of the most complex pieces of machinery ever mass-produced, and a myriad of driving conditions will need to be anticipated. Having robots confined to a factory perform routine mechanical manipulations to assemble cars and trucks is relatively easy compared to anticipating all of the possible things that might be encountered on a highway.
What would happen if a freeway were filled with autonomous cars that all of a sudden thought their tires were flat, and they all tried to pull to the side of the road at the same time and simply stopped? What if just one of hundreds of autonomous cars experienced a breakdown on a freeway? How do you program a robotic vehicle to sense oil on pavement after a tanker truck flips and spills? Semi-tractor trailer combinations require careful handling in high winds and on slick pavement; how is that environment sensed, and how is the data handled by the vehicle’s master control CPU? Would all robot cars react the same way to fog and just come to a stop, or would high-end luxury cars with infrared sensors that see through the fog do better? And what if the high-end cars are blocked by el cheapo robot cars without infrared sensors?
Locally, the other day there was a traffic jam caused by a high-voltage power line breaking and cascading down onto the roadway. How do you program a robot car to deal with such an extremely rare highway danger? How do you declare a robot car fit to drive if it cannot deal with rare dangers such as downed trees or rock slides? How do you deal with a bunch of cars that have become confused, misinterpreted their environment, and decided movement is unsafe, when a human would be able to safely avoid the road hazard that caused the robots to shut down? It seems like it would be much easier to program a spaceship to circle Mars, take some pictures, and come back to Earth than to just “train” a car to know when to dodge a pothole in order to avoid a collision and when to just power through it. What should a robot car do when rain starts pouring violently, or a wind gust creates a highway hazard?
Then we have the big questions: can a robot car be negligent? Is the owner responsible for a car’s negligent driving, or is the manufacturer of the car liable when it carelessly hurts someone? Or both?
Is there utility in attempting to develop a self-driving car or truck? Perhaps to some people there is, but far too many people overestimate humanity’s ability to develop such a product and make it safe for mass use. Consider just the warning label you would have to put on it before letting people buckle up for a ride.
But in any case, if you’re in a crash with a self-driving car or truck, or a negligently and carelessly driven ordinary car, truck, bus, motorcycle, bicycle, or other type of motorized vehicle, and you have been hurt due to the fault of someone else, give us a call at +1 (219) 736-9700. We’ll try to sort things through with you and help if we can.