Tesla Autopilot Fails, Blames Driver For Recent Crash into Fire Vehicle
Tesla crashes always seem to make the news, and the May 12th crash in South Jordan is no exception. In that crash, a woman driving a Tesla Model S in Autopilot mode collided with a stopped fire vehicle. According to KSL News, she had been driving in Autopilot mode with no driver input for 80 seconds before the crash. Now, investigators say the Tesla was actually accelerating during the 3.5 seconds before impact and that the driver hit the brakes only a fraction of a second before the collision.
This Tesla, like other premium cars on the market, has what is called Adaptive Cruise Control. This means that while the cruise control (and automatic steering) can be set at a chosen speed by the driver, the car will slow down (or speed up) depending on the speed of the car in front. This is convenient for rush hour traffic and waiting to get on the freeway.
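The follow-the-leader behavior described above can be sketched in a few lines. This is a toy illustration of the general adaptive cruise control idea, not Tesla's actual algorithm; the function name and the mph inputs are assumptions for the example, and real systems fuse radar and camera data rather than taking a single lead-vehicle speed.

```python
def acc_target_speed(set_speed_mph, lead_vehicle_speed_mph=None):
    """Toy adaptive-cruise-control logic (illustrative only):
    match a slower lead vehicle, otherwise resume the
    driver-selected set speed."""
    if lead_vehicle_speed_mph is None:
        # No lead vehicle detected: resume the pre-set speed.
        return set_speed_mph
    # Never exceed the driver's set speed; slow to match a slower lead car.
    return min(set_speed_mph, lead_vehicle_speed_mph)

# Following a car doing 55 mph with cruise set to 60 mph:
print(acc_target_speed(60, 55))    # 55
# The lead car changes lanes and is no longer detected:
print(acc_target_speed(60, None))  # 60
```

The second call mirrors what reportedly happened here: once the lead car left the lane, the system resumed its pre-set speed.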
Apparently, in this case the lead vehicle the Tesla had been following, which had caused the Tesla to slow to 55 mph, changed lanes, causing the Tesla to speed back up to its pre-set speed of 60 mph. Unfortunately, the car (and driver) didn’t notice another vehicle stopped directly in its path: a bright red fire department maintenance truck.
The driver of the car, Heather Lommatzsch, was later cited with a misdemeanor. By her own admission, she was on her phone just prior to the crash, but said she thought the car’s automatic braking system would identify the truck in front of her and safely brake before impact. The driver pointed out to law enforcement that during the time she has owned the car, she has used the “semi-autonomous” feature numerous times and on many different types of roads. She further said the car provided no warning just before the crash.
In response to the crash, Tesla released the following statement: “Contrary to the proper use of Autopilot, the driver did not pay attention to the road at all times, did not keep her hands on the steering wheel, and she used it on a street with no center median and with stoplight-controlled intersections.”
Unfortunately, the driver of the truck suffered whiplash, while Ms. Lommatzsch suffered a broken foot (hopefully not a Lisfranc injury).
Legally speaking, it would normally be hard to defend someone who admitted they were on their phone at the very moment of a crash. And even though the Tesla is “semi-autonomous,” the driver still needs to pay attention. As KSL commenter Gekko stated:
“I don’t really care if Autopilot was on or off. Driver’s gotta drive.”
I totally agree with this.
However, I feel that Tesla bears some of the blame here and is trying to throw this driver under the bus (or fire truck) to cover up what seems to be a vehicle malfunction on its end. Having driven a Tesla and being familiar with its Autopilot functions, I say this for two reasons:
First, the car, equipped as it was with Autopilot technology, should have detected a vehicle in front of it and braked accordingly. We know it had detected the car it was following just before the lane change, since it slowed for that car and sped up once the car left its lane. The fact that it did not detect a large fire truck directly ahead suggests the system malfunctioned just when it should have been working to protect the car, its passenger(s), and other motorists from a collision. If the system had been working, it would have slowed and stopped the car the same way it does in rush hour traffic when traffic suddenly slows or stops.
Second, Tesla sent out an update over two years ago that automatically installed on its cars and included a pre-crash audible warning and emergency braking if the driver didn’t stop in time. This collision avoidance system apparently did not activate in this case as Tesla says the car was actually accelerating into the truck up until a fraction of a second before impact.
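A common way such collision avoidance systems are described as working is by estimating time-to-collision and escalating from an audible warning to emergency braking. The sketch below is a simplified illustration of that general approach under assumed thresholds; it is not Tesla's actual implementation, and the function name and threshold values are invented for the example.

```python
def collision_response(distance_m, closing_speed_mps,
                       warn_ttc_s=2.5, brake_ttc_s=1.5):
    """Toy forward-collision logic (illustrative only): estimate
    time-to-collision (TTC) and escalate from an audible warning
    to emergency braking as TTC shrinks. Thresholds are assumed."""
    if closing_speed_mps <= 0:
        return "none"  # not closing on the obstacle
    ttc = distance_m / closing_speed_mps
    if ttc <= brake_ttc_s:
        return "emergency_brake"
    if ttc <= warn_ttc_s:
        return "audible_warning"
    return "none"

print(collision_response(50.0, 10.0))  # TTC = 5.0 s -> "none"
print(collision_response(20.0, 10.0))  # TTC = 2.0 s -> "audible_warning"
print(collision_response(10.0, 10.0))  # TTC = 1.0 s -> "emergency_brake"
```

If logic along these lines had engaged, a warning should have sounded well before impact; according to the driver and the reported acceleration data, neither the warning nor the braking occurred.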
Finally, there’s the human factor: a user who has relied on the system for years and grown accustomed to it naturally trusts that it will continue to work as it has in the past. As we see in the story, the driver is essentially saying that the Autopilot and collision avoidance features had always worked for her, at least up to the time this crash happened.
The auto-driving feature of the Tesla is a big reason why people buy this car. It’s not fair for Tesla to build up this feature as something that enhances safety and convenience and then blame the driver when something goes awry as if they had nothing to do with it.
This raises the question of whether Tesla has received reports of other, similar incidents. If Tesla is on notice that its system can suddenly stop working, I think it needs to warn its customers of that possibility so they stop putting blind trust in a system that could seriously fail them.
Ron Kramer is a Utah personal injury lawyer with offices in West Jordan handling cases throughout Utah.