The recent fatal Tesla accident in Mountain View, California, in which a Model X crashed into a concrete safety barrier whose crash attenuator had already collapsed, throws serious doubt on self-driving cars as the top automobile trend for 2018. The driver, Walter Huang, was an Apple engineer on his way to work when the accident happened. He had complained about his car’s Autopilot feature and had told his dealer that Autopilot veered toward a barrier 7-10 times — the very same one his Model X eventually hit. Unfortunately, the dealership could not reproduce the veering tendency.
Tesla confirmed that Autopilot was engaged but that the driver ignored several prompts and had both hands off the steering wheel in the moments before the crash. In a blog post, the company stated that its data show Tesla drivers had driven this same stretch of highway over 85,000 times since Autopilot’s roll-out in 2015, and that about 200 successful trips were made on the same road without incident.
The crash attenuator was also unable to cushion the impact of the collision; in proper condition, it could have saved Huang’s life. Apparently, the barrier Huang’s Tesla slammed into had collapsed 11 days earlier in a similar accident. According to Tesla, “Autopilot was engaged with the adaptive cruise control follow-distance set to minimum.” Furthermore, Huang’s hands were off the steering wheel for around six seconds, and he had an unobstructed view of the crushed barrier for five seconds.
The investigation also showed a break between the asphalt and cement, along with two white lines, where Huang swerved while heading toward the 85 carpool lane. Sean Price, science director of an environmental start-up, told Dan Noyes as they recreated the fatal route in a similar Model X: “I mean, you have to think like a computer, right? A computer doesn’t know. It has no logic, so if it sees a line, it might think that’s a lane.” In effect, the Autopilot on the SUV may have guided it into the barrier — which would also explain why it displayed the same tendency in the past.
Despite Tesla’s popularity, this was not an isolated incident. Only a month before the Mountain View crash, a Tesla Model S rear-ended a fire truck in Culver City, California. And in 2016, another driver died in a Florida crash with his Model S’s Autopilot engaged. Investigations suggest that the Tesla Autopilot system may not reliably detect and avoid obstacles in its path.
“Autosteer is not designed to, and will not, steer Model S around objects partially or completely in the driving lane.” — From the Tesla Owner’s Manual
There are many “ifs” in this case, one being whether those five seconds would have been enough to prevent the crash or at least lessen its impact. Either way, the incident is a grim reminder that drivers should not relinquish total control. Present technology is not ready for fully autonomous driving, yet drivers have a false sense of security. They are increasingly unmindful of the dangers of giving the car complete control of the wheel, especially when tired, sleepy, drunk, or otherwise unfit to drive. You can relax a little, but staying attentive and responsive to prompts and warnings, keeping your hands on the steering wheel, and keeping your feet near the pedals will make driving safer for you and everyone you encounter on the road. Accountability works both ways: driver and carmaker.