Will You Entrust Your Life To Tesla Autopilot?


It was reported that a Tesla Model S hit a barrier on the highway near Dallas, Texas. The driver, who fortunately wasn’t injured, initially blamed Tesla’s Autopilot for the crash.

We now have footage of the accident, and it shows a situation that Autopilot probably shouldn’t yet be expected to handle. Ultimately, it serves as a reminder not to trust the system without paying attention.

In its current form, Tesla’s Autopilot is only a “driver assist” system, and drivers are asked to keep their hands on the steering wheel, so responsibility falls on the driver. The exception would be an actual malfunction, with Autopilot steering the car out of its lane and into the side of the road, which is almost what we were led to believe happened in this latest accident. As far as we know, however, that has never occurred.

For anyone hoping to buy a driverless, crash-proof car anytime soon, this may be disappointing. What Tesla markets as “Autopilot” is in fact a collection of driver-assistance technologies, including active cruise control to manage speed and automatic steering to keep the car in its lane. Other high-end cars offer similar features. They’re great for backing up humans whose attention drifts from the road, but not yet good enough to take over full-time.

It’s fair to remain skeptical when Musk claims that Autopilot would save 500,000 lives a year if deployed universally. Unless the company releases all of its testing and tracking data, which it declines to do, we can’t possibly verify its calculations. One of the few specific figures the company publicized in its blog post was that Autopilot had been safely used for more than 130 million miles of driving before the first fatality, a higher ratio of miles to deaths than the U.S. or global average. But just one more Autopilot-related fatality tomorrow would undermine that claim. The math required to demonstrate conclusively that Autopilot is safer than human drivers would be more nuanced: it would have to examine injury accidents as well as fatalities, and control for biases such as the recommended use of Autopilot predominantly on highways under favorable driving conditions.
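The fragility of that statistic can be sketched in a few lines of arithmetic. This is purely illustrative: the only figure taken from Tesla's blog post is the 130 million miles; the function and variable names are hypothetical.

```python
# Why a single additional fatality would undermine the
# "130 million miles per fatality" claim.

def miles_per_fatality(miles: float, fatalities: int) -> float:
    """Miles driven per fatality -- the metric the claim rests on."""
    return miles / fatalities

AUTOPILOT_MILES = 130_000_000  # miles on Autopilot before the first fatality

before = miles_per_fatality(AUTOPILOT_MILES, 1)  # the claimed ratio today
after = miles_per_fatality(AUTOPILOT_MILES, 2)   # one more fatality tomorrow

print(f"{before:,.0f} miles per fatality")  # 130,000,000
print(f"{after:,.0f} miles per fatality")   # 65,000,000 -- the ratio halves
```

With only one data point, the ratio is extremely sensitive: a second fatality halves it instantly, which is why a rigorous comparison would need far more data and controls than this headline number provides.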

What we know at this point is that Autopilot can hurt or kill people if used improperly, and that it also has the potential to save them. It’s also fair to assume the technology will get safer over time as Tesla and other companies study and learn from its errors. The only question is whether the public can or should tolerate its rare mistakes in the meantime.
