Back in 2016, Elon Musk claimed that Tesla cars could "drive autonomously with greater safety than a person. Right now."
![](https://static.wixstatic.com/media/1c4fd3_b46fd263f81c4e44b7573ff905214270~mv2.jpg/v1/fill/w_980,h_515,al_c,q_85,usm_0.66_1.00_0.01,enc_auto/1c4fd3_b46fd263f81c4e44b7573ff905214270~mv2.jpg)
It was a lie, one that sent Tesla’s stock price soaring—and made Musk among the wealthiest people on the planet.
That lie is now falling apart in the face of a recall of more than 2 million Teslas, as Ed Niedermeyer reported for Rolling Stone.
It's also revealing to the broader public what close observers of Tesla have always known (and the company itself admits in the fine print of its legal agreements): Tesla’s so-called “self-driving” technology works fine—as long as there’s a human behind the wheel, alert at all times.
Tesla’s dangerous and hype-happy approach to driving automation technology has been one of the most important stories hiding in plain sight. The company’s own fine print says that the owner bears all legal responsibility for everything the system does.
By telling its customers that its cars are almost self-driving while designing them without guardrails, Tesla induces inattention, then blames the victim.
Just as with the Mechanical Turk of 1770, everyone has been so focused on the technology itself that they’ve missed the human factors that power the entire spectacle.
Just as worryingly, regulators have missed that forcing humans to babysit incomplete systems introduces entirely new risks to public roads.
If you read the official notice for Tesla’s recall of more than 2 million vehicles equipped with Autopilot, the thing that jumps out is that it’s not really about a defect in the Autopilot technology itself. The problem, strangely enough, has everything to do with humans.