The bill finally comes due for Elon Musk
Source: The Verge
For almost as long as he’s been CEO of Tesla, Elon Musk has been bullshitting us about self-driving cars.
In 2016, he said Tesla self-driving cars were “two years away.” A year later, it was “six months, definitely,” and customers would be able to actually sleep in their Tesla in “two years.” In 2018, it was still a “year away” and would be “200 percent safer” than human driving. In 2019, he said there would be “feature complete full self-driving this year.” Not a year has gone by without Musk promising the imminent arrival of a fully driverless Tesla.
This week, it’s finally here. Or at least that’s what Musk says.
On October 10th, Tesla will reveal its long-awaited “robotaxi,” a supposedly fully autonomous vehicle that Musk has said will catapult the company into trillion-dollar status. It will be some combination of “Uber and Airbnb,” Musk said during a recent earnings call, allowing Tesla owners to serve as landlords for their driverless cars as they roam about the cityscape, picking up and dropping off strangers. And it will be futuristic in its design, with Bloomberg reporting that it will be a two-seater with butterfly wing doors. Musk has been calling it the “Cybercab.”
The event, which will be held on the film lot of Warner Bros. in Burbank, California, will be the culmination of almost a decade of blown deadlines and broken promises from Musk, a moment when the richest man in the world will finally be forced to stop hiding behind his own bluster and actually show us what he’s been working on.
It’s a vulnerable time for Tesla. The company’s sales slumped in the first half of the year, as rising competition in the US and China dimmed Tesla’s star. Musk is fighting to reclaim his enormous $56 billion pay package, all while spreading misinformation on his social media platform and stumping for former President Donald Trump. And now there’s this product event, Tesla’s first since the unveiling of the Cybertruck in 2019.
Based on past Tesla events, don’t expect Musk to follow through on all his promises.
It seems likely that we’ll see a cool demo of a stylish-looking prototype, allowing Musk to claim a kind of victory for first impressions, even when the rough outlines of what he promises will barely hold up to scrutiny. The exaltations from bullish investors will give him enough cover to continue to make misleading declarations about what is and isn’t autonomous. And the safety experts and competitors who try to warn about the dangers of his approach will likely be drowned out or dismissed by his most ardent fans.
But either it works or it doesn’t. Waymo and others have already shown the world what real driverless technology looks like. It’s imperfect and it’s limited, but it’s undeniable. If Musk fails to deliver or shows off some obvious vaporware, his reputation — and Tesla’s stock price — could take a real hit.
“He’s really grasping for straws,” said Mary “Missy” Cummings, a robotics expert and former senior safety official at the National Highway Traffic Safety Administration. “He’s so desperate to try to drive more money into this equation that he’s doing things like this [event].”
“The hardware needed”
I first started covering Tesla for The Verge in 2016, the same year that Musk made one of his first predictions about the imminent arrival of self-driving cars. “You’ll be able to summon your car from across the country,” he said, citing as an example a Tesla owner beckoning their vehicle to drive solo from New York to meet him in Los Angeles. The company went even further in a blog post, boasting that “all Tesla vehicles produced in our factory — including Model 3 — will have the hardware needed for full self-driving capability at a safety level substantially greater than a human driver.”
That post has since been deleted from Tesla’s site, along with the company’s first “Master Plan,” as Musk attempts to scrub Tesla’s past of all his overreaching pronouncements.
But more importantly, these kinds of statements fooled a lot of people into thinking the shiny new electric car in their driveway had everything it needed to be fully autonomous, and that those futuristic capabilities were just around the corner. Elon Musk would flip the switch and — presto — millions of cars would suddenly transform into robots. The media bought into it, portraying Tesla as being on the cusp of a historic evolution. And soon enough, the company’s stock started reflecting this attitude, especially after Tesla defied expectations with the Model 3.
Of course, none of it was true. Nearly a decade later, no Tesla vehicle on the road today is autonomous. Sure, the company has rolled out a series of brashly branded driver-assist features — first Autopilot, then Navigate on Autopilot, then Full Self-Driving, and finally Full Self-Driving (Supervised) — but they do not enable the car to drive without constant human supervision.
You can’t sleep in your Tesla. You can’t summon it across town, let alone across the country. If you crash, you will be liable for what happens and who gets hurt. And if you attempt to fight the company on any of that, you will probably lose.
Even those Tesla owners lured into thinking their vehicles were incognito robots would soon realize the cost of the company’s obfuscations. In 2021, Tesla first started offering subscriptions to its long-awaited Full Self-Driving feature, including a $1,500 hardware upgrade for those early owners who were wrongly informed that their vehicle would have “the hardware needed” for full autonomy. (It was later lowered to $1,000 after customer outcry.)
There are plenty of people using Full Self-Driving (Supervised) today who will happily tell you how great it is and how they can’t imagine life without it. (Many also have YouTube channels they want to promote.) They will also argue over the semantics of autonomy. Shouldn’t something that controls the acceleration, braking, steering, and navigation also get to be called autonomous?
In the absence of data from Tesla, it’s impossible to say with any certainty how good or bad FSD is. Crowdsourced projects like the FSD Community Tracker are extremely limited, covering a scant 200,000 miles of driving; Tesla says over 1 billion miles have been driven using FSD. But even the tracker’s tiny snapshot shows an average of just 119 miles between critical disengagements. Waymo, by comparison, drove 17,000 miles between disengagements in 2023, according to the California DMV.
While Tesla chased a much broader vision, Waymo leapt forward by pursuing something more workable: remove the driver entirely and restrict the geography in which the vehicle can operate. Google, from which Waymo spun out in 2016, has long argued that advanced driver-assistance systems like Autopilot and FSD are inherently problematic. After all, human supervisors get bored and eventually zone out. The handoff between the vehicle and the driver can be fraught. It’s better to just cut the human out of the equation altogether.
Tesla is now latching onto Waymo’s better vision by unveiling a fully autonomous vehicle of its own: the robotaxi. This is the vehicle that can silence all of those doubters. After all, Waymo doesn’t sell cars; it sells a service. Tesla sells cars. And wouldn’t it be infinitely cooler to own your own self-driving vehicle?
“You’re killing people”
Tesla likes to say that Autopilot — and later FSD — is saving lives. In fact, Musk has gone even further, declaring any criticism of its driver-assistance products amounts to murder. “You need to think carefully about this,” he said in 2016, “because if, in writing some article that’s negative, you effectively dissuade people from using an autonomous vehicle, you’re killing people.”
At the same time, he said that Tesla had no plans to assume legal liability for crashes or deaths that occurred when Autopilot was in use unless it was “something endemic to our design.”
Even in the annals of Musk quotes that have aged poorly, these rank up there. At the time, only one person had died while using Autopilot — a reflection, perhaps, of the small number of Tesla vehicles on the road. Now, there are over 2 million Teslas all over the globe and a substantially higher number of deaths.
Federal regulators are currently investigating at least 1,000 individual Tesla crashes involving Autopilot and FSD, crashes in which at least 44 people died. Investigators found that Autopilot — and, in some cases, FSD — was not designed to keep the driver engaged in the task of driving. Drivers would become overly complacent and lose focus. And when it came time to react, it was too late.
Tesla has pushed out numerous updates to FSD over the years, so it can be tough to pin down what exactly is wrong with Tesla’s approach. Often, users flag a problem — the vehicle fails to recognize certain signage or a specific driving maneuver — and almost as quickly, Tesla has an update available. That seems like a good thing — Tesla is responsive to problems and moves quickly to fix them — until you remember that real people’s lives are at stake. And the pedestrians and cyclists outside the vehicle never consented to participating in this experiment to teach cars to drive themselves.
Even the most recent version of the FSD software has its faults. An independent research firm recently tested versions 12.5.1 and 12.5.3 over more than 1,000 miles and found the system to be “surprisingly capable, while simultaneously problematic (and occasionally dangerously inept).” When errors occur, “they are occasionally sudden, dramatic, and dangerous.” In one instance, the group’s Tesla Model 3 ran a red light in the city at night, even though the cameras clearly detected the lights.
FSD is the foundation for the robotaxi. Everything has been leading up to this moment. But the system struggles with basic perception issues, like wet roads and sunlight glare. FSD struggles to recognize motorcyclists: a 28-year-old motorcycle owner was killed outside of Seattle earlier this year by a Model S driver who was using the driver-assist feature.
Tesla used to publish quarterly safety reports that it claimed proved Autopilot was safer than regular human driving, but it abruptly stopped in 2022. The reports resumed this year with new figures: one crash for every 6.88 million miles of Autopilot-assisted driving, versus one for every 1.45 million miles without it. That makes Autopilot over four times safer than normal human driving, according to Tesla.
This is the only safety data we have for the driver-assist technology that is supposed to be a precursor to the fully autonomous robotaxi. But according to Noah Goodall, a civil engineer who has published several peer-reviewed studies of Tesla Autopilot, the company’s safety reports fail to account for basic facts about traffic statistics — for example, that crashes are more common on city streets and undivided roads than on the highways where Autopilot is most often used. That led him to conclude that Tesla may be miscounting crashes in order to make Autopilot seem safer than it actually is.
“They fell apart pretty quickly, once you dove in just a little bit,” Goodall told me. “I have trouble publishing on this sometimes. Just because the reviewers are like, ‘Everyone knows these are fake, why are you pointing this out?’”
“A monumental effort”
If there’s one thing on which everyone can agree, it’s that Tesla has a lot of data. Nearly 5 million Tesla vehicles are on the road globally, each one sending huge amounts of information back to the mothership for processing and labeling. Other companies, with only a fraction of the real-world miles, have to use simulated driving to fill in the gaps.
But the sheer volume of data that Tesla is processing is overwhelming. The company relies on a small army of data annotators who review thousands of hours of footage from Tesla owners and the company’s in-house test drivers. And according to Business Insider, those workers are pushed to move quickly through as many images and videos as they can or face disciplinary action. Accuracy is secondary to speed.
“It’s a monumental effort,” Cummings, the robotics expert, said. “People think Teslas are learning on the fly. They have no idea how wrong they are, and just how much human preparation it takes to actually learn anything from the terabytes of data that are being gathered.”
Tesla’s approach to the hardware of driverless vehicles also diverges from the rest of the industry. Musk infamously relies on a camera-only approach, in contrast to the widely used practice of relying on a “fusion” of different sensors, including radar, ultrasonic, and lidar, to power autonomous driving. Musk calls lidar, in particular, a “crutch” and claims any company that relies on the laser sensor is “doomed.” Waymo’s robotaxis are adorned with large, obvious sensors, a style expressly at odds with the sleekness of Musk’s vehicles.
Of course, Tesla does use lidar on its test vehicles, but only to validate FSD. The sensors won’t be going on any customer cars, since lidar is still too expensive. Yet lidar, which projects tens of thousands of laser points per second, provides a critical layer of redundancy as well as a way for the vehicle to visualize the world in three dimensions.
The idea that you can introduce a fully autonomous vehicle without the full suite of sensors that power every other AV on earth strains credulity for most experts on the technology.
“Why on earth would you want to tie one hand behind your back when you’re solving an almost impossible problem?” said Phil Koopman, an AV expert from Carnegie Mellon University. “And we know it’s going to be big bucks, so don’t skimp on the hardware.”
High five
What is an autonomous car? It sounds like a simple question, but the answer is trickier than it seems. To help clear things up, SAE International, a US organization that represents automotive engineers, created a six-level scale of automation. Intended for engineers rather than the general public, it ranges from Level 0, meaning no automation whatsoever, to Level 5, meaning the vehicle can drive itself anywhere at any time without any human intervention.
And there’s plenty of room for error and misunderstanding. A problem we’ve seen is what researcher Liza Dixon calls “autonowashing,” or any effort to overhype something as autonomous when it’s not.
Most experts dismiss Level 5 as pure science fiction. Waymo and others operate Level 4 vehicles, but Level 5 would require “an astronomical amount of technological development, maintenance, and testing,” says Torc Robotics, a company developing self-driving trucks. Others call it a pipe dream.
Except Musk. At a conference in Shanghai, Musk said with supreme confidence that the company “will have the basic functionality for Level 5 autonomy complete this year.” That was in July 2020.
He’ll likely try to pass off the Tesla robotaxi as the culmination of this work, the vehicle that will finally usher in that wildly implausible goal. It’s important to see through the bluster and bullshit, measuring whatever he reveals against both his past promises and what other players have already achieved.
Tesla’s history is littered with fanciful ideas that never panned out — like a solar-powered Supercharger network, battery swapping, or robotic snake-style chargers. But Musk never bet his entire company, his reputation, and most importantly, his net worth, on those projects. This one is different. And soon enough, we’ll know whether the Tesla robotaxi is the exception to the rule or just another guy dancing in a robot costume.