One of the greatest tricks technology companies ever played was convincing their human guinea pig users that they were a privileged group called beta testers.
From novel email software to alternative versions of Twitter to voice-enabled listening devices, such trials are cheap and easy to make available to thousands or millions of customers. It’s a great way to see how a new version stacks up against the old.
Other than some annoying glitches or unfamiliar icons, software beta testing is generally innocuous. The stakes for most apps are far below life and death.
But there’s nothing innocuous about the beta tests being run by Elon Musk, the billionaire C.E.O. of Tesla. He has turned American streets into a public laboratory for the company’s supposed self-driving car technology.
Tesla says that its inaccurately named full self-driving and autopilot modes are meant to assist drivers and make Teslas safer — but autopilot has been at the center of a series of erratic driving incidents.
In public, Mr. Musk sometimes overhypes these technologies on social media and in other statements. Yet Tesla engineers have privately admitted to California regulators that they are not quite ready for prime time.
Tesla’s autopilot mode uses software, sensors and cameras to detect lanes, objects and other vehicles on the road and can steer, brake, accelerate and even change lanes with minimal input from the driver. Full self-driving beta version 9 — available today to just a few thousand Tesla owners — is supposed to assist with more complicated driving on local streets.
Mr. Musk has assured buyers of his electric vehicles that they would have “full self-driving, software, everything,” yet the autos are not fully self-driving, nor do they have anything like a real autopilot.
This kind of experimental technology, in the hands of regular drivers, has caused multiple fiery crashes and may have other fatal flaws, like an inability to distinguish the moon from a yellow traffic light. Autopilot, whose features must be activated by the driver, has come installed in all new Teslas since 2016. The technology is the subject of multiple lawsuits, including allegations of false advertising.
Mr. Musk tweeted this month, “Beta 9 addresses most known issues, but there will be unknown issues, so please be paranoid. Safety is always top priority at Tesla.” Safety may be a top priority at the factory, but out on the public roads, it’s not only Tesla drivers who have a vested interest in the safety of the vehicles.
On Tesla’s quarterly earnings call this week, Mr. Musk appeared to acknowledge that full self-driving is still half-baked. “We need to make full self-driving work in order for it to be a compelling value proposition,” he said of the technology, when asked about the $199 monthly fee to access it when Tesla releases it to a wider swath of drivers.
Tesla drivers may fall victim to a version of what’s known in clinical drug trials as therapeutic misconception, in which trial participants (beta testers, in this case) tend to overlook the potential risks of participating in an experiment, mistakenly regarding themselves as consumers of a finished product rather than as guinea pigs. And with self-driving cars, Tesla owners aren’t the only trial participants.
Consumer Reports has raised serious alarms about the safety of Tesla vehicles using the automated systems. Videos of full self-driving in action “don’t show a system that makes driving safer or even less stressful,” said a Consumer Reports official. “Consumers are simply paying to be test engineers for developing technology without adequate safety protection.” This is simple: The cars are a hazard to pedestrians, cyclists and other drivers. Which makes it all the more alarming that the internet is full of videos of Tesla drivers reading books, checking email, leaving the driver’s seat or snoozing behind the wheel.
In other words, Teslas appear to be a risk to drivers and others on the road when a computer is behind the wheel. The National Transportation Safety Board has criticized autopilot for lacking proper means to prevent driver misuse and effective driver monitoring systems. That should have all Americans concerned that their public streets are a testing ground.
Competitors like General Motors Co.’s Cruise and Alphabet’s Waymo have taken a more measured approach, putting paid employees behind the wheel as a safety check while the cars are tested in real-world environments. At least they have no misconceptions about what’s going on. Unlike Teslas, those vehicles are easily identifiable as prototypes on the road, giving drivers of other cars a chance to steer clear.
When engineers say the autonomous systems aren’t yet ready, regulators should listen. Only this year did the National Highway Traffic Safety Administration begin requiring tracking and regular monthly reporting of crashes involving autonomous vehicles, perhaps a step toward more regulation. The agency also has ongoing investigations into about three dozen crashes involving vehicles using driver-assistance systems. The vast majority of those involved Teslas, including 10 fatalities.
Tesla didn’t respond to a request for comment.
Self-driving vehicles hold tremendous promise to improve traffic safety. Humans are surprisingly bad at driving. Autonomous vehicles don’t drink and drive, and one day they may be able to see better than the human eye, to respond more quickly to sudden movements from other cars on the road and to lower costs for long-haul trucking operations, among other benefits. But the technology isn’t there yet.
Putting it on the road before it is ready risks not only lives today but also public acceptance of the technology down the road, once it is ready. If Tesla wants to run beta tests with human guinea pigs, it should do so on a closed track. We’d all feel safer.
Greg Bensinger is a member of the editorial board.