Hours after releasing an update to its Full Self-Driving software beta—which, contrary to its name, does not allow the car to drive itself without human supervision—Tesla has rolled it back.
In a tweet on Sunday afternoon, Tesla CEO Elon Musk said the rollback was due to “some issues” with version 10.3 but did not specify what those issues were. Although cars that received the FSD update are supposed to revert “temporarily” to version 10.2, The Verge reported that some users were unable to access FSD at all following the rollback.
FSD 10.3, which Tesla began releasing to some drivers on Saturday and Sunday, reportedly added a number of new features.
According to a screenshot of the release notes posted on the r/teslamotors subreddit, these included “FSD Profiles,” which allow drivers to control behaviors such as rolling stops, speed-based lane changes, and following distance. Other additions included improved creeping speed, improved crossing object velocity estimation, improved vehicle semantic detections, and reduced false slowdowns, among other changes.
Musk reassured drivers that pulling the update was not out of the ordinary. “Please note, this is to be expected with beta software. It is impossible to test all hardware configs in all conditions with internal QA, hence public beta,” he said.
It was not the first hiccup this month, either: Tesla had also delayed the release of FSD 10.2 to roughly 1,000 drivers earlier in October.
Considering that the issues with FSD 10.3 were enough to warrant a rollback, it remains baffling that Tesla allows drivers to test its beta software on public roads. The company’s public beta program has drawn criticism and concern from federal authorities, who have said that Tesla needs to address “basic safety issues” before expanding FSD’s capabilities.
It’s possible that problems with Autopilot and Traffic Aware Cruise Control, or TACC, contributed to the rollback. (Only Tesla knows for sure.) One driver who received FSD 10.3 said Autopilot and TACC were not available on his car after the update and flagged the issue to Musk on Twitter; Musk replied that the company was working on resolving it.
Another driver on Twitter also said that Autopilot wouldn’t engage and that his Tesla kept warning him he was too close to the car ahead even though he was keeping a safe following distance. The driver said the car “floored the break [sic] on the highway when [Autopilot] disengaged.”