A recent report reveals the extreme testing methods Tesla is using to develop its Full Self-Driving (FSD) technology, particularly through a project called “Rodeo.” Based on interviews with former Tesla test drivers, the company’s approach to testing its autonomous software appears to involve high-risk scenarios that potentially endanger both the drivers and the public. While Tesla customers testing FSD have received significant attention, the company also runs an internal testing team, often tasked with intentionally delaying interventions to observe how FSD manages difficult driving situations.
Now Business Insider has published a new report based on interviews with nine of those test drivers, who worked on a specific project called ‘Rodeo.’ They describe the project:
Test drivers said they sometimes navigated perilous scenarios, particularly those drivers on Project Rodeo’s “critical intervention” team, who say they’re trained to wait as long as possible before taking over the car’s controls. Tesla engineers say there’s a reason for this: The longer the car continues to drive itself, the more data they have to work with. Experts in self-driving tech and safety say this type of approach could speed up the software’s development but risks the safety of the test drivers and people on public roads.
The drivers’ experiences paint a tense picture. One former test driver described it as being “a cowboy on a bull and you’re just trying to hang on as long as you can,” hence the program’s name. Another driver recalled a near-collision with a cyclist: the car lurched toward the rider, who had to jump off his bike to avoid an accident. The driver stomped on the brakes, only to be told by supervisors that it was the exact reaction they were hoping to record.
Tesla’s approach contrasts with that of other companies in the self-driving sector, like Waymo, which also runs “critical intervention” tests but confines them to controlled environments with dummies rather than real-world situations. Critics and self-driving experts argue that Tesla’s method might accelerate the development of FSD, but it also presents heightened risks, particularly as the cars are tested on public roads alongside pedestrians and cyclists.
While Tesla’s FSD software has made strides in autonomous driving, these revelations raise ethical questions about the extent to which the company prioritizes data collection over safety. This approach might capture more real-world scenarios, but it suggests the line between testing and putting the public at risk may be dangerously thin.