If a job description required you to "remove all sense of legal obligation, moral ethics and humanity," would you apply? According to a new report, that is essentially the role of a Tesla test driver.
Self-driving vehicles are the future. If tech companies have their way, autonomous cars will be the only future. However, many of us live in reality and realize the software is not yet anywhere near that science-fiction level of automation. But people with money will keep trying to accelerate the process, even if it means beta testing on public roads with all of us as guinea pigs.
Business Insider revealed details about Tesla’s dedicated team of test drivers, dubbed Project Rodeo. This test team is trained to push the automaker’s Full Self-Driving (FSD) and Autopilot software to its limits. What exactly is the limit? Malfunction. The general rule of thumb is to get as close as possible to colliding with something or someone. The scarier the situation, the better.
“You’re practically running on adrenaline for the entire eight-hour shift,” said one former test driver. “You feel like you’re on the verge of something really bad happening.”
Interviews were conducted with nine current and former Project Rodeo drivers and three Autopilot engineers in California, Florida, and Texas. Most asked to remain anonymous. The situations they describe are eye-opening, though not surprising. While FSD-related crashes are well documented, none of the people interviewed were involved in one.
Project Rodeo is a test group made up of smaller teams. For example, the “Golden Manual” team drives by the book, obeying traffic laws and using no driver-assistance features at all. At the other end of the spectrum is the “critical intervention” team. More passengers than drivers, critical-intervention testers let the software handle every aspect of driving and engage, or “intervene,” only to prevent a collision.
One reason test drivers wait until the 11th hour to take over manually is that it gives the software time to react and make a decision, right or wrong. The more data collected, especially in real-world scenarios, the easier it is for engineers to tweak and update the software.
“We want the data to know what led the car to make this decision,” said a former Autopilot engineer. “If you intervene too early, we won’t really get to the point where we say, OK, we understand what happened.”
However, this means vehicles may run red lights, cross double yellow lines, ignore stop signs, and speed, all on public roads. Even when a situation became uncomfortable, a driver who took over risked hearing from superiors that they had taken control too soon. As a result, Project Rodeo drivers, even those outside critical-intervention roles, felt pressure to sustain risky driving situations, and sometimes to create them outright, in order to test the software and keep their jobs.
John Bernal, a former test driver and data analyst, said he was never explicitly told to break the law to collect data, but that it was clearly implied. “My training was to wait until the wheels hit the white line before I could hit the brakes,” he said.
Moreover, some drives were used solely to train the software to recognize and adapt to “vulnerable road users,” such as pedestrians, cyclists, and people in wheelchairs. One former tester, riding with his trainer, said their vehicle came within three feet of a cyclist before he hit the brakes.
“I clearly remember this guy jumping off his bike,” he said. “He was terrified. The car lunged at him, and all I could do was stomp on the brakes.” His trainer, he said, was actually pleased, telling him his late reaction was “perfect” and exactly what they wanted from him. “I felt like the goal was almost to simulate a crash and then prevent it at the last second.”
Cruise and Waymo are also developing autonomous cars, but they say they conduct rigorous software testing in controlled environments, or believe their autonomous systems are “fundamentally different” from Tesla’s. Hmm, so why do these companies have the same problems with vehicles that can’t read the room, so to speak? In the case of Uber’s now-shuttered autonomous vehicle division, the results were sometimes literally killer.
“If you have a parent holding the bike the whole time, you’ll never learn,” said a former Tesla engineer. After all, data is everything. For these autonomous technology companies, now at the mercy of shareholders, it’s a high-risk, high-reward environment that the rest of society never agreed to take part in.