
How bad is Tesla’s Full Self-Driving feature, really?

Tesla’s FSD software can’t travel more than 13 miles on average without requiring human intervention, according to testing firm AMCI.

We’re only weeks away from Tesla’s big RoboTaxi rollout, where the automaker’s driverless shuttle will be unveiled, and independent research firm AMCI Testing has some bad news that could cloud the event. AMCI has just completed what it claims is the “most comprehensive real-world test” of Tesla’s Full Self-Driving (FSD) software, which will ostensibly underpin the RoboTaxi’s driverless technology, and the results are not reassuring.

AMCI says its testing covered more than 1,000 miles (about 1,600 kilometers) of use and, in short, found the performance of Tesla’s FSD software to be “questionable.” This isn’t the first criticism of FSD, which has been a source of controversy for the automaker for years. Tesla has faced everything from being called out by the California DMV for false advertising to being investigated by the NHTSA.

There have been so many incidents involving Tesla’s Autopilot and FSD that we had to create a megathread to keep track of them all. It’s worth noting that Tesla says FSD is still in “beta,” so it’s incomplete, but it also sells the feature as a five-figure option on its current EV lineup, allowing owners to opt into what is essentially a real-world test program for the software. Those owners must acknowledge that the system requires driver supervision and that, despite its name, it is not a fully autonomous system today. Still, Tesla is offloading onto real-world customers the kind of testing that other automakers conduct scientifically, with engineers and oversight. And AMCI’s findings about how reliable FSD is (or rather, isn’t) are just the latest hurdle for Tesla and FSD.

AMCI says it conducted its tests in a Tesla Model 3 running FSD versions 12.5.1 and 12.5.3, across four different driving environments: city streets, rural two-lane highways, mountain roads and freeways. AMCI was impressed with what FSD is able to accomplish relying solely on cameras. (Tesla is the only automaker whose driver assistance systems of FSD’s ambition rely on cameras and, primarily, short-distance parking sensors, rather than a more complex and expensive combination of cameras, sensors, radar and lidar that can paint a much clearer image, with more redundancy, than Tesla’s camera array.) However, AMCI found that, on average, human intervention was required at least every 13 miles when operating FSD in order to maintain safe operation.

“With all hands-free augmented driving systems, and even more so with driverless autonomous vehicles, there is a bond of trust between the technology and the public. When this technology is rolled out, the public is largely unaware of the caveats (such as ‘monitor’ or ‘supervise’), and the technology is considered empirically foolproof. Getting close to foolproof, yet falling short, creates an insidious and unsafe operator complacency problem, as proven in the test results,” said David Stokols, CEO of AMCI Testing’s parent company, AMCI Global. “You simply cannot reliably rely on the accuracy or reasoning behind its responses.”

You can see the full results of the testing for yourself, but here’s AMCI’s summary:

  • More than 1,000 miles driven
  • City streets, rural two-lane highways, mountain roads and freeways
  • Day and night operation; backlit to full-frontal sun
  • 2024 Model 3 Performance with Hardware 4
  • Full Self-Driving (Supervised) Profile Setting: Assertive
  • Surprisingly capable, yet also problematic (and occasionally dangerously inept)
  • The confidence (and often competence) with which it undertakes complex driving tasks lulls users into believing it is a thinking machine, with decisions and performance based on a sophisticated assessment of risk (and the user’s well-being)

If you think an average of 13 miles between situations in which the driver must grab the steering wheel or hit the brakes sounds pretty good, keep in mind that it’s not just the number of interventions required that matters, but also how those situations arise. AMCI’s last point is the one that raises the most eyebrows (emphasis theirs): “When errors occur, they are occasionally sudden, dramatic, and dangerous; in those circumstances, it is unlikely that a driver without their hands on the wheel will be able to intervene in time to prevent an accident, or possibly a fatality.”

In support of its report, AMCI released three videos showing some of the instances in which FSD performed unsafely. Tesla has yet to publicly respond to the report, though we wouldn’t hold our breath for that. The automaker could fall back on the idea that the software is still in development, but common sense suggests that putting a feature bearing the FSD name, with its promise of future autonomous capability, into the hands of ordinary people right now is asking for trouble. The decisions the system makes, or fails to make, can have dire consequences, and AMCI’s testing shows that FSD’s shortcomings reveal themselves often.