peutetre
> It's worth noting that Tesla claims FSD is still in "beta," so it's incomplete

When a broken tail light is considered a safety issue, how is a half-baked "full self-driving" system that's in perpetual beta and has never delivered on Tesla's self-imposed deadlines for true self-driving allowed on the road at all?

rogerrogerr
My car drives me all over town without intervention (on 12.3.6, and now on 12.5.4). It’s beyond confusing to be constantly told it doesn’t, when I have never taken over for a safety issue. Only for cases where it’s being too hesitant. 2020 Model 3.

I bought a used Tesla and had very low expectations for FSD’s usefulness, and those were shattered. All my complaints are stylistic: it accelerates a bit too quickly and hesitates a bit longer in some cases. The steering wheel is a bit jumpy on turns at low speeds, but not unsafe.

The only explanation I can come up with is that there is either a well-motivated group that wants FSD to be bad, or there is some huge variation between vehicles and I lucked out on my car’s cameras being aligned right or something. Maybe the roads in my area are easier than normal, but I’m in a large non-California metro area, so I don’t think I’m unrepresentative.

> AMCI found that, on average, when operating FSD, human intervention is required at least once every 13 miles to maintain safe operation.

I drove a 200-mile road trip last week: a combination of city, suburb, interstate, and rural. I didn’t take over except to pull in and out of driveways. So this statement is just mind-boggling; it implies that my drive last week was, like, a 3 or 4 sigma event. But I’ve done it more than once. At some point it stops being anecdotes and starts being “I simply cannot square your number with my experience.”
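That sigma guess is easy to sanity-check. Here's a quick back-of-the-envelope in Python, assuming (purely as a modeling choice on my part, not anything AMCI states) that interventions arrive as a Poisson process at their reported rate of one per 13 miles:

    import math
    from statistics import NormalDist

    miles = 200                  # length of the trip above
    miles_per_intervention = 13  # AMCI's reported average rate

    lam = miles / miles_per_intervention      # expected interventions: ~15.4
    p_zero = math.exp(-lam)                   # Poisson P(zero interventions): ~2e-7
    sigma = NormalDist().inv_cdf(1 - p_zero)  # one-sided z-score: ~5.1

    print(f"expected: {lam:.1f}, P(zero): {p_zero:.1e}, sigma: {sigma:.1f}")

Under that assumption, a single clean 200-mile drive is actually closer to a 5-sigma event, and doing it more than once is even harder to reconcile with the 13-mile figure.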

Last time I said something positive about FSD here, someone suggested I’m simply forgetting the many times it must try to kill me. Debates around this get wild.

I will not reply to anything negative below, because I’ve learned it’s much more fun to have my car drive me around town than argue with people who insist it can’t.

Animats
The article mentions that Tesla now claims they will show a prototype of a self-driving taxi on October 10. After four years of postponements.[1] After fakes going back to 2016.[2] At a movie studio, not on the road.

We will see.

[1] https://www.motortrend.com/news/tesla-robotaxis-will-ready-2...

[2] https://www.cbsnews.com/news/tesla-autopilot-staged-engineer...

porphyra
The red-light-at-night "failure" [1] is a bit suspect: the car had already entered the intersection and was blocked by the stopped car in front of it. When that car started moving, it moved too, which is the legal and correct thing to do once you are already in the intersection.

[1] https://www.youtube.com/watch?v=Z9FDT_-dLRk

h2odragon
It's amazing how one person changing political opinion destroyed the widespread faith in "self-driving" that existed just a few years ago.

Five years ago, saying "I doubt self-driving trucks will destroy the careers of human truck drivers" was enough to get one burned as a Luddite. Now, not only can it be admitted that the systems that exist require human help; doubts about them are required in civilized conversation on the subject.

It's almost as if the facts of the matter are unrelated to the conversation, and the important point is enforcing tribal signals.

stieg
While it's far from perfect, nothing stood out as egregious after watching all three videos. Yep... definitely not perfect, and still room for improvement. But this headline feels pretty sensational when backed up by the three (poorly linked) videos.

jqpabc123
Tesla's "Full Self Driving" is most definitely NOT full self driving --- according to Tesla.

As further proof of this point, reports are that Tesla has been training a car to drive around a selected movie studio lot for their upcoming robotaxi demo.

And all indications are that this staged production will work as planned to keep investors engaged in the fantasy that has been ongoing for over a decade.

For a realistic robotaxi demo, let's put Elon in the backseat and ask the car to drive him to a randomly chosen destination 10 miles away and see how it goes.

But this won't happen. Because FSD is NOT fsd --- and it is blatantly obvious that Elon knows this as well as anyone.