I bought a used Tesla and had very low expectations for FSD’s usefulness, and those expectations were shattered. All my complaints are stylistic - it accelerates a bit too quickly and hesitates a bit longer in some cases. The steering wheel is a bit jumpy on turns at low speeds, but not unsafe.
The only explanation I can come up with is that there is either a well-motivated group that wants FSD to be bad, or there is some huge variation between vehicles and I lucked out on my car’s cameras being aligned right or something. Maybe the roads in my area are easier than normal, but I’m in a large non-California metro area, so I don’t think I’m unrepresentative.
> AMCI found that, on average, when operating FSD, human intervention is required at least once every 13 miles to maintain safe operation.
I drove a 200-mile road trip last week, a combination of city, suburb, interstate, and rural roads. I didn’t take over except to pull in and out of driveways. So this statement is just mind-boggling - it implies that my drive last week was, like, a 3 or 4 sigma event. But I’ve done it more than once. At some point, it stops being anecdotes and starts being “I simply cannot square your number with my experience.”
Last time I said something positive about FSD here, someone suggested I’m simply forgetting the many times it must try to kill me. Debates around this get wild.
I will not reply to anything negative below, because I’ve learned it’s much more fun to have my car drive me around town than argue with people who insist it can’t.
We will see.
Five years ago, saying "I doubt self-driving trucks will destroy the careers of human truck drivers" was enough to get one burned as a Luddite. Now, not only can it be admitted that the systems that exist require human help; doubts about them are required in civilized conversation on the subject.
It’s almost as if the facts of the matter are unrelated to the conversation, and the important point is enforcing tribal signals.
As further proof of this point, reports are that Tesla has been training a car to drive around a selected movie studio lot for their upcoming robotaxi demo.
And all indications are that this staged production will work as planned to keep investors engaged in the fantasy that has been ongoing for over a decade.
For a realistic robotaxi demo, let's put Elon in the backseat and ask the car to drive him to a randomly chosen destination 10 miles away and see how it goes.
But this won't happen. Because FSD is NOT full self-driving --- and it is blatantly obvious that Elon knows this as well as anyone.
When a broken tail light is considered a safety issue, how is a half-baked "full self-driving" system - one that's in perpetual beta and has never delivered on Tesla's self-imposed deadlines for true self-driving - allowed on the road at all?