As always, let the downvotes rain. If you downvote, it would be nice if you could tell me where I'm wrong. It might change my view on things.
All this demonstrates is that the term “full self driving” is meaningless.
Tesla has a SAE Level 3 [1] product they’re falsely marketing as Level 5; when this case occurred, they were misrepresenting a Level 2 system as Level 4 or 5.
If you want to see true self driving, take a Waymo. Tesla can’t do that. They’ve been lying that they can. That’s gotten people hurt and killed; Tesla should be liable for tens if not hundreds of billions for that liability.
Also, "All this demonstrates is the term “full self driving” is meaningless." proves my point that it is not misleading.
The levels are set at the lowest common denominator. A 1960s hot rod can navigate a straight road with no user input. That doesn’t mean you can trust it to do so.
> Where did Tesla say FSD is SAE Level 5 approved?
They didn’t say that. They said it could do what a Level 5 self-driving car can do.
“In 2016, the company posted a video of what appears to be a car equipped with Autopilot driving on its own.
‘The person in the driver's seat is only there for legal reasons,’ reads a caption that flashes at the beginning of the video. ‘He is not doing anything. The car is driving itself.’ (Six years later, a senior Tesla engineer conceded as part of a separate lawsuit that the video was staged and did not represent the true capabilities of the car.) [1]”
> Tesla is full self driving with Level 2/3 supervision and in my opinion this is not misleading
This is a tautology. You’re defining FSD to mean whatever Tesla FSD can do.
[1] https://www.npr.org/2025/07/14/nx-s1-5462851/tesla-lawsuit-a...
FSD cannot “do everything a Level 5 system can.” It can’t even match Waymo’s Level 4 capabilities, because it periodically requires human intervention.
But granting your premise, you’d say it’s a Level 2 or 3 system with some advanced capabilities. (Mercedes has a lane-keeping and -switching product. They’re not constantly losing court cases.)
It’s meaningless because Tesla redefines it at will. The misrepresentation causes the meaninglessness.
It needs to have a crash rate equal to or ideally lower than a human driver.
Tesla does not release crash data (wonder why...), has a safety driver with a finger on the kill switch, and only lets select people take rides. Of course according to Elon always-honest-about-timelines Musk, this will all go away Soon(TM) and we will have 1M Robotaxis on the road by December 31st.
Completing a route without intervention doesn't mean much. It needs to complete thousands of routes without intervention.
Keep in mind that Waymos have selective intervention for when they get stuck. Teslas have active intervention to prevent them from mowing down pedestrians.
There's plenty wrong about the FSD terminology and SAE levels would absolutely be clearer, but I doubt more than a tiny fraction of people are confused as to the target of 'self' in the phrase 'full self driving'.
How many juries and courts have ruled adversely against self-cleaning oven makers?
Tesla has absolutely lied about its software's capabilities. From the lawsuit that went to trial:
“In 2016, the company posted a video of what appears to be a car equipped with Autopilot driving on its own.
‘The person in the driver's seat is only there for legal reasons,’ reads a caption that flashes at the beginning of the video. ‘He is not doing anything. The car is driving itself.’ (Six years later, a senior Tesla engineer conceded as part of a separate lawsuit that the video was staged and did not represent the true capabilities of the car.) [1]”
[1] https://www.npr.org/2025/07/14/nx-s1-5462851/tesla-lawsuit-a...
I just disagree that any significant number of people anywhere have thought the 'self' in 'full self driving' refers to the driver.
Not urgently. FSD has time-sensitive intervention requirements. Waymo’s time sensitivities are driven by passenger comfort, not safety.
Are you saying you would sit in a Tesla without paying much attention, in the same way you're sitting next to someone you trust driving the car? Would you go do phone stuff or look for stuff in your bag while your Tesla is driving you?
I mean, I guess people are doing that, but with all the reports and stories I hear, it seems quite tricky, and you'd better just watch the road.
So I wouldn't really call that fully self driving. It's kind of like an LLM: it does great most of the time, but occasionally it does something disastrous, and therefore a human needs to be there to correct it. If you let it go on its own, it's not gonna end well. That's not fully self driving. That's human-assisted driving.