
650 points | clcaev | 1 comment
voidUpdate No.45063021
I'm still convinced that calling it "full self driving" is misleading marketing and really needs to stop, since it isn't full self driving even according to Tesla
replies(7): >>45063088 #>>45063277 #>>45063334 #>>45063570 #>>45063571 #>>45063584 #>>45066589 #
razemio No.45063570
Every time this comes up, I am on the opposite side of this. It is clearly full self driving. It can stop at red lights, cross intersections, make turns, park, drive, change lanes, brake, and navigate on its own. There are various videos online where FSD managed to drive a route start to finish without a single human override. That's full self driving. It can also crash like humans "can", and that's why it needs supervision. In this sense, we as humans are also "full self driving", with a much (?) lower risk of crashing.

Like every other time, let the downvotes rain. If you downvote, it would be nice if you could tell me where I am wrong. It might change my view on things.

replies(3): >>45063601 #>>45064337 #>>45069893 #
Workaccount2 No.45064337
>if you could tell me where I am wrong

It needs to have a crash rate equal to or ideally lower than a human driver.

Tesla does not release crash data (wonder why...), has a safety driver with a finger on the kill switch, and only lets select people take rides. Of course, according to Elon "always-honest-about-timelines" Musk, this will all go away Soon(TM) and we will have 1M Robotaxis on the road by December 31st.

Completing a route without intervention doesn't mean much. It needs to complete thousands of routes without intervention.
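The gap between "one route without intervention" and "thousands of routes" can be made concrete with the statistical rule of three: observing zero crashes in n miles only bounds the crash rate below roughly 3/n at 95% confidence. A minimal sketch (the human baseline crash rate here is an illustrative assumption, not a figure from the thread):

```python
# Rough estimate: how many intervention-free miles are needed before
# zero observed crashes actually demonstrates a rate at or below a
# human baseline? Rule of three: zero events in n miles gives a 95%
# upper confidence bound on the rate of about 3/n.

# Assumed human baseline for illustration only (~1 crash per 500k miles).
HUMAN_CRASHES_PER_MILE = 1 / 500_000

def miles_needed(baseline_rate: float, bound_events: float = 3.0) -> float:
    """Miles of zero-incident driving so that the 95% upper bound
    (bound_events / miles) falls at or below baseline_rate."""
    return bound_events / baseline_rate

print(f"{miles_needed(HUMAN_CRASHES_PER_MILE):,.0f} miles")  # 1,500,000 miles
```

Under that assumed baseline, a single clean demo route is statistically meaningless; you need on the order of a million-plus incident-free miles before the data says anything.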

Keep in mind that Waymos have selective intervention for when they get stuck. Teslas have active intervention to prevent them from mowing down pedestrians.