
410 points by jjulius | 2 comments
bastawhiz ◴[] No.41889192[source]
Lots of people are asking how good the self driving has to be before we tolerate it. I got a one month free trial of FSD and turned it off after two weeks. Quite simply: it's dangerous.

- It failed with a cryptic system error while driving

- It started a left turn far too early and would have scraped the left side of the car on a sign. I had to intervene manually.

- In my opinion, the default setting accelerates way too aggressively. I'd call myself a fairly aggressive driver, and it was too aggressive even for my taste.

- It tried to make way too many right turns on red when it wasn't safe to. It would creep into the road, almost into the path of oncoming vehicles.

- It didn't merge left to make room for vehicles merging onto the highway, so those vehicles had to try to cut in. The system should have avoided creating an unsafe situation like that in the first place.

- It would switch lanes to go faster on the highway, but then missed an exit on at least one occasion because it couldn't make it back into the right lane in time. Stupid.

After the system error, I lost all trust in FSD from Tesla. Until I ride in one and feel safe, I can't have any faith that this is a reasonable system. Hell, even autopilot does dumb shit on a regular basis. I'm grateful to be getting a car from another manufacturer this year.

replies(24): >>41889213 #>>41889323 #>>41889348 #>>41889518 #>>41889642 #>>41890213 #>>41890238 #>>41890342 #>>41890380 #>>41890407 #>>41890729 #>>41890785 #>>41890801 #>>41891175 #>>41892569 #>>41894279 #>>41894644 #>>41894722 #>>41894770 #>>41894964 #>>41895150 #>>41895291 #>>41895301 #>>41902130 #
frabjoused ◴[] No.41889213[source]
The thing that doesn't make sense is the numbers. If it is dangerous in your anecdotes, why don't the reported numbers show more accidents when FSD is on?

When I did the trial on my Tesla, I also noted these kinds of things and felt like I had to take control.

But at the end of the day, only the numbers matter.

replies(13): >>41889230 #>>41889234 #>>41889239 #>>41889246 #>>41889251 #>>41889279 #>>41889339 #>>41890197 #>>41890367 #>>41890522 #>>41890713 #>>41894050 #>>41895142 #
throwaway562if1 ◴[] No.41890522[source]
AIUI the numbers are for accidents where FSD is in control. Which means if it does a turn into oncoming traffic and the driver yanks the wheel or slams the brakes 500ms before collision, it's not considered a crash during FSD.
replies(2): >>41890770 #>>41890811 #
1. concordDance ◴[] No.41890811[source]
Several people in this thread have been saying this or similar. It's incorrect, from Tesla:

"To ensure our statistics are conservative, we count any crash in which Autopilot was deactivated within 5 seconds before impact"

https://www.tesla.com/en_gb/VehicleSafetyReport

Situations that inevitably cause a crash more than 5 seconds after disengagement seem like they would be extremely rare.
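The quoted rule is simple enough to sketch. Here is a minimal, hypothetical illustration of that attribution window; the record fields and function names are made up for illustration and are not Tesla's actual schema or code:

```python
from dataclasses import dataclass
from typing import Optional

ATTRIBUTION_WINDOW_S = 5.0  # per Tesla's stated methodology

# Hypothetical crash record; field names are illustrative only.
@dataclass
class Crash:
    impact_at: float                 # seconds on the event timeline
    deactivated_at: Optional[float]  # None = Autopilot still engaged at impact

def counts_as_autopilot_crash(crash: Crash) -> bool:
    """Count the crash against Autopilot if it was engaged at impact,
    or was deactivated within 5 seconds before impact."""
    if crash.deactivated_at is None:
        return True
    return (crash.impact_at - crash.deactivated_at) <= ATTRIBUTION_WINDOW_S

# A driver yanking the wheel 500 ms before impact still counts:
print(counts_as_autopilot_crash(Crash(impact_at=10.0, deactivated_at=9.5)))  # True
# Disengaging 6 seconds before impact would not:
print(counts_as_autopilot_crash(Crash(impact_at=10.0, deactivated_at=4.0)))  # False
```

Under this rule, the "driver grabs the wheel half a second before the crash" scenario from the parent comment would still be counted in Tesla's statistics.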

replies(1): >>41894673 #
2. rvnx ◴[] No.41894673[source]
This is Autopilot, not FSD, which is an entirely different product.