
410 points | jjulius
bastawhiz No.41889192
Lots of people are asking how good self-driving has to be before we tolerate it. I got a one-month free trial of FSD and turned it off after two weeks. Quite simply: it's dangerous.

- It failed with a cryptic system error while driving

- It started making a left turn far too early, one that would have scraped the left side of the car on a sign. I had to manually intervene.

- In my opinion, the default setting accelerates way too aggressively. I'd call myself a fairly aggressive driver, and it's too aggressive even for my taste.

- It tried to make way too many right turns on red when it wasn't safe to. It would creep into the road, almost into the path of oncoming vehicles.

- It didn't merge left to make room for vehicles merging onto the highway. The vehicles then tried to cut in. The system should have avoided an unsafe situation like this in the first place.

- It would switch lanes to go faster on the highway, but then missed an exit on at least one occasion because it couldn't make it back into the right lane in time. Stupid.

After the system error, I lost all trust in FSD from Tesla. Until I ride in one and feel safe, I can't have any faith that this is a reasonable system. Hell, even Autopilot does dumb shit on a regular basis. I'm grateful to be getting a car from another manufacturer this year.

frabjoused No.41889213
The thing that doesn't make sense is the numbers. If it is dangerous in your anecdotes, why don't the reported numbers show more accidents when FSD is on?

When I did the trial on my Tesla, I also noted these kinds of things and felt like I had to take control.

But at the end of the day, only the numbers matter.

kelnos No.41894050
Agree that only the numbers matter, but only if the numbers are comprehensive and useful.

How often does an autonomous driving system get the driver into a dicey situation, but the driver notices the bad behavior, takes control, and avoids a crash? I don't think we have publicly-available data on that at all.

You admit that you ran into some of these situations during your own trial. Those situations are unacceptable. An autonomous driving system should be safer than a human driver, and should not make mistakes that a human driver would not make.

Despite all the YouTube videos out there of people doing unsafe things with Tesla FSD, I expect that most people who use it are pretty responsible, are paying attention, and are ready to take over if they notice FSD doing something wrong. But if people need to do that, it's not a safe, successful autonomous driving system. Safety means everyone can watch TV, mess around on their phone, or even take a nap, and we still end up with a lower crash rate than with human drivers.

The numbers that are available can't tell us if that would be the case. My belief is that we're absolutely not there.