
410 points jjulius | 2 comments
bastawhiz ◴[] No.41889192[source]
Lots of people are asking how good the self driving has to be before we tolerate it. I got a one month free trial of FSD and turned it off after two weeks. Quite simply: it's dangerous.

- It failed with a cryptic system error while driving

- It started a left turn far too early, one that would have scraped the left side of the car against a sign. I had to intervene manually.

- In my opinion, the default setting accelerates way too aggressively. I'd call myself a fairly aggressive driver and it is too aggressive for my taste.

- It tried to make way too many right turns on red when it wasn't safe to. It would creep into the road, almost into the path of oncoming vehicles.

- It didn't merge left to make room for vehicles merging onto the highway. The vehicles then tried to cut in. The system should have avoided an unsafe situation like this in the first place.

- It would switch lanes to go faster on the highway, but then missed an exit on at least one occasion because it couldn't make it back into the right lane in time. Stupid.

After the system error, I lost all trust in FSD from Tesla. Until I ride in one and feel safe, I can't have any faith that this is a reasonable system. Hell, even autopilot does dumb shit on a regular basis. I'm grateful to be getting a car from another manufacturer this year.

replies(24): >>41889213 #>>41889323 #>>41889348 #>>41889518 #>>41889642 #>>41890213 #>>41890238 #>>41890342 #>>41890380 #>>41890407 #>>41890729 #>>41890785 #>>41890801 #>>41891175 #>>41892569 #>>41894279 #>>41894644 #>>41894722 #>>41894770 #>>41894964 #>>41895150 #>>41895291 #>>41895301 #>>41902130 #
frabjoused ◴[] No.41889213[source]
The thing that doesn't make sense is the numbers. If it is dangerous in your anecdotes, why don't the reported numbers show more accidents when FSD is on?

When I did the trial on my Tesla, I also noted these kinds of things and felt like I had to take control.

But at the end of the day, only the numbers matter.

replies(13): >>41889230 #>>41889234 #>>41889239 #>>41889246 #>>41889251 #>>41889279 #>>41889339 #>>41890197 #>>41890367 #>>41890522 #>>41890713 #>>41894050 #>>41895142 #
timabdulla ◴[] No.41889251[source]
> If it is dangerous in your anecdotes, why don't the reported numbers show more accidents when FSD is on?

Even if it is true that the data show that with FSD (not Autopilot) enabled, drivers are in fewer crashes, I would be worried about other confounding factors.

For instance, I would assume that drivers are more likely to engage FSD in situations of lower complexity (less traffic, little construction or other impediments, overall lesser traffic flow control complexity, etc.) I also believe that at least initially, Tesla only released FSD to drivers with high safety scores relative to their total driver base, another obvious confounding factor.
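To make the confounding argument concrete, here's a minimal sketch of Simpson's paradox with entirely made-up crash rates and mileage splits (none of these numbers come from Tesla or NHTSA): even if FSD were *worse* than a human driver in every individual road condition, concentrating FSD miles on easy roads can make its raw overall crash rate look better.

```python
# Illustrative, invented per-million-mile crash rates.
# Note FSD is assumed WORSE than a human in BOTH conditions.
rates = {
    ("human", "easy"): 1.0,
    ("human", "hard"): 10.0,
    ("fsd", "easy"): 1.5,
    ("fsd", "hard"): 15.0,
}

# Invented mileage mix (millions of miles): drivers engage FSD
# selectively, so FSD miles skew heavily toward easy conditions.
miles = {
    ("human", "easy"): 50,
    ("human", "hard"): 50,
    ("fsd", "easy"): 95,
    ("fsd", "hard"): 5,
}

def overall_rate(mode):
    """Mileage-weighted crash rate per million miles for one mode."""
    crashes = sum(rates[k] * miles[k] for k in rates if k[0] == mode)
    total = sum(miles[k] for k in miles if k[0] == mode)
    return crashes / total

human = overall_rate("human")  # (1.0*50 + 10.0*50) / 100 = 5.5
fsd = overall_rate("fsd")      # (1.5*95 + 15.0*5) / 100 = 2.175

# Raw comparison makes FSD look ~2.5x safer overall, despite being
# worse in every individual condition -- the confounder is WHERE
# each mode accumulates its miles, not how safe it actually is.
print(human, fsd)
```

This is why a headline "crashes per mile with FSD on vs. off" number can't settle the question without controlling for where and when the system is engaged.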

Happy to be proven wrong though if you have a link to a recent study that goes through all of this.

replies(1): >>41890026 #
valval ◴[] No.41890026[source]
Either the system causes less loss of life than a human driver or it doesn’t. The confounding factors don’t matter yet, since Tesla hasn’t presented a study on the subject. That’s for the future; the stats being gathered right now are just raw data.
replies(2): >>41890292 #>>41894083 #
kelnos ◴[] No.41894083[source]
No, that's absolutely not how this works. Confounding factors are things that make your data not tell you what you are actually trying to understand. You can't just hand-wave that away, sorry.

Consider: what I expect is actually true based on the data is that Tesla FSD is as safe or safer than the average human driver, but only if the driver is paying attention and is ready to take over in case FSD does something unsafe, even if FSD doesn't warn the driver it needs to disengage.

That's not an autonomous driving system. Which is potentially fine, but the value prop of that system is low to me: I have to pay just as much attention as if I were driving manually, with the added problem that my attention will start to wander while the car does most of the work. And the longer the car successfully does most of the work, the more I'll unconsciously believe I can let my attention slip.

I do like current common ADAS features because they hit a good sweet spot: I still need to actively hold onto the wheel and handle initiating lane changes, turns, stopping and starting at traffic lights and stop signs, etc. I look at the ADAS as a sort of "backup" to my own driving, and not as what's primarily in control of the car. In contrast, Tesla FSD wants to be primarily in control of the car, but it's not trustworthy enough to do that without constant supervision.

replies(1): >>41901681 #
valval ◴[] No.41901681[source]
Like I said, the time for studies is in the future. FSD is a product in development, and they know which stats they need to collect in order to track progress.

You’re arguing for something that: 1. Isn’t under contention and 2. Isn’t rooted in the real world.

You’re right FSD isn’t an autonomous driving system. It’s not meant to be, right now.

replies(1): >>41905612 #
freejazz ◴[] No.41905612[source]
> You’re right FSD isn’t an autonomous driving system

Oh, weird. Are you not aware it's called Full SELF Driving?

replies(1): >>41907254 #
valval ◴[] No.41907254[source]
Does the brand name matter? The description should tell you all you need to know when making a purchase decision.
replies(1): >>41907371 #
freejazz ◴[] No.41907371[source]
Yes, a company's marketing is absolutely part of the representations the company makes about a product it sells, in the context of a product liability lawsuit.