
410 points by jjulius | 7 comments
bastawhiz ◴[] No.41889192[source]
Lots of people are asking how good the self driving has to be before we tolerate it. I got a one month free trial of FSD and turned it off after two weeks. Quite simply: it's dangerous.

- It failed with a cryptic system error while driving

- It started making a left turn far too early, one that would have scraped the left side of the car on a sign. I had to manually intervene.

- In my opinion, the default setting accelerates way too aggressively. I'd call myself a fairly aggressive driver, and it is too aggressive even for my taste.

- It tried to make way too many right turns on red when it wasn't safe to. It would creep into the road, almost into the path of oncoming vehicles.

- It didn't merge left to make room for vehicles merging onto the highway. The vehicles then tried to cut in. The system should have avoided an unsafe situation like this in the first place.

- It would switch lanes to go faster on the highway, but then missed an exit on at least one occasion because it couldn't make it back into the right lane in time. Stupid.

After the system error, I lost all trust in FSD from Tesla. Until I ride in one and feel safe, I can't have any faith that this is a reasonable system. Hell, even autopilot does dumb shit on a regular basis. I'm grateful to be getting a car from another manufacturer this year.

replies(24): >>41889213 #>>41889323 #>>41889348 #>>41889518 #>>41889642 #>>41890213 #>>41890238 #>>41890342 #>>41890380 #>>41890407 #>>41890729 #>>41890785 #>>41890801 #>>41891175 #>>41892569 #>>41894279 #>>41894644 #>>41894722 #>>41894770 #>>41894964 #>>41895150 #>>41895291 #>>41895301 #>>41902130 #
frabjoused ◴[] No.41889213[source]
The thing that doesn't make sense is the numbers. If it is dangerous in your anecdotes, why don't the reported numbers show more accidents when FSD is on?

When I did the trial on my Tesla, I also noted these kinds of things and felt like I had to take control.

But at the end of the day, only the numbers matter.

replies(13): >>41889230 #>>41889234 #>>41889239 #>>41889246 #>>41889251 #>>41889279 #>>41889339 #>>41890197 #>>41890367 #>>41890522 #>>41890713 #>>41894050 #>>41895142 #
1. bastawhiz ◴[] No.41889246[source]
Is Tesla required to report system failures or the vehicle damaging itself? How do we know they're not optimizing for the benchmark (what they're legally required to report)?
replies(2): >>41889358 #>>41890792 #
2. rvnx ◴[] No.41889358[source]
If the question is “was FSD activated at the time of the accident: yes/no”, they can legally claim no, for example if FSD happens to disconnect half a second before a dangerous situation (e.g. glare obstructing the cameras), which may coincide exactly with the time of some accidents.
replies(1): >>41892440 #
3. Uzza ◴[] No.41890792[source]
For some time now, regulators have required all manufacturers to report any crash in which an autonomous or partially autonomous system was active within 30 seconds of the impact.
replies(1): >>41892580 #
4. diebeforei485 ◴[] No.41892440[source]
> To ensure our statistics are conservative, we count any crash in which Autopilot was deactivated within 5 seconds before impact, and we count all crashes in which the incident alert indicated an airbag or other active restraint deployed.

Scroll down to Methodology at https://www.tesla.com/VehicleSafetyReport
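
That 5-second window speaks directly to the scenario raised above (FSD disconnecting half a second before impact). A minimal sketch of how the quoted rule reads, assuming it is just a time-window check; the function and parameter names are hypothetical, not Tesla's actual logic:

    # Minimal sketch of the 5-second attribution window from the quote above.
    # Names are hypothetical, for illustration only.
    def attributed_to_autopilot(active_at_impact: bool,
                                seconds_between_deactivation_and_impact: float) -> bool:
        """Per the quoted methodology, a crash still counts as an Autopilot crash
        if Autopilot was active at impact or was deactivated within the 5 seconds
        before impact, so a last-moment disengagement does not exclude it."""
        return active_at_impact or seconds_between_deactivation_and_impact <= 5.0

    # Example: Autopilot disconnects half a second before impact -- still counted.
    assert attributed_to_autopilot(active_at_impact=False,
                                   seconds_between_deactivation_and_impact=0.5)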

replies(1): >>41894536 #
5. bastawhiz ◴[] No.41892580[source]
My question is better rephrased as "what is legally considered an accident that needs to be reported?" If the car scrapes a barricade or curbs it hard, but the airbags don't deploy and the car doesn't sense the damage, clearly that doesn't get reported. There's a wide spectrum of issues short of the point where someone is injured or another car is damaged.
replies(1): >>41894118 #
6. kelnos ◴[] No.41894118{3}[source]
And not to move the goalposts, but I think we should also be tracking any time the human driver feels they need to take control because the autonomous system did something they didn't believe was safe.

That's not a crash (fortunately!), but it is a failure of the autonomous system.

This is hard to track, though, of course: people might take over control for reasons unrelated to safety, or people may misinterpret something that's safe as unsafe. So you can't just track this from a simple "human driver took control".
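
To make the distinction concrete, here is a tiny sketch with hypothetical takeover categories; the point is just that without a labelled reason, a raw takeover count mixes all of these together:

    # Hypothetical categories, purely to illustrate why "driver took control"
    # alone is not a usable safety metric.
    from collections import Counter
    from enum import Enum, auto

    class TakeoverReason(Enum):
        SAFETY = auto()        # driver judged the system was about to do something unsafe
        ROUTE_CHANGE = auto()  # driver simply wanted a different route or exit
        PREFERENCE = auto()    # driving style, comfort, passenger request, etc.
        END_OF_TRIP = auto()   # normal handback at the destination

    def safety_takeover_rate(takeovers: list[TakeoverReason], miles_driven: float) -> float:
        """Safety-motivated takeovers per mile -- the number a raw takeover
        counter cannot give you, because it cannot tell the reasons apart."""
        return Counter(takeovers)[TakeoverReason.SAFETY] / miles_driven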

7. rvnx ◴[] No.41894536{3}[source]
This is for Autopilot, which is the car-following system for highways. If you are in cruise control and staying in your lane, not much is supposed to happen.

The FSD numbers are much harder to find.

The general accident rate for human drivers is about 1 per 400,000 miles driven.

FSD has one “critical disengagement” (i.e., a situation that would likely have led to an accident had the human or safety braking not intervened) every 33 miles driven.

That means that to reach unsupervised driving at human quality, they would need to improve it roughly 10,000-fold in a few months. I'm not saying it is impossible, just highly optimistic. In 10 years we will be there, but in 2 months it sounds a bit overpromising.
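
Spelling out the arithmetic behind that rough 10,000-fold figure (using the two rates quoted above, which are this comment's numbers, not official statistics):

    # Ratio of the two rates quoted in this comment.
    miles_per_human_accident = 400_000       # claimed general accident rate
    miles_per_critical_disengagement = 33    # claimed FSD critical-disengagement rate

    improvement_needed = miles_per_human_accident / miles_per_critical_disengagement
    print(f"~{improvement_needed:,.0f}x")    # ~12,121x, i.e. on the order of 10,000x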