
410 points by jjulius | 7 comments
bastawhiz (No.41889192)
Lots of people are asking how good the self-driving has to be before we tolerate it. I got a one-month free trial of FSD and turned it off after two weeks. Quite simply: it's dangerous.

- It failed with a cryptic system error while driving

- It started making a left turn far too early and would have scraped the left side of the car on a sign. I had to intervene manually.

- The default setting accelerates far too aggressively. I'd call myself a fairly aggressive driver, and it was too aggressive even for my taste.

- It repeatedly attempted right turns on red when it wasn't safe to, creeping into the road and almost into the path of oncoming vehicles.

- It didn't merge left to make room for vehicles merging onto the highway. The vehicles then tried to cut in. The system should have avoided an unsafe situation like this in the first place.

- It would switch lanes to go faster on the highway, but on at least one occasion it missed an exit because it couldn't get back into the right lane in time. Stupid.

After the system error, I lost all trust in FSD from Tesla. Until I ride in one and feel safe, I can't have any faith that this is a reasonable system. Hell, even Autopilot does dumb shit on a regular basis. I'm grateful to be getting a car from another manufacturer this year.

modeless (No.41889518)
Tesla jumped the gun on the FSD free trial earlier this year. It was nowhere near good enough at the time. Most people who tried it for the first time probably share your opinion.

That said, there is a night-and-day difference between the FSD 12.3 you experienced earlier this year and the latest version, 12.6. It will still make mistakes from time to time, but the improvement is massive and obvious. More importantly, the rate of improvement over the past two months has been much faster than before.

Yesterday I spent an hour in the car over three drives and did not have to turn the steering wheel at all except for parking. That never happened on 12.3. And I don't even have 12.6 yet; this is still 12.5, and others report that 12.6 is a noticeable improvement over it. Version 13 is scheduled for release in the next two weeks, and the FSD team has actually hit its last few release milestones.

People are right that it is still not ready, but if they think it will stay that way forever they are about to be very surprised. At the current rate of improvement it will be quite good within a year, and in two or three I could see it reaching the point where it can operate unsupervised.

snypher (No.41890163)
So just a few more years of death and injury until they reach a finished product?
londons_explore (No.41894434)
So far, the data points to it having far fewer crashes than a human alone. Tesla's data shows that, and third-party data seems to imply the same.
rvnx (No.41894584)
It disconnects in dangerous situations, roughly once every 33 to 77 miles driven (depending on the version), versus one crash per roughly 400,000 miles for a human.
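
Putting the quoted numbers side by side makes the gap concrete. A quick back-of-the-envelope sketch in Python; the figures are the claims above, not verified data:

    # Rates as claimed in this thread (illustrative only, not verified).
    miles_per_disengagement = (33, 77)  # claimed range, depending on FSD version
    miles_per_human_crash = 400_000     # claimed human baseline

    for m in miles_per_disengagement:
        ratio = miles_per_human_crash / m
        print(f"1 disengagement per {m} mi vs 1 crash per "
              f"{miles_per_human_crash:,} mi -> {ratio:,.0f}x gap")
    # Caveat: a disengagement is not a crash, so even this comparison
    # mixes two different event types.
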
llamaimperative (No.41895126)
Tesla does not release the data required to substantiate such a claim. It simply doesn't, and you're either lying or being lied to.
londons_explore (No.41895194)
Tesla releases this data: https://www.tesla.com/VehicleSafetyReport
rainsford (No.41895375)
That data is not an apples-to-apples comparison unless Autopilot is used in exactly the same mix of conditions as human driving. Tesla doesn't share that mix in the report, but I'd bet it's not equivalent. I personally tend to turn on driving-automation features (in my non-Tesla car) in easier conditions and drive myself when anything unusual or complicated is going on, and I'd bet most drivers, of Teslas and otherwise, do the same.

This is important because I'd bet data on standard, non-adaptive cruise control would similarly show it's much safer than human drivers. But of course that would be because people use cruise control mostly for long-distance highway driving outside of congested areas, where you're least likely to have an accident.
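
To see how much the condition mix alone can distort the aggregate numbers, here's a toy calculation. Every rate and mileage below is invented purely to illustrate the selection effect:

    # Hypothetical crashes-per-mile rates by driving condition.
    crash_rate = {
        "easy_highway": 1 / 1_000_000,
        "hard_city":    1 / 100_000,
    }

    # Automation engaged mostly in easy conditions; humans drive the full mix.
    auto_miles  = {"easy_highway": 95_000, "hard_city":  5_000}
    human_miles = {"easy_highway": 50_000, "hard_city": 50_000}

    def miles_per_crash(miles):
        expected_crashes = sum(m * crash_rate[c] for c, m in miles.items())
        return sum(miles.values()) / expected_crashes

    # Identical per-condition safety, yet the aggregate stats diverge:
    print(f"automation: 1 crash per {miles_per_crash(auto_miles):,.0f} miles")
    print(f"human:      1 crash per {miles_per_crash(human_miles):,.0f} miles")
    # automation: 1 crash per 689,655 miles
    # human:      1 crash per 181,818 miles

The automation looks nearly four times safer even though, per condition, it is exactly as safe as the human. That's the apples-to-oranges problem in one screenful.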

llamaimperative (No.41896186)
Per the other comment: no, they don't. This data is not enough to evaluate the system's safety. It is enough to mislead people who spend less than 30 seconds thinking about the question, though, so I guess that's something (where "something" == misdirection and dishonesty).

You've been lied to.

FireBeyond (No.41897290)
No, it releases enough data to actively mislead you (because there is no way Tesla's data people are unaware of these factors):

The report measures accidents in FSD mode, but FSD mode has qualifiers: conditions, weather, road, location, and traffic all have to meet a certain quality threshold before the system will engage (or not disable itself). Compare Sunnyvale on a clear spring day to Pittsburgh on a December night.

There's no such qualifier on the comparison baseline: all drivers, all conditions, all weather, all roads, all locations, all traffic.

It's not remotely comparable, and Tesla's data people are not that stupid, so it's willfully misleading.

This report does not include fatalities. It also doesn't count any incident without an airbag deployment as an accident. That sounds potentially reasonable until you consider:

- First-gen airbag systems were primitive: collision exceeds threshold, deploy. Modern vehicle safety systems consider impact duration, speed, G-forces, amount of intrusion, collision angle, and a multitude of other factors before deciding which systems, if any, to fire (seatbelt tensioners, airbags, etc.). So hit something at 30 mph with the right variables? Tesla: "this is not an accident".

- Tesla also does not count "incident was so catastrophic that the airbags COULD NOT deploy" as an accident, because "the airbags didn't deploy". That umbrella extends to egregious cases where systems failed to deploy for any reason, up to and including poor assembly-line quality control; those are likewise "not counted".
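
To make the counting problem concrete, here's a sketch with a hypothetical incident log; all entries are invented:

    # "Accident" defined as "airbags deployed" shrinks the headline count.
    incidents = [
        {"desc": "low-speed rear-end collision",         "airbags_fired": False},
        {"desc": "30 mph impact below firing threshold", "airbags_fired": False},
        {"desc": "highway collision",                    "airbags_fired": True},
        {"desc": "severe crash, airbags failed to fire", "airbags_fired": False},
    ]

    counted = sum(1 for i in incidents if i["airbags_fired"])
    print(f"{len(incidents)} incidents, {counted} counted as accidents")
    # -> 4 incidents, 1 counted as accidents

Three of the four hypothetical incidents simply vanish from the safety statistics under that definition.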