
410 points jjulius | 9 comments
rKarpinski No.41889014
'Pedestrian' in this context seems pretty misleading

"Two vehicles collided on the freeway, blocking the left lane. A Toyota 4Runner stopped, and two people got out to help with traffic control. A red Tesla Model Y then hit the 4Runner and one of the people who exited from it. "

edit: Parent article was changed... I was referring to the title of the NPR article.

1. danans No.41889087
> 'Pedestrian' in this context seems pretty misleading

What's misleading? The full quote:

"A red Tesla Model Y then hit the 4Runner and one of the people who exited from it. A 71-year-old woman from Mesa, Arizona, was pronounced dead at the scene."

If you exit a vehicle, and are on foot, you are a pedestrian.

I wouldn't expect FSD's object recognition system to treat a human who has just exited a car differently than a human walking across a crosswalk. A human on foot is a human on foot.

However, from the sound of it, the object recognition system didn't even see the 4Runner, much less a person, so perhaps there's a more fundamental problem with it?

Perhaps this is something that lidar or radar, if the car had them, would have helped the object recognition system see.

2. jfoster No.41889174
The description has me wondering whether this was actually a case where FSD was in use. There have been other cases in the past where drivers had an accident and claimed they were using Autopilot when they actually were not.

I don't know for sure, but I would think that the car could detect a collision. I also don't know for sure, but I would think that FSD would stop once a collision has been detected.

3. bastawhiz No.41889223
Did the article say the Tesla didn't stop after the collision?
4. FireBeyond No.41889296
> FSD would stop once a collision has been detected.

Fun fact: at least until very recently, if not to this day, AEB (automatic emergency braking) has not been part of FSD.

5. pell No.41889332
> There have been other cases in the past where drivers had an accident and claimed they were using Autopilot when they actually were not.

Wouldn’t this be logged by the event data recorder?

6. jfoster No.41889343
If it hit the vehicle and then hit one of the people who had exited the vehicle with enough force to cause a fatality, it sounds like it might not have applied any braking.

Of course, that depends on the speed it was traveling at to begin with.

7. danans No.41889576
> There have been other cases in the past where drivers had an accident and claimed they were using Autopilot when they actually were not.

If that were the case here, there wouldn't be a government probe, right? It would be a normal "multi car pileup with a fatality" and added to statistics.

With the strong incentive on the part of both the driver and Tesla to lie about this, there should be strong regulations around event data recorders [1] for self-driving systems, and huge penalties for violating them. A search across that site doesn't return a hit for the word "retention", but it's gotta be expressed in some way there.

1. https://www.ecfr.gov/current/title-49/subtitle-B/chapter-V/p...

8. potato3732842 No.41889724
Teslas were famously poor at detecting partial lane obstructions for a long time. I wonder if that's what happened here.
9. modeless No.41891537
I believe AEB can trigger even while FSD is active. Certainly I have seen the forward collision warning trigger during FSD.