
650 points clcaev | 3 comments
breadwinner:
> In the annotated video played for the jury, the vehicle detects a vehicle about 170 feet away. A subsequent frame shows it detecting a pedestrian about 116 feet away. As McGee hurtles closer and closer, the video shows the Tesla planning a path through Angulo’s truck, right where he and his girlfriend were standing behind signs and reflectors highlighting the end of the road.

So the Tesla detected the vehicle and the pedestrian, and then planned a path through them? Wow! How bad is this software?

1. toast0:
I suspect it's the dog/pig problem [1]. Many of these systems have no object permanence: if the vehicle was detected at 170 feet, it may not have remained detected as the car got closer, and the same goes for the pedestrian. We should all know by now that the Tesla system filters out fixed objects, whether stopped vehicles or signs and reflectors. It's actually pretty common for driver assistance features to filter out fixed objects outside of parking-assist speeds, but most other brands don't have drivers who overtrust the assistance features.

[1] As popularized in the movie The Mitchells vs. the Machines: https://m.youtube.com/watch?v=LaK_8-3pWKk
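The failure mode described above can be sketched as a track-by-detection loop. This is a minimal, hypothetical illustration (the `Tracker` class, the `max_misses` coasting budget, and the frame data are all invented for this sketch, not Tesla's actual pipeline): with no persistence, one missed detection erases an object from the planner's world model; with a small coasting budget, the track survives a few missed frames.

```python
# Illustrative sketch only: track-by-detection with and without
# "object permanence" (track coasting). Not any real vendor's stack.
from dataclasses import dataclass


@dataclass
class Track:
    object_id: str
    distance_ft: float
    misses: int = 0  # consecutive frames without a matching detection


class Tracker:
    def __init__(self, max_misses: int = 0):
        # max_misses = 0 models "no object permanence": a single
        # missed detection deletes the track immediately.
        self.max_misses = max_misses
        self.tracks: dict[str, Track] = {}

    def update(self, detections: dict[str, float]) -> set[str]:
        """detections maps object_id -> distance; returns live track ids."""
        for obj_id, dist in detections.items():
            if obj_id in self.tracks:
                self.tracks[obj_id].distance_ft = dist
                self.tracks[obj_id].misses = 0
            else:
                self.tracks[obj_id] = Track(obj_id, dist)
        # Age out tracks that were not matched this frame.
        for obj_id, track in list(self.tracks.items()):
            if obj_id not in detections:
                track.misses += 1
                if track.misses > self.max_misses:
                    del self.tracks[obj_id]
        return set(self.tracks)


# Frame sequence loosely modeled on the scenario discussed: the truck and
# pedestrian are detected once, then the detector misses them for two frames.
frames = [
    {"truck": 170.0, "pedestrian": 116.0},
    {},  # detector misses both
    {},  # and again
]

naive = Tracker(max_misses=0)     # no object permanence
coasting = Tracker(max_misses=2)  # coasts through up to 2 missed frames

for frame in frames:
    naive_live = naive.update(frame)
    coasting_live = coasting.update(frame)

print(naive_live)     # set() -- both objects forgotten after one miss
print(coasting_live)  # {'truck', 'pedestrian'} -- tracks survive the misses
```

The design point is the coasting budget: a planner consuming `naive`'s output would treat the road ahead as clear the instant detection flickers, which is consistent with the behavior described in the quoted passage.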

2. bdamm:
Object permanence is a huge thing, which is why Tesla made a big deal about deploying it in their stack three or four years ago.
3. toast0:
The collision was in 2019, so three or four years ago is a little late.