650 points clcaev | 25 comments
1. breadwinner ◴[] No.45064207[source]
> In the annotated video played for the jury, the vehicle detects a vehicle about 170 feet away. A subsequent frame shows it detecting a pedestrian about 116 feet away. As McGee hurtles closer and closer, the video shows the Tesla planning a path through Angulo’s truck, right where he and his girlfriend were standing behind signs and reflectors highlighting the end of the road.

So the Tesla detected the vehicle and the pedestrian, and then planned a path through them? Wow! How bad is this software?
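
A minimal sketch of one failure mode that would produce exactly this (pure speculation about the cause, not Tesla's actual code): the detections exist, but the planner only treats tracks classified as moving as obstacles, so a parked truck and standing pedestrians never block the path.

    # Hypothetical planner input, mirroring the distances in the quote.
    detections = [
        {"kind": "vehicle", "dist_ft": 170, "moving": False},     # parked truck
        {"kind": "pedestrian", "dist_ft": 116, "moving": False},  # people standing still
    ]

    def path_is_blocked(dets):
        # Bug: stationary tracks are filtered out before the path check,
        # so everything above is invisible to the planner.
        obstacles = [d for d in dets if d["moving"]]
        return any(d["dist_ft"] < 200 for d in obstacles)

    print(path_is_blocked(detections))  # False -> "clear", plans straight through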

replies(4): >>45064526 #>>45065541 #>>45065884 #>>45065936 #
2. skeezyboy ◴[] No.45064526[source]
AI is so unlike anything we've ever seen, and it's going to revolutionise the world, and it's literally gonna be Skynet. Except it pathfinds like a Counter-Strike bot; just ignore that bit.
replies(2): >>45065229 #>>45065754 #
3. xgulfie ◴[] No.45065229[source]
Just 2 more years, bro, just 2 more years and we'll have self-driving cars working. Trust me, bro.
replies(2): >>45065473 #>>45070902 #
4. AndrewKemendo ◴[] No.45065473{3}[source]
https://waymo.com/rides/san-francisco/

You can take a Waymo any time of day in SF, and they provide thousands of successful rides daily.

replies(2): >>45065631 #>>45065794 #
5. vilhelm_s ◴[] No.45065541[source]
I'm guessing they mean it detected a different vehicle and pedestrian, not the ones it hit. (If it was the victim, I don't think they would have said "a".)
6. skeezyboy ◴[] No.45065631{4}[source]
And they've had to spend how many man-hours engineering around shit like the above?
replies(3): >>45065874 #>>45066074 #>>45067269 #
7. floren ◴[] No.45065754[source]
we're still so early!
8. myvoiceismypass ◴[] No.45065794{4}[source]
I suspect OP was mocking Elon for still not delivering on what he said would be released "any day now" like what, a decade ago? The goalposts keep moving. They seem to be way behind (pun intended).
9. wuteva ◴[] No.45065874{5}[source]
Your point being?
10. Veserv ◴[] No.45065884[source]
This bad: https://vimeo.com/1093113127/e1fb6c359c

Not just detect a pedestrian and plan a path through them. Hit a pedestrian and plan a path through them to finish the job.

replies(1): >>45067659 #
11. toast0 ◴[] No.45065936[source]
I suspect it's the dog/pig problem [1]. Many of these systems have no object permanence: if a vehicle was detected at 170 feet, it may not have remained detected as the car got closer, and the same goes for the pedestrian. We should all know by now that the Tesla system filters out fixed objects, whether that's stopped vehicles or signs and reflectors. It's actually pretty common for driver assistance features to filter out fixed objects outside of parking-assist speeds... but most other brands don't have drivers who overtrust the assistance features. (Toy sketch of the permanence issue after the footnote.)

[1] As popularized in the movie The Mitchells vs. the Machines: https://m.youtube.com/watch?v=LaK_8-3pWKk
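
The toy sketch (an assumed failure mode for illustration, not any vendor's actual tracking code): per-frame detections flicker, a memoryless system forgets an object the moment one frame misses it, and a track with a short time-to-live survives the dropout.

    frames = [{"ped"}, {"ped"}, set(), set(), {"ped"}]  # two frames drop the detection

    def memoryless(frames):
        # No memory: the object exists only in frames that detect it.
        return ["ped" in f for f in frames]

    def with_ttl(frames, ttl=3):
        # Crude object permanence: keep a track alive for `ttl` frames
        # after its last detection.
        alive, out = 0, []
        for f in frames:
            alive = ttl if "ped" in f else max(alive - 1, 0)
            out.append(alive > 0)
        return out

    print(memoryless(frames))  # [True, True, False, False, True]
    print(with_ttl(frames))    # [True, True, True, True, True]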

replies(1): >>45066179 #
12. overfeed ◴[] No.45066074{5}[source]
Not all self-driving vehicles are created equal; Tesla and Waymo are not in the same league.

I'm curious: why does it matter to you how many man-hours Waymo spends on a functional service? Would it be disqualifying if it were "too much" in your estimation?

13. bdamm ◴[] No.45066179[source]
Object permanence is a huge thing, which is why Tesla made a big deal about it being deployed in the stack three or four years ago.
replies(1): >>45066469 #
14. toast0 ◴[] No.45066469{3}[source]
The collision was in 2019, so three or four years ago is a little late.
15. CamperBob2 ◴[] No.45067269{5}[source]
Driving is hard.
16. insane_dreamer ◴[] No.45067659[source]
Whoa! And just as bad: the Tesla didn't even detect that it had run a kid over, so you're also guilty of a hit-and-run. Hitting a kid who runs out from behind a car is something you could argue a human might have done as well, depending on the circumstances. But a human would not have continued driving on as if nothing happened (well, not unless they're a monster).
replies(1): >>45068115 #
17. breadwinner ◴[] No.45068115{3}[source]
Not just that: a human would have stopped for the school bus, as required by law. So the errors are (1) not stopping for the school bus, (2) running over the kid, and (3) not detecting that it ran over a kid.
replies(1): >>45068437 #
18. insane_dreamer ◴[] No.45068437{4}[source]
The problem is that this software has to be incredibly fault-tolerant, and with so much complexity that is extremely difficult. A 99.9% accuracy rate isn't good enough, because I might kill a child for every 1000 school buses I pass. It's why we still have pilots in planes, even though computers do a lot more of the work than before.
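
The compounding math, for anyone who wants it (just arithmetic on the hypothetical 99.9% figure, not a claim about any real system's failure rate):

    # Chance of at least one failure over n independent encounters
    # at a 99.9% per-encounter success rate.
    p_ok = 0.999
    for n in (100, 1000, 10000):
        print(n, round(1 - p_ok ** n, 4))
    # 100   -> 0.0952  (~10% chance of at least one failure)
    # 1000  -> 0.6323  (~63%)
    # 10000 -> 1.0     (near certainty)
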
replies(2): >>45068834 #>>45070920 #
19. breadwinner ◴[] No.45068834{5}[source]
I have taken lots of rides in Waymo, and it has always been smooth, so it is possible. The problem with Tesla is that its CEO lays down dogmatic rules, such as "humans only need eyes to drive, so the cars will only have cameras, no LiDAR." He needs to accept that robot cars cannot drive the way humans do. Robot cars can drive better than humans, but they have to do it in their own way.
replies(2): >>45070862 #>>45070872 #
20. ◴[] No.45070862{6}[source]
21. insane_dreamer ◴[] No.45070872{6}[source]
Taxis are different.

I might be willing to take a robotaxi, because if something does happen it's not my fault. Same with a bus or train. But I won't trust FSD on my own car (LiDAR or no LiDAR), except in certain circumstances like highway driving, because if something did happen, I'd be at fault even if it was the FSD that failed.

replies(1): >>45071120 #
22. ◴[] No.45070902{3}[source]
23. ndsipa_pomu ◴[] No.45070920{5}[source]
> A 99.9% accuracy rate isn't good enough because I might kill a child for every 1000 school buses I pass

You don't need a high level of accuracy for that: the rule is not to overtake a stopped school bus, so 99.9% would equate to one illegal overtake for every 1000 stopped buses encountered. Also, not every illegal bus overtake would necessarily put a child in immediate danger.
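
Back-of-the-envelope version, with made-up numbers (the 10% danger share is purely an assumption):

    encounters_per_year = 250   # one stopped school bus per commute day
    p_fail = 0.001              # the hypothetical 99.9% per-encounter accuracy
    p_child_at_risk = 0.10      # assumed share of illegal passes that endanger a child

    illegal_passes = encounters_per_year * p_fail
    print(illegal_passes)                    # 0.25/year -> one illegal pass every ~4 years
    print(illegal_passes * p_child_at_risk)  # 0.025/year -> one dangerous event every ~40 years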

replies(1): >>45072104 #
24. breadwinner ◴[] No.45071120{7}[source]
> I'd be at fault even if it was the FSD that failed

Unless you're driving a Mercedes with Drive Pilot [1], in which case Mercedes accepts liability [2]. Drive Pilot is not FSD yet, but presumably, as it acquires more capabilities, Mercedes will continue its policy of accepting liability.

[1] https://www.mbusa.com/en/owners/manuals/drive-pilot

[2] https://www.roadandtrack.com/news/a39481699/what-happens-if-...

25. insane_dreamer ◴[] No.45072104{6}[source]
You could encounter a stopped bus every day on your commute.

You certainly do need that level of accuracy for critical events.