410 points jjulius | 16 comments
    rootusrootus ◴[] No.41892630[source]
    I'm on my second free FSD trial, just started for me today. Gave it another shot, and it seems largely similar to the last free trial they gave. Fun party trick, surprisingly good, right up until it's not. A hallmark of AI everywhere, is how great it is and just how abruptly and catastrophically it fails occasionally.

    Please, if you're going to try it, keep both hands on the wheel and your foot ready for the brake. When it goes off the rails, it usually does so in surprising ways with little warning and little time to correct. And since it's so good much of the time, you can get lulled into complacence.

    I never really understand the comments from people who think it's the greatest thing ever and makes their drive less stressful. Does the opposite for me. Entertaining but exhausting to supervise.

    replies(5): >>41894715 #>>41896317 #>>41896773 #>>41898129 #>>41898671 #
    darknavi ◴[] No.41894715[source]
    You slowly build a relationship with it and understand where it will fail.

    I drive my 20-30 minute commutes largely with FSD, as well as our 8-10 hour road trips. It works great, but 100% needs to be supervised and is basically just nicer cruise control.

    replies(4): >>41895075 #>>41895464 #>>41895891 #>>41895943 #
    coffeefirst ◴[] No.41895075[source]
    This feels like the most dangerous possible combination (not for you, just to have on the road in large numbers).

    Good enough that the average user will stop paying attention, but not actually good enough to be left alone.

    And when the machine goes to do something lethally dumb, you have 5 seconds to notice and intervene.

    replies(2): >>41895427 #>>41895956 #
    jvolkman ◴[] No.41895427[source]
    This is what Waymo realized a decade ago and what helped define their rollout strategy: https://youtu.be/tiwVMrTLUWg?t=247&si=Twi_fQJC7whg3Oey
    replies(1): >>41895700 #
    1. nh2 ◴[] No.41895700[source]
    This video is great.

It looks like Waymo really understood the problem.

It explains concisely why it's a bad idea to roll out incremental progress, how difficult the problem really is, and why you should really throw every sensor you can at it.

    I also appreciate the "we don't know when it's going to be ready" attitude. It shows they have a better understanding of what their task actually is than anybody who claims "next year" every year.

    replies(3): >>41895788 #>>41896208 #>>41904273 #
    2. yborg ◴[] No.41895788[source]
    You don't get a $700B market cap by telling investors "We don't know."
    replies(2): >>41895903 #>>41896777 #
    3. rvnx ◴[] No.41895903[source]
    Ironically, Robotaxis from Waymo are actually working really well. It's a true unsupervised system, very safe, used in production, where the manufacturer takes the full responsibility.

    So the gradual rollout strategy is actually great.

Tesla wants to do "all or nothing", and for now ends up with nothing (for example in Europe, where FSD has been sold since 2016 but remains "pending regulatory approval", when actually the problem is that the tech is not finished yet, sadly).

    It's genuinely a difficult problem to solve, so it's better to do it step-by-step than a "big-bang deploy".

    replies(2): >>41896634 #>>41897819 #
    4. trompetenaccoun ◴[] No.41896208[source]
All their sensors didn't prevent them from crashing into a stationary object. You'd think that would be the absolute easiest thing to avoid, especially with both radar and lidar on board. Accidents like that show that the training data and software will matter much more than the number of sensors.

    https://techcrunch.com/2024/06/12/waymo-second-robotaxi-reca...

    replies(1): >>41896467 #
    5. rvnx ◴[] No.41896467[source]
The issue was fixed; they're now handling 100,000 trips per week, and all seems to have gone well over the last 4 months. That's 1.5 million trips.
    replies(2): >>41896970 #>>41896990 #
    6. mattgreenrocks ◴[] No.41896634{3}[source]
    Does Tesla take full responsibility for FSD incidents?

    It seemed like most players in tech a few years ago were using legal shenanigans to dodge liability here, which, to me, indicates a lack of seriousness toward the safety implications.

    replies(1): >>41900980 #
    7. zbentley ◴[] No.41896777[source]
    Not sure how tongue-in-cheek that was, but I think your statement is the heart of the problem. Investment money chases confidence and moonshots rather than backing organizations that pitch a more pragmatic (read: asterisks and unknowns) approach.
    8. trompetenaccoun ◴[] No.41896970{3}[source]
    So they had "better understanding" of the problem as the other user put it, but their software was still flawed and needed fixing. That's my point. This happened two weeks ago btw: https://www.msn.com/en-in/autos/news/waymo-self-driving-car-...

    I don't mean Waymo is bad or unsafe, it's pretty cool. My point is about true automation needing data and intelligence. A lot more data than we currently have, because the problem is in the "edge" cases, the kind of situation the software has never encountered. Waymo is in the lead for now but they have fewer cars on the road, which means less data.

    9. jraby3 ◴[] No.41896990{3}[source]
    Any idea how many accidents and how many fatalities? And how that compares to human drivers?
    10. nh2 ◴[] No.41897819{3}[source]
    > So the gradual rollout strategy is actually great.

    I think you misunderstood, or it's a terminology problem.

Waymo's point in the video is that, in contrast to Tesla, they are _not_ doing a gradual rollout of seemingly-working-but-still-often-catastrophically-failing tech.

See e.g. minutes 5:33 -> 6:06. They state that they are directly targeting the upper safety curve shown there, and that they are not aiming for "good enough that the average user will stop paying attention, but not actually good enough to be left alone".

    replies(1): >>41902542 #
    11. valval ◴[] No.41900980{4}[source]
    What does that mean? Tesla’s system isn’t unsupervised, so why would they take responsibility?
    replies(1): >>41902194 #
    12. x3ro ◴[] No.41902194{5}[source]
    I don't know, maybe because they call it "Full Self-Driving"? :)
    replies(2): >>41902539 #>>41907111 #
    13. ◴[] No.41902539{6}[source]
    14. espadrine ◴[] No.41902542{4}[source]
    Terminology.

    Since they targeted very low risk, they did a geographically-segmented rollout, starting with Phoenix, which is one of the easiest places to drive: a lot of photons for visibility, very little rain, wide roads.

    15. friendzis ◴[] No.41904273[source]
> It looks like Waymo really understood the problem.

    All they needed was one systems safety engineering student

    16. valval ◴[] No.41907111{6}[source]
Doesn't really matter what they call it. Whether the product name accurately describes the current product is a different topic.

    For what it's worth, I wouldn't care if they called it "Penis Enlarger 9000" if it drove me around like it now does.