    129 points by xnx | 13 comments
    1. Workaccount2 ◴[] No.45159284[source]
    Google is really shooting themselves in the foot with AI overviews.

    It's probably the most popular AI on earth by daily queries, yet likely only an ~8B-parameter-level model. That means a whole bunch of people equate Google AI with AI Overviews.

    replies(6): >>45159506 #>>45159543 #>>45159566 #>>45162173 #>>45163743 #>>45166114 #
    2. ◴[] No.45159506[source]
    3. bee_rider ◴[] No.45159543[source]
    I wonder to what extent that “generate some garbage spam along with every search” has hurt their reputation among the general public.
    replies(1): >>45162497 #
    4. ◴[] No.45159566[source]
    5. whimsicalism ◴[] No.45162173[source]
    Yeah, I think a lot of people I speak to have developed the impression that modern AI hallucinates more than it actually does, based on the dumb search model.
    6. GlitchInstitute ◴[] No.45162497[source]
    Probably close to zero? People get the answers for most questions.

    Small-time blogs were dead before AI

    replies(1): >>45164552 #
    7. danpalmer ◴[] No.45163743[source]
    I'm a bit biased, but I find the AI Overviews to be basically great. All I want from a search engine is the correct answer. Google's knowledge graph has done that for many queries for a long time, and AI Overviews seem like a good next step in that process.

    I've not seen many hallucinations, and fact-checking is fairly straightforward with the onward links. It's not like I can take any linked content at face value anyway; I'd still want to fact-check when it makes sense, even if it wasn't AI-written.

    replies(2): >>45165045 #>>45171502 #
    8. ruszki ◴[] No.45164552{3}[source]
    People get an answer, and not the answer. It seems that people are fine with this. Even this article's example answer is false. The author didn't care, and considered it good.
    replies(1): >>45164586 #
    9. sometimes_all ◴[] No.45164586{4}[source]
    I believe that Google has figured out (correctly, IMO) that accuracy doesn't matter to most people 99% of the time, and people will likely do a deeper search for the 1% of the time they do want it.

    If people really wanted the truth and facts, we would not have misinformation spread this widely via social media and other places.

    replies(1): >>45166856 #
    10. squigz ◴[] No.45165045[source]
    I constantly get blatant hallucinations. It particularly likes to take feature requests/suggestions and tell me they're presently possible, when they're not. It's long past the point where I just ignore the AI overviews entirely.
    11. nicbou ◴[] No.45166114[source]
    They're also shooting themselves in the foot by halving the traffic to the websites that provide the training data.
    12. utyop22 ◴[] No.45166856{5}[source]
    People would rather be lazy and sit and do nothing than move around to achieve a goal.

    Is that a good thing? The reality is that most humans are becoming more and more intellectually lazy, and as a result their cognitive functions are in decline. So if something looks right at face value or supports an internal bias, they take it and run with it.

    13. jjani ◴[] No.45171502[source]
    You've been asking about things that are stated verbatim in the first few links. Ask anything that isn't, and it will confidently hallucinate.