
693 points by jsheard | 1 comment
zozbot234 No.45093571
AI makes stuff up, film at 11. It's literally a language model. It's just guessing what word follows another in a text, that's all it does. How's this different from the earlier incidents where that same Google AI would suggest that you should put glue on your pizza or eat rocks as a tasty snack?
replies(3): >>45093579 >>45093603 >>45093661
anonymars No.45093661
What's your point? That it's okay? That it should be normalized?
replies(1): >>45093950
zozbot234 No.45093950
Maybe if it was normalized, people would no longer trust those "AI overviews" as anything other than silly entertainment.
replies(1): >>45094048
anonymars No.45094048
I understand what you're saying in principle, but empirically society doesn't seem able to do this now, even setting aside AI hallucinations. So in practical terms, given the society we do have, what should we do?