
321 points | distantprovince | 1 comment
1. creatonez (No.44631161)
It has always been rude to tell someone to "just google it". But unsurprisingly, as Google Search has gotten worse over time, doing so has become even ruder. A search that in 2005-2010 might have led you to useful documentation or high-quality tutorials now often just returns SEO garbage. If you aren't at least checking what the results are yourself, telling someone to google it has become worse than useless.

If LLMs can be fixed to the point where they are as reliable as 2005-2010 Google, maybe you can start blindly pasting their output or telling people to "just ChatGPT it", and it won't be so useless for the victim anymore. But I'm not convinced the hallucination problem and the inability to properly cite sources will be solved anytime soon, given how non-deterministic LLMs are. And they appear to be creating a brand-new SEO spam problem, with Gemini summaries at the top of the page synthesized from the contents of spammy results.
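The non-determinism mentioned above comes from how LLMs decode: unless they use greedy (temperature-0) decoding, they sample the next token from a probability distribution, so the same prompt can yield different answers run to run. Here is a toy sketch of that mechanism, with made-up token scores (the `logits` values are illustrative, not from any real model):

```python
import math
import random

def softmax(logits, temperature=1.0):
    """Turn raw scores into a probability distribution.
    Higher temperature flattens it, increasing variability."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample_token(logits, temperature, rng):
    """Greedy (deterministic) at temperature 0, random sampling otherwise."""
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    probs = softmax(logits, temperature)
    return rng.choices(range(len(logits)), weights=probs, k=1)[0]

# Hypothetical next-token scores for one fixed prompt.
logits = [2.0, 1.5, 0.3]

# Greedy decoding: the same prompt always picks the same token.
greedy = {sample_token(logits, 0, random.Random(i)) for i in range(10)}

# Temperature sampling: the same prompt yields different tokens across runs.
sampled = {sample_token(logits, 1.0, random.Random(i)) for i in range(100)}
```

With temperature 0 the `greedy` set collapses to a single token, while `sampled` contains several, which is why two identical questions to a chatbot can produce different (and differently wrong) answers.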