
Tim Bray on Grokipedia

(www.tbray.org)
175 points by Bogdanp | 1 comment
hocuspocus No.45777495
I checked a topic I care about, and that I have personally researched because the publicly available information is pretty bad.

The article is even worse than the one on Wikipedia. It follows the same structure but fails to tell a coherent story. It references random people on Reddit (!) who don't even support the point it's trying to make. Not that the information on Reddit is particularly good to begin with, even if it were properly interpreted. It cites Forbes articles parroting pretty insane and unsubstantiated claims; I thought mainstream media was not to be trusted?

In the end it's longer, written in a weird style, and doesn't really bring any value. Asking Grok about the same topic and instructing it to be succinct yields much better results.

jameslk No.45777570
It was just launched? I remember when Wikipedia was pretty useless early on. The concept of using an LLM to take a ton of information and distill it into encyclopedia form seems promising with iteration and refinement. If they add an editor step to clean things up, that would likely help a lot (not sure if they already do this).
9dev No.45777761
Nothing about that seems promising! The one thing you want from an encyclopedia is factual information compressed into high-density overviews. You need to be able to trust the article to be faithful to its sources. Wikipedia mods are super anal about that, and for good reason! Why on earth would we want a technology that's as good at summarisation as it is at hallucination to write encyclopaedia entries?? You can never trust it to be faithful to the sources. On Wikipedia, at least there are lots of people checking on each other. There are no such guardrails for an LLM.

You would have to trust a single publisher with a technology that lets them crank out millions of entries and updates continuously, so fast that you could never detect subtle changes, errors, or biases targeted in a specific way. And that doesn't even account for most people, who never bother to question an article, let alone check the sources.

If there ever was a tool suited just perfectly for mass manipulation, it’s an LLM-written collection of all human knowledge, controlled by a clever, cynical, and misanthropic asshole with a god complex.

mixedump No.45778746
> If there ever was a tool suited just perfectly for mass manipulation, it’s an LLM-written collection of all human knowledge, controlled by a clever, cynical, and misanthropic asshole with a god complex.

It’s painful to watch how many people (a critical mass) don’t understand this — and how dangerous it is. When you combine that potential, if not likely, outcome with the fact that people are trained or manipulated into an “us vs. them” way of thinking, any sensible discussion point that lies somewhere in between, or any perspective that isn’t “I’m cheering for my own team no matter what,” gets absorbed into that same destructive thought process and style of discourse.

In the end, this leads nowhere, which is extremely dangerous. It creates nothing but "useful idiot"-style implicit compliance, hidden behind a self-perceived sense of "deep thinking" or "seeing the truth that the idiots on the other side just don't get." That mindset, because it feeds the perfect enemy, the human ego, is the perfect mechanism for making followers obey and keep following "leaders" who are merely pushing their own interests and agendas, even as people inflict damage on themselves.

This dynamic ties into other psychological mechanisms beyond the ego trap (e.g., the sunk cost fallacy), easily keeping people stuck indefinitely on the same self-destructive path — endangering societies and the future itself.

Maybe, eventually, humanity will figure out how to deal with this — with the overwhelming information overload, the rise of efficient bots, and other powerful, scalable manipulation tools now available to both good and bad actors across governments and the private sector. We are built for survival — but that doesn’t make the situation any less concerning.