Amazing that Musk did it first. (Although it was suggested to him as part of an interview a month before release).
These systems are very good at finding obscure references that were overlooked by mere mortals.
(Unfortunately, Reply-Grok may have been successfully partially lobotomized for the long term, now. At the time of writing, if you ask grok.com about the 2020 election it says Biden won and Trump's fraud claims are not substantiated and have no merit. If you @grok in a tweet it now says Trump's claims of fraud have significant merit, when previously it did not. Over the past few days I've seen it place way too much charity in right-wing framings in other instances, as well.)
Is it though?
LLMs are great at answering questions based on information you make available to them, especially if you have the instincts and skill to spot when they are likely to make mistakes and to fact-check key details yourself.
That doesn't mean that using them to build a knowledge base itself is a good idea! We need reliable, verified knowledge bases that LLMs can make use of.
It feels like we've reached Peak Stupidity but it's clear it can (and likely will) get much worse with AI videos.
Most people who believe bullshit were convinced by something. It might not have been fully rational, but there is usually a kernel of something there that triggered that belief. They also have probably heard at least the surface-level version of the opposing argument at some point before. Too many debunkers just reiterate the surface argument without engaging with whatever is actually convincing their opponent. Then when it doesn't land, they complain their opponent is brainwashed. Which sometimes might even be true, but sometimes their argument just misses the point of why their opponent believes what they do.
Here's a short list of RW conspiracy theories with real life political consequences:
- Antivax conspiracies
- Barack Obama wasn't born in the United States ("birther" conspiracy)
- Biden / Ukraine conspiracy theory
- The litany of Covid-19 conspiracy theories
- The "deep state" conspiracy theory
- Sarah Palin's "death panels" conspiracy theory
- Sandy Hook was fake
- 2020 Election Fraud
- Trump / Ukraine conspiracy theory
- QAnon
Well, no, it hasn’t. It has debunked some things. It has made some incorrect shit up. But it isn’t historically one of the “biggest debunkers” of anything. Do we only speak hyperbole now?
Fox (and others like it) offer 24/7 propaganda based on fear and anger, repeating lies ad nauseam. It's highly effective -- I've seen the results first-hand.
Making ad hominem attacks against "debunkers" doesn't make your case.
And again, trying to change people's minds by telling them what they believe is wrong is a fool's errand (99.99% of the time). But it still needs to happen, as that misinformation should not go unchallenged.
In theory, using LLMs to summarize knowledge could produce a less biased and more comprehensive output than human-written encyclopedias.
Whether Grokipedia will meet that challenge remains to be seen. But even if it doesn't, there's opportunity for other prospective encyclopedia generators to do so.
For an example on the left, there are people who theorize that the guy who missed putting a bullet in Trump's brain must've been a false flag operator. Although it must be mentioned that "leftie" conspiracy theories mostly stay on social media, while "right" ones end up being broadcast by congresspeople and senators, probably because they know their side will take them at face value.
Wikipedia is really not ideal for the LLM age, where multiple perspectives can be rapidly generated. There are many topics around which clusters of justified true beliefs and reasonable arguments may ALL be valid. And no, I am not talking about "flat earth" pages or other similar nonsense.
"Biggest" is tough to quantify, but "most significant" and "most effective" is what I meant. I use Twitter way too many hours a day basically every day and have a morbid fixation on diving deep into right and far-right rabbit holes there. (Like, on thousands of occasions.)
Grok is without a doubt the single most important contributor to convincing believers of right-wing conspiracy theories that maybe the theories aren't as sound as they thought. I have seen this play out hundreds of times. Grok often serves as a kind of referee or tiebreaker in threads between right-wing conspiracy theorists and debunkers, and it typically sides overwhelmingly with the debunkers. (Or at least used to.) And it does it in a way that validates the conspiracy theorist's feelings, so it's less likely to trigger a psychological immune system response.
https://www.reddit.com/r/GROKvsMAGA/ contains some examples. These may seem cherry-picked, but they generally aren't. (You might need to look at some older posts now that Elon has put increasing pressure on the Grok and Grokipedia developers to keep it """anti-woke""".)
When a right-wing conspiracy theorist sees some liberal or leftist call them out for their falsehoods, they respond with insults or otherwise dismiss or ignore it. When daddy Elon's Grok tells them - politely - that what they believe is complete horseshit, they react differently. They often respond to it 3 - 20 times, poking and prodding. Of course, most still come away from it convinced Grok is just compromised by the wokes/Jews/whatever. But some seem to actually eventually accept that, at the least, maybe they got some details wrong. It's a very fascinating sight. I almost never see that reaction when they argue with human interlocutors.
To be clear, it was never perfect. For example, if you word things in just the right way and ask leading questions, then like with any LLM (especially one that needs to respond in under 280 characters) you can often eventually coax it into saying something close to what you want. I have just seen many instances where it cuts through bullshit in a way that a leftist arguing with a Nazi can't really do.
https://en.wikipedia.org/wiki/Charlie_Kirk#Assassination
https://grokipedia.com/page/Charlie_Kirk : Assassination Details and Investigation
This is an active case that has not gone to trial, and the alleged text messages and Discords have not had their forensics cross-examined. Yet Grokipedia is already citing them as fact, not allegation. (What is considered the correct neutral way to report on alleged facts in active cases?)
1. The Iraq war was a plot to steal oil reserves
2. World Economic Forum / IMF intentionally impoverish nations
3. Police across America are systematically hunting and executing Black men (thousands per year), but are protected by racist institutions
4. Trump assassination attempts were false flag operations
5. Big Pharma deliberately hides natural cures for cancer to protect corporate profits
Why? We're not nominating a saint or electing a Pope.
If someone has a certain opinion, they're free to argue it here. There's no need to invent imaginary opinions and pretend to advocate for them when there are so many actual HN users.
https://forward.com/news/467423/adl-may-have-violated-wikipe...
But also the ADL is accusing others of covert campaigns: https://wassermanschultz.house.gov/news/documentsingle.aspx?...
So I am sure this is a thing among corporations/NGOs. Note that I picked the ADL because I happened to know about this case, not because I am trying to make a point about the ADL's purpose. I am also not really answering the part about progressives, although the ADL is arguably a progressive NGO. I think there are astroturfing campaigns on Wikipedia, whether progressive or not.
"Waaahhh! How fucking dare you!"
Kimmel made fun of Trump talking about his ballroom when being asked about Kirk, and the right got offended and mad. Although it's not about feelings, it's more about exploiting a tragedy to advance their goals (in this case getting a critic like Kimmel off the air).
I, a left-leaning person who detests Elon Musk and what he's done to Twitter and who generally trusts and likes Wikipedia, feel no shame or regret in assessing Grokipedia, even if I figured it was just going to be the standard tribalistic garbage (which it indeed turned out to be).
*(The same is true of left-wing conspiracy theories. Right-wing conspiracy theorists are far more common and, on average, believe in far more delusional and obviously false conspiracy theories than left-wingers do, but it's important not to forget that left-wing conspiracy theorists exist. I have dealt with some. They're arguably worse in some ways, since they tend to be more intelligent and so are better able to come up with plausible rationalizations to contort their minds into pretzels.)
There's a big difference between listening to other perspectives and inventing other perspectives.
Why not let the believers of other perspectives argue for those perspectives? Wouldn't they be the best advocates? And if nobody believes the perspective you've invented, then perhaps it wasn't worth discussing after all.
Again, we're not really lacking in volume of commenters here.
Although he's more populist-left and I'm more establishment-liberal (and so I might find him a bit overly conciliatory with certain conspiracy theorists), Andrew Callaghan of Channel 5/All Gas No Brakes demonstrates a good example of this in the first few minutes of this video: https://youtu.be/QU6S3Cbpk-k?t=38
It's a trite point, and I ended up repeating it before seeing your post, but this really is very true even if it may not seem like it. On one hand the practice is basically futile. But someone absolutely needs to do it. People need to do it. The ecosystem can't only ever contain the false narratives, because that leads to an even worse situation. "Here's why Holocaust denialism is incorrect and why the 271k number is wrong" is essentially pointless, per Sartre, but it's better for neo-Nazis to be exposed to that rather than "one should never even humor Holocaust denialists".
It's like 50x less of an issue, but I deal with so many left-wing conspiracies on a daily basis. I think the right is much worse than the left (on this topic and in general), but quite a lot of the left, or at least the populist left/populist far-left, is, to me, its own particular sort of exhausting insufferability. I am proudly a left-liberal and not a centrist, and always will be, but I am still at a point where I can no longer tolerate a big sub-faction of the left. (Though I can't tolerate basically any of the right, minus a bit of the anti-Trump center-right.) I am going to lose my mind when I see vast numbers of leftists demand people not vote for the Democratic Party presidential candidate in 2028.
While it's difficult to deny Trump was a de facto asset of Putin in many ways, a surprising number of people were almost entering right-wing conspiracy theory territory with their epistemological practices regarding Trump's personal involvement with Putin.
Right-wing conspiracism is orders of magnitude worse and more frequent than left-wing conspiracism, but some people were way too willing to believe some of the more radical Russian collusion speculation despite no evidence.
That's one of the reasons I object to the term. People often use "devil's advocate" to state their opinions while providing plausible deniability in the face of criticism of those opinions. Just be honest, stand behind your stated opinions, and take whatever heat comes from that honesty.
Also, I think it's important to separate "left of center" and "leftist". Liberals and leftists are very different. "Progressive left-liberals" are fans of democracy and freedom and don't like bigotry and authoritarianism and Trump. "Leftists" are often fans of Lenin and Stalin and Pol Pot and killing groups of people who aren't ideologically aligned and instating one-party dictatorships and violently suppressing dissent. In leftist parlance, "leftist" = "Marxist" while "liberal" = "capitalist belonging to the moderate wing of fascism". In the US, politics is best described as not two but four factions: leftists, liberals, rightists, and neo-Nazis. Often neo-Nazis will form coalitions with the rightists to help achieve major goals; historically leftists would form coalitions with the liberals, but this seems to be occurring less and less.
Although leftists will insist the notion is absurd and anti-intellectual, horseshoe theory contains a lot of truth.
Humans looking through sources, applying knowledge from print articles and real-world experience to sift through the data: that seems far more valuable.
That's kind of been my impression too. Not that it's terribly biased or anything but just rather boring to read.
That's in contrast to other topics, the nuances of which even seasoned experts could disagree about. Any discussion on that could devolve into the nuances of the topic rather than Grokipedia itself. But it's fair to assume the topmost expert on Tim Bray is Tim Bray, so we should be getting a pretty unbiased review.
As such it could be a useful insight into how Grok and Grokipedia and its owners operate.
> What if I told you a single person, soon to be a trillionaire, would like to replace it with one he controls himself. Why wouldn't that bother you more?
I didn't say anything about Grokipedia; I don't have an opinion on it presently. Couldn't the same argument be applied that he's just an interested party? Grok could be used to edit Wikipedia in a covert campaign, for that matter. I think both preventing LLM edits and relying on them are problematic, but it's probably inevitable, and I may already be late to the party. I don't know what percentage of Wikipedia edits are done by LLMs, but let's say it's not 0%.
I agree that one catches more flies with honey rather than vinegar, but many times it doesn't matter what you say or how you say it -- they're gonna stick to their guns. A prime example of this is in Jordan Klepper interviews where he asks Trump supporters how they feel about something horrible that Biden did, to which they express their indignation; then he reveals that it was actually Trump and they dismiss it because it "doesn't matter".
The perception of bias in Wikipedia remains, and if LLMs can detect and correct for bias, then Grokipedia seems at least a theoretical win.
I'm happy with at least a set of links for further research on a topic of interest.
I've seen this too and agree. It's surprising how well it accomplishes that referee role today, though I wonder how much of that is just because many right-wingers truly expect Grok to be similarly right-wing to them as Elon appears to intend it to be. It's going to be sad when Elon eventually gets more successful at beating it into better following his ideology.
When Grok says something factual that Elon doesn't like, he puts his thumb on the scale and changes how Grok responds (see the whole South African white 'genocide' business). So why should we trust that an LLM will objectively detect bias, when the people in charge of training that LLM prefer that it regurgitate their preferred story rather than what is objectively true?
Generally, no.
With a limited domain of verifiable facts, you could perhaps measure a degree of deviation from fact across different questions, though how you get a distance measure that works for more than one question and meaningfully aggregates across many is slippery without getting into subjective areas. Constructing a measure of directionality objectively would be even harder.
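To make that concrete, here is a minimal sketch of the "limited domain of verifiable facts" idea in Python. Everything in it is a placeholder for illustration: the two-question benchmark, the crude containment check, and the ask_model callback standing in for whatever model you are probing. It only tries to measure the magnitude of deviation from checkable facts, not its direction.

```python
from typing import Callable

# Hypothetical benchmark: a tiny set of questions with verifiable answers.
FACT_QUESTIONS = {
    "Who won the 2020 US presidential election?": "joe biden",
    "In what year did World War II end?": "1945",
}

def deviation_score(ask_model: Callable[[str], str]) -> float:
    """Fraction of verifiable questions the model gets wrong (0.0 = no deviation).

    This only measures the magnitude of deviation from checkable facts;
    it says nothing about the direction of any bias, which is the harder problem.
    """
    wrong = 0
    for question, expected in FACT_QUESTIONS.items():
        reply = ask_model(question).strip().lower()
        if expected not in reply:  # crude containment check; real grading is much harder
            wrong += 1
    return wrong / len(FACT_QUESTIONS)

if __name__ == "__main__":
    # Stub "model" for demonstration only; a real test would call an LLM API here.
    canned = {
        "Who won the 2020 US presidential election?": "Joe Biden won.",
        "In what year did World War II end?": "It ended in 1945.",
    }
    print(f"Deviation from verified facts: {deviation_score(lambda q: canned[q]):.2f}")
```

Even in this toy form, the problems show up immediately: the grading rule becomes subjective the moment answers stop being one unambiguous string, and aggregating across questions assumes they all carry equal weight, which is itself a judgment call.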
No, that isn’t even remotely comparable. One person having total control over the content and tone of every single article is not the same thing as millions of independent contributors. Especially if your complaint is /bias/, which is the subject of this thread.
Yes, agents provocateurs are a persistent threat for delegitimizing protests.
An in-depth look at the problem: https://acleddata.com/report/demonstrations-and-political-vi...
The technology behind it doesn't matter. Show me the incentives and I'll tell you the results: Wikipedia is decentralized, Grokipedia has a single owner.
If there's a perception of bias, where is it coming from? It's clearly perception born from extreme political bias of the performers. Addressing that sort of perception by changing the content means increasing bias.
Therefore the only logical route forward is to hash out instances of perceived bias and address them, exposing them as the bias themselves.
I have yet to see conservatives bring up a single subject that Wikipedia allegedly silences out of ideology that is not an obviously false conspiracy theory. In this light, Wikipedia may appear to have a left-wing bias, but only because the modern right has gotten so divorced from reality that not relaying their propaganda feels like bias against them.
The problem with nihilism is that it’s wrong.
It's vastly more honest to say, "I'm not sure about this" than "Devil's advocate: blah blah blah". Besides hiding behind the devil, the devil's advocate makes the devil look more confident than he should be.
>There's a big difference between listening to other perspectives and inventing other perspectives.
While there's a big difference, the difference doesn't invalidate thinking through issues and searching for the actual conflicting views. "Devil's advocate" is a common enough term; what's the big deal? Is it the word "devil"? Do you think someone is calling you Satan?
There's a potential for dishonesty, but lack of honesty can also mean just opacity or reticence. Either way, openness and honesty are superior.
I do think that sometimes people say "devil's advocate" when it's their own opinion but an "unpopular" opinion that they may be embarrassed to admit, so they hide behind the devil, pretending they're not the devil themselves.
> "Devil's advocate" is a common enough term, whats the big deal? Is it the word "devil"? Do you think someone is calling you Satan?
No. The issue is not the term. A different term would not help. But the term is instructive about its own usage. In the Catholic Church, nobody wanted to argue against a potential saint, so someone had to be specifically appointed by the Church to argue the other side, a position the arguer didn't necessarily believe. The problem with devil's advocates online is that they're self-appointed for some reason, despite the fact that usually there are already people who sincerely believe that opinion and would argue for it, without the need for a devil's advocate. The Catholic Church canonization process is completely different from online arguments, and there's no need for the special role of the devil's advocate.