
Tim Bray on Grokipedia

(www.tbray.org)
175 points by Bogdanp | 85 comments
1. tptacek ◴[] No.45777117[source]
Why give it oxygen?
replies(6): >>45777142 #>>45777160 #>>45777311 #>>45777327 #>>45777329 #>>45777411 #
2. mensetmanusman ◴[] No.45777142[source]
It's a great idea to share knowledge bases collected and curated by LLMs.

Amazing that Musk did it first. (Although it was suggested to him as part of an interview a month before release).

These systems are very good at finding obscure references that were overlooked by mere mortals.

replies(2): >>45777161 #>>45777313 #
3. meowface ◴[] No.45777160[source]
To play devil's advocate: Grok has historically actually been one of the biggest debunkers of right-wing misinformation and conspiracy theories on Twitter, contrary to popular conception. Elon keeps trying to tweak its system prompt to make it less effective at that, but Grokipedia was worth an initial look from me out of curiosity. It took me 10 seconds to realize it was ideologically-motivated garbage and significantly more right-biased than Wikipedia is left-biased.

(Unfortunately, Reply-Grok may have been successfully partially lobotomized for the long term, now. At the time of writing, if you ask grok.com about the 2020 election it says Biden won and Trump's fraud claims are not substantiated and have no merit. If you @grok in a tweet it now says Trump's claims of fraud have significant merit, when previously it did not. Over the past few days I've seen it place way too much charity in right-wing framings in other instances, as well.)

replies(4): >>45777225 #>>45777240 #>>45777294 #>>45777386 #
4. simonw ◴[] No.45777161[source]
"It's great idea to share knowledge bases collected and curated by LLMs"

Is it though?

LLMs are great at answering questions based on information you make available to them, especially if you have the instincts and skill to spot when they are likely to make mistakes and to fact-check key details yourself.

That doesn't mean that using them to build a knowledge base itself is a good idea! We need reliable, verified knowledge bases that LLMs can make use of.

replies(1): >>45777482 #
5. pstuart ◴[] No.45777225[source]
The problem of debunking right-wing misinformation is that it doesn't seem to matter. The consumers of that misinformation want it, and those of us who think it's bad for society already know that it's garbage.

It feels like we've reached Peak Stupidity but it's clear it can (and likely will) get much worse with AI videos.

replies(6): >>45777259 #>>45777318 #>>45777362 #>>45777581 #>>45778665 #>>45784645 #
6. tptacek ◴[] No.45777240[source]
Wikipedia is probably in the running for one of the greatest contributions to public knowledge of the past 100 years, and that's a consequence of how it functions, warts and all. I don't care how good Grok is or isn't. I'm a fan of frontier model LLMs. They don't meaningfully replace Wikipedia.
replies(3): >>45777432 #>>45777452 #>>45778125 #
7. jayd16 ◴[] No.45777294[source]
It's not controlled by a trusted actor so it doesn't matter how it happens to act at the moment.

They could pull the rug at any future time, and it's almost better to gain trust now and cash in that trust later.

replies(2): >>45777408 #>>45777438 #
8. tshaddox ◴[] No.45777311[source]
Same reason you posted that comment: it's sometimes interesting to discuss a thing even if you dislike the thing.
replies(1): >>45777484 #
9. jayd16 ◴[] No.45777313[source]
> collected and curated by LLMs.

Wah? LLMs don't collect things.

I mean, if any of these AI companies want to open up all their training data as a searchable archive, I'd be all for it.

10. J_McQuade ◴[] No.45777315{4}[source]
Name one.
replies(4): >>45777423 #>>45777447 #>>45777483 #>>45779423 #
11. bawolff ◴[] No.45777318{3}[source]
I think there is a problem sometimes that "debunkers" are often more interested in scoring points with secondary audiences (i.e. people who already agree with them) than actually convincing the people who believe the misinformation.

Most people who believe bullshit were convinced by something. It might not have been fully rational, but there is usually a kernel of something there that triggered that belief. They also probably have heard at least the surface-level version of the opposing argument at some point before. Too many debunkers just reiterate the surface argument without engaging with whatever is convincing their opponent. Then when it doesn't land they complain their opponent is brainwashed. Which sometimes might even be true, but sometimes their argument just misses the point of why their opponent believes what they do.

replies(3): >>45777407 #>>45777444 #>>45777613 #
12. LastTrain ◴[] No.45777334[source]
Proof? More than a couple anecdotes please.
replies(3): >>45777354 #>>45777360 #>>45777488 #
13. jsheard ◴[] No.45777343[source]
Grokipedia only seems to solve astroturfing by ramping gatekeeping up to 11, not allowing anyone outside of xAI to directly influence content or policy, or even observe the decision making processes that go into it. It stands to reason that you can't astroturf a brick wall.
replies(2): >>45777370 #>>45777481 #
14. TheBlight ◴[] No.45777354{3}[source]
Although I'm sure it's been a blast, we don't need to play by your rules any more.
replies(1): >>45777675 #
15. pureagave ◴[] No.45777360{3}[source]
How many more than a couple do you need? 20 anecdotes? 40 anecdotes? 100? How much bias is okay for you and the world?
replies(1): >>45777450 #
16. Freedom2 ◴[] No.45777362{3}[source]
One of the rallying cries of the right is "facts don't care about your feelings", but it's interesting how the facts either get distorted or ignored.
replies(1): >>45777492 #
17. the_gastropod ◴[] No.45777367{4}[source]
Please. What common conspiracy theories that have actual political impact have had water carried for them by left-wing politicians?

Here's a short list of RW conspiracy theories with real life political consequences:

- Antivax conspiracies

- Barack Obama wasn't born in the United States ("birther" conspiracy)

- Biden / Ukraine conspiracy theory

- The litany of Covid-19 conspiracy theories

- The "deep state" conspiracy theory

- Sarah Palin's "death panels" conspiracy theory

- Sandy Hook was fake

- 2020 Election Fraud

- Trump / Ukraine conspiracy theory

- QAnon

replies(2): >>45777501 #>>45777702 #
18. ◴[] No.45777370{3}[source]
19. LastTrain ◴[] No.45777386[source]
“ Grok has historically actually been one of the biggest debunkers of right-wing misinformation and conspiracy theories on Twitter”

Well, no, it hasn’t. It has debunked some things. It has made some incorrect shit up. But it isn’t historically one of the “biggest debunkers” of anything. Do we only speak hyperbole now?

replies(1): >>45777463 #
20. pstuart ◴[] No.45777407{4}[source]
"You cannot reason a person out of a position he did not reason himself into in the first place."

Fox (and others like it) offer 24/7 propaganda based on fear and anger, repeating lies ad nauseam. It's highly effective -- I've seen the results first-hand.

Making ad hominem attacks against "debunkers" doesn't make your case.

And again, trying to change people's minds by telling them what they believe is wrong is a fool's errand (99.99% of the time). But it still needs to happen, as that misinformation should not go unchallenged.

replies(1): >>45777662 #
21. kvirani ◴[] No.45777408{3}[source]
And the idea of it being controlled by any one entity makes it less interesting and less "good" when compared to Wikipedia.
22. bebb ◴[] No.45777411[source]
Because it's a genuinely good idea, and hopefully one for which the execution will be improved upon over time.

In theory, using LLMs to summarize knowledge could produce a less biased and more comprehensive output than human-written encyclopedias.

Whether Grokipedia will meet that challenge remains to be seen. But even if it doesn't, there's opportunity for other prospective encyclopedia generators to do so.

replies(2): >>45777827 #>>45777850 #
23. pstuart ◴[] No.45777423{5}[source]
I think their go-to is "Russiagate", but that's because they refuse to acknowledge the fact that Mueller did have evidence but assumed that Congress would act upon it.
replies(1): >>45777750 #
24. meowface ◴[] No.45777432{3}[source]
I fully agree. Even assuming no forced ideological bias from Elon, I doubt it would be nearly as good. I still thought it could be an interesting concept, even if I had very low hopes from the start.
25. meowface ◴[] No.45777438{3}[source]
My expectations were extremely low, as were, and are, my expectations of Grok in general. Was just making an actual devil's advocate case.
replies(1): >>45777486 #
26. ◴[] No.45777444{4}[source]
27. netsharc ◴[] No.45777447{5}[source]
I'm not the grandfather commenter, I'm very much a leftist, but left, right, there's too many whose emotions or tribalism override their logic and make them deny what they see/come up with dumbass theories.

For a left example, there are people who theorize that the guy who missed putting a bullet in Trump's brain must've been a false-flag operator. Although it must be mentioned that "leftie" conspiracy theories are mostly just on social media, while "right" ones end up being broadcast by congresspeople and senators, probably because they know their side will take them at face value.

28. LastTrain ◴[] No.45777450{4}[source]
I dunno. Give us at least some? Start with just one really mind blowing one. The implication is there is a big cabal imposing Wikipedia on the world. Given that, it should be easy enough to throw out some concrete examples of major impact.
29. physarum_salad ◴[] No.45777452{3}[source]
"Warts and all" says it all really. What are those warts? Who's responsibility are they?

Wikipedia is really not ideal for the LLM age where multiple perspectives can be rapidly generated. There are many topics where clusters of justified true beliefs and reasonable arguments may ALL be valid surrounding a certain topic. And no I am not talking about "flat earth" pages or other similar nonsense.

replies(1): >>45781083 #
30. meowface ◴[] No.45777463{3}[source]
I am not using hyperbole or speculating. I absolutely mean it.

"Biggest" is tough to quantify, but "most significant" and "most effective" is what I meant. I use Twitter way too many hours a day basically every day and have a morbid fixation on diving deep into right and far-right rabbit holes there. (Like, on thousands of occasions.)

Grok is without a doubt the single most important contributor to convincing believers of right-wing conspiracy theories that maybe the theories aren't as sound as they thought. I have seen this play out hundreds of times. Grok often serves as a kind of referee or tiebreaker in threads between right-wing conspiracy theorists and debunkers, and it typically sides overwhelmingly with the debunkers. (Or at least used to.) And it does it in a way that validates the conspiracy theorist's feelings, so it's less likely to trigger a psychological immune system response.

https://www.reddit.com/r/GROKvsMAGA/ contains some examples. These may seem cherry-picked, but they generally aren't. (Might need to look at some older posts now that Elon has put increasing pressure on the Grok and Grokipedia developers to keep it """anti-woke""".)

When a right-wing conspiracy theorist sees some liberal or leftist call them out for their falsehoods, they respond with insults or otherwise dismiss or ignore it. When daddy Elon's Grok tells them - politely - that what they believe is complete horseshit, they react differently. They often respond to it 3 - 20 times, poking and prodding. Of course, most still come away from it convinced Grok is just compromised by the wokes/Jews/whatever. But some seem to actually eventually accept that, at the least, maybe they got some details wrong. It's a very fascinating sight. I almost never see that reaction when they argue with human interlocutors.

To be clear, it was never perfect. For example, if you word things in just the right way and ask leading questions, then like with any LLM (especially one that needs to respond in under 280 characters) you can often eventually coax it into saying something close to what you want. I have just seen many instances where it cuts through bullshit in a way that a leftist arguing with a Nazi can't really do.

replies(2): >>45778419 #>>45784637 #
31. TheBlight ◴[] No.45777481{3}[source]
So by your logic, there's only room for one gatekept source of general purpose information on the web? No one besides entrenched progressive interests can do anything to Wikipedia. It's like reddit. We're tired of your guys' spaces. So sorry if this offends.
32. smcin ◴[] No.45777482{3}[source]
Crucial to distinguish between knowledge, fact, claim and allegation. Compare:

https://en.wikipedia.org/wiki/Charlie_Kirk#Assassination

https://grokipedia.com/page/Charlie_Kirk : Assassination Details and Investigation

This is an active case that has not gone to trial, and the alleged text messages and Discords have not had their forensics cross-examined. Yet Grokipedia is already citing them as fact, not allegation. (What is considered the correct neutral way to report on alleged facts in active cases?)

33. virissimo ◴[] No.45777483{5}[source]
Some left-coded popular conspiracies:

1. The Iraq war was a plot to steal oil reserves

2. World Economic Forum / IMF intentionally impoverish nations

3. Police across America are systematically hunting and executing Black men (thousands per year), but are protected by racist institutions

4. Trump assassination attempts were false flag operations

5. Big Pharma deliberately hides natural cures for cancer to protect corporate profits

replies(2): >>45777522 #>>45787896 #
34. tptacek ◴[] No.45777484[source]
I'm fine with the logic of discussing it here but can't fathom why Tim Bray thought this would be a useful post given his own objectives.
replies(2): >>45777937 #>>45777976 #
35. lapcat ◴[] No.45777486{4}[source]
> Was just making an actual devil's advocate case.

Why? We're not nominating a saint or electing a Pope.

If someone has a certain opinion, they're free to argue it here. There's no need to invent imaginary opinions and pretend to advocate for them when there are so many actual HN users.

replies(1): >>45777535 #
36. onetimeusename ◴[] No.45777488{3}[source]
The ADL was caught in a campaign making edits. I remember more details in the past but I simply can't find them now with any search engine.

https://forward.com/news/467423/adl-may-have-violated-wikipe...

But also the ADL is accusing others of covert campaigns: https://wassermanschultz.house.gov/news/documentsingle.aspx?...

So I am sure this is a thing among corporations/NGOs. Note that I picked the ADL because I happened to know this and not because I am trying to make a point about the ADL's purpose. Also I am not really answering the part about progressives although the ADL is arguably a progressive NGO. I think there are astroturfing campaigns on Wikipedia whether progressive or not.

replies(1): >>45777659 #
37. netsharc ◴[] No.45777492{4}[source]
"Charlie Kirk..."

"Waaahhh! How fucking dare you!"

Kimmel made fun of Trump talking about his ballroom when being asked about Kirk, and the right got offended and mad. Although it's not about feelings, it's more about exploiting a tragedy to advance their goals (in this case getting a critic like Kimmel off the air).

38. johncolanduoni ◴[] No.45777501{5}[source]
I won’t go to bat for anything near a full equivocation in contemporary politics, but it’s worth remembering antivax was heavily left-coded prior to Covid. I don’t think approximately anyone has actually good epistemology - just biases that fluctuate in how much they affect the real world. Left wing academics and outlets carrying water for people like Pol Pot in the late 20th century because they liked the idea of communism was a particularly bad one.
replies(1): >>45777801 #
39. ◴[] No.45777522{6}[source]
40. meowface ◴[] No.45777535{5}[source]
We're discussing the central sources of knowledge on the internet and by extension pretty much the epistemological backbones of present human civilization. It's worth being open to other perspectives.

I, a left-leaning person who detests Elon Musk and what he's done to Twitter and who generally trusts and likes Wikipedia, feel no shame or regret in assessing Grokipedia, even if I figured it was just going to be the standard tribalistic garbage (which it indeed turned out to be).

replies(1): >>45777582 #
41. meowface ◴[] No.45777581{3}[source]
On one hand, yes, you're completely right.* On the other hand, there is an obligation for something or someone to do the job of pointing out the info is wrong, and how and why. Even if it makes most of them believe it even more strongly afterwards, it's still worse for it to go constantly unchallenged and for believers to never even come across the opposition.

*(The same is true of left-wing conspiracy theories. It's silly to pretend that right-wing conspiracy theorists aren't far more common and don't believe in, on average, far more delusional and obviously false conspiracy theories than left-wingers do, but it's important not to forget they exist. I have dealt with some. They're arguably worse in some ways since they tend to be more intelligent, and so are more able to come up with more plausible rationalizations to contort their minds into pretzels.)

42. lapcat ◴[] No.45777582{6}[source]
> It's worth being open to other perspectives.

There's a big difference between listening to other perspectives and inventing other perspectives.

Why not let the believers of other perspectives argue for those perspectives? Wouldn't they be the best advocates? And if nobody believes the perspective you've invented, then perhaps it wasn't worth discussing after all.

Again, we're not really lacking in volume of commenters here.

replies(1): >>45777637 #
43. meowface ◴[] No.45777613{4}[source]
This is very, very true. The best debunkers avoid being hostile and make the other side feel like they're being heard and that their feelings and fears are being validated. And they do it in a way that feels honest and not condescending and patronizing (like talking to a child). They make frequent (sincere) concessions and hedges and find as much common ground as they can.

Although he's more populist-left and I'm more establishment-liberal (and so I might find him a bit overly conciliatory with certain conspiracy theorists), Andrew Callaghan of Channel 5/All Gas No Brakes demonstrates a good example of this in the first few minutes of this video: https://youtu.be/QU6S3Cbpk-k?t=38

replies(1): >>45778252 #
44. meowface ◴[] No.45777637{7}[source]
Maybe "devil's advocate" was the wrong term for me to use. In this thread I am sharing only my honest beliefs and perspectives and was referring to the genuine initial willingness I had to show charitability to the concept of Grokipedia before its release.
replies(1): >>45777780 #
45. LastTrain ◴[] No.45777659{4}[source]
That's how Wikipedia works. People can edit it. People who are members of organizations can edit it. The edits are transparent, and the history is preserved. It is open to anyone. It is like you're saying the whole world is biased and stacked against your point of view. The example you provide doesn't suggest any kind of centralized control or gatekeeping at all. Just some interested parties vying to contribute to articles that are of interest to them. What if I told you a single person, soon to be a trillionaire, would like to replace it with one he controls himself? Why wouldn't that bother you more? Honestly perplexed.
replies(1): >>45778005 #
46. meowface ◴[] No.45777662{5}[source]
>And again, trying to change people's minds by telling them what they believe is wrong is a fools errand (99.99% of the time). But it still needs to happen as that misinformation should not go unchallenged.

It's a trite point and I ended up repeating it before seeing your post but this really is very true even if it may not seem like it. On one hand the practice is basically futile. But someone absolutely needs to do it. People need to do it. The ecosystem can't only ever contain the false narratives, because that leads to an even worse situation. "Here's why Holocaust denialism is incorrect and why the 271k number is wrong" is essentially pointless, per Sartre, but it's better for neo-Nazis to be exposed to that rather than "one should never even humor Holocaust denialists".

47. LastTrain ◴[] No.45777675{4}[source]
OK
48. meowface ◴[] No.45777702{5}[source]
By left-wing politicians, basically none (while right-wing conspiracy theories are now promoted by tons of right-wing politicians). Among non-politicians: while right-wingers are far more likely to believe in conspiracy theories, and the nature of the conspiracy theories they believe is far less tethered to reality on average, conspiracism is still a serious issue on the left.

It's like 50x less of an issue, but I deal with so many left-wing conspiracies on a daily basis. I think the right is much worse than the left (on this topic and in general), but quite a lot of the left, or at least the populist left/populist far-left, is, to me, its own particular sort of exhausting insufferability. I am proudly a left-liberal and not a centrist, and never will be, but I am still at a point where I can no longer tolerate a big sub-faction of the left. (Though I can't tolerate basically any of the right, minus a bit of the anti-Trump center-right.) I am going to lose my mind when I see vast numbers of leftists demand people not vote for the Democratic party presidential candidate in 2028.

49. meowface ◴[] No.45777750{6}[source]
There is some (though in my opinion not much) merit to how right-wingers portray the "Russiagate" thing. The Russian government absolutely did try to interfere in the 2016 US presidential election to help Trump and hurt Clinton, via hacking and releasing emails and via social media influence campaigns, but there was a chunk of the left that from the start seemed to firmly believe Trump was some kind of literal espionage agent of Putin.

While it's difficult to deny Trump was a de facto asset of Putin in many ways, a surprising number of people were almost entering right-wing conspiracy theory territory with their epistemological practices regarding Trump's personal involvement with Putin.

Right-wing conspiracism is orders of magnitude worse and more frequent than left-wing conspiracism, but some people were way too willing to believe some of the more radical Russian collusion speculation despite no evidence.

50. lapcat ◴[] No.45777780{8}[source]
> In this thread I am sharing only my honest beliefs and perspectives

That's one of the reasons I object to the term. People often use "devil's advocate" to state their opinions while providing plausible deniability in the face of criticism of those opinions. Just be honest, stand behind your stated opinions, and take whatever heat comes from that honesty.

replies(1): >>45791234 #
51. meowface ◴[] No.45777801{6}[source]
Even before COVID things were shifting - the antivax part of the left at that time were mostly only sort of aesthetically on the left. I think this Twitter exchange sums up my feelings about that counterargument: https://i.imgur.com/gNXJ6Wl.jpeg

Also, I think it's important to separate "left of center" and "leftist". Liberals and leftists are very different. "Progressive left-liberals" are fans of democracy and freedom and don't like bigotry and authoritarianism and Trump. "Leftists" are often fans of Lenin and Stalin and Pol Pot and killing groups of people who aren't ideologically aligned and instating one-party dictatorships and violently suppressing dissent. In leftist parlance, "leftist" = "Marxist" while "liberal" = "capitalist belonging to the moderate wing of fascism". In the US, politics is best described as not two but four factions: leftists, liberals, rightists, and neo-Nazis. Often neo-Nazis will form coalitions with the rightists to help achieve major goals; historically leftists would form coalitions with the liberals, but this seems to be occurring less and less.

Although leftists will insist the notion is absurd and anti-intellectual, horseshoe theory contains a lot of truth in it.

52. quantified ◴[] No.45777827[source]
Summarizing all the knowledge is very, very far from summarizing all that is written. If all it takes is including everything published, then the earth must be flat, disease is caused by bad morals, etc.
53. epistasis ◴[] No.45777850[source]
I don't see why an LLM would be better in theory. The Wikipedia process is created to manage bias. LLMs are created to repeat the input data, and will therefore be quite biased towards the training data.

Humans looking through sources, applying knowledge of print articles and real world experiences to sift through the data, that seems far more valuable.

replies(1): >>45778289 #
54. tim333 ◴[] No.45777937{3}[source]
I doubt a post saying it was so boring he was unable to finish reading the page about himself is going to bring in many readers.

That's kind of been my impression too. Not that it's terribly biased or anything but just rather boring to read.

55. keeda ◴[] No.45777976{3}[source]
I don't know if this is why, but: he's in a unique position of having an article on himself on Grokipedia, and thus being able and willing to compare it with the reality as he remembers it.

That's in contrast to other topics, the nuances of which even seasoned experts could disagree about. Any discussion on that could devolve into the nuances of the topic rather than Grokipedia itself. But it's fair to assume the topmost expert on Tim Bray is Tim Bray, so we should be getting a pretty unbiased review.

As such it could be a useful insight into how Grok and Grokipedia and its owners operate.

56. onetimeusename ◴[] No.45778005{5}[source]
No. I don't think I am mischaracterizing it, and I did not say the whole world is biased against me. I am not the person you replied to, in case you're confusing me with them. I gave an example of an astroturfing campaign, and yes, the ADL did not disclose what they were doing until they got caught. I don't think that should be casually dismissed as merely interested parties. I think it is a genuine problem with Wikipedia. I think it violates the spirit of it, and I think a paid campaign could subtly influence or overwhelm pages. Even though it's perfectly within the rules, it should be disclosed that the edits were done as part of a paid campaign and not a volunteer effort. I did not claim Wikipedia was centralized either. As far as gatekeeping, I don't know. I am neither claiming it exists nor denying it.

> What if I told you a single person, soon to be a trillionaire, would like to replace it with one he controls himself. Why wouldn't that bother you more?

I didn't say anything about Grokipedia. I don't have an opinion on it presently. Couldn't the same argument be applied that he's just an interested party? Grok could be used to edit Wikipedia, for that matter, in a covert campaign. I think both preventing LLMs and relying on them are problematic, but it's probably inevitable, and I may already be late to the party: I don't know what percent of edits on Wikipedia are done by LLMs, but let's say it's not 0%.

replies(1): >>45779275 #
57. onetimeusename ◴[] No.45778125{3}[source]
What percent of edits on Wikipedia do you think are done by LLMs presently? It looks like there is a guide for detecting them https://en.wikipedia.org/wiki/Wikipedia:Signs_of_AI_writing . The way Wikipedia functions, LLMs can make edits. They can be detected, but unless you are saying they are useless I don't know what point you are making about an LLM contribution versus a human. That LLMs aren't good enough to make meaningful contributions yet?? That Grok is specifically the problem?
58. pstuart ◴[] No.45778252{5}[source]
I'm a fan of Andrew and am impressed by how he's evolved from documenting stupid kids to actually reporting on issues of interest.

I agree that one catches more flies with honey than with vinegar, but many times it doesn't matter what you say or how you say it -- they're gonna stick to their guns. A prime example of this is in Jordan Klepper interviews where he asks Trump supporters how they feel about something horrible that Biden did, to which they express their indignation; then he reveals that it was actually Trump, and they dismiss it because it "doesn't matter".

59. smitty1e ◴[] No.45778289{3}[source]
> The Wikipedia process is created to manage bias. LLMs are created to repeat the input data, and will therefore be quite biased towards the training data.

The perception of bias in Wikipedia remains, and if LLMs can detect and correct for bias, then Grokipedia seems at least a theoretical win.

I'm happy with at least a set of links for further research on a topic of interest.

replies(4): >>45778503 #>>45778528 #>>45781138 #>>45782587 #
60. AgentME ◴[] No.45778419{4}[source]
> Grok is without a doubt the single most important contributor to convincing believers of right-wing conspiracy theories that maybe the theories aren't as sound as they thought. I have seen this play out hundreds of times. Grok often serves as a kind of referee or tiebreaker in threads between right-wing conspiracy theorists and debunkers, and it typically sides overwhelmingly with the debunkers. (Or at least used to.) And it does it in a way that validates the conspiracy theorist's feelings, so it's less likely to trigger a psychological immune system response.

I've seen this too and agree. It's surprising how well it accomplishes that referee role today, though I wonder how much of that is just because many right-wingers truly expect Grok to be similarly right-wing to them as Elon appears to intend it to be. It's going to be sad when Elon eventually gets more successful at beating it into better following his ideology.

61. apical_dendrite ◴[] No.45778503{4}[source]
Is there some objective standard for what is biased? For many people (including Elon Musk) biased just means something that they disagree with.

When grok says something factual that Elon doesn't like, he puts his thumb on the scale and changes how grok responds (see the whole South African white 'genocide' business). So why should we trust that an LLM will objectively detect bias, when the people in charge of training that LLM prefer that it regurgitate their preferred story, rather than what is objectively true?

replies(1): >>45778526 #
62. dragonwriter ◴[] No.45778526{5}[source]
> Is there some objective standard for what is biased?

Generally, no.

With a limited domain of verifiable facts, you could perhaps measure a degree of deviation from fact on different questions, though how you get a distance measure that meaningfully aggregates across multiple questions rather than just one is slippery without getting into subjective areas. Constructing a measure of directionality would be even harder to do objectively, too.

63. ◴[] No.45778528{4}[source]
64. LastTrain ◴[] No.45779275{6}[source]
“Couldn't the same argument be applied that he's just an interested party?”

No, that isn’t even remotely comparable. One person having total control over the content and tone of every single article is not the same thing as millions of independent contributors. Especially if your complaint is /bias/, which is the subject of this thread.

65. onetimeusename ◴[] No.45779423{5}[source]
White supremacists were responsible for the 2020 riots.
replies(1): >>45779649 #
66. pstuart ◴[] No.45779649{6}[source]
You mean Umbrella Man? https://abcnews.go.com/US/man-helped-ignite-george-floyd-rio...

Yes, agents provocateur are a persistent threat for delegitimizing protests.

An in-depth look at the problem: https://acleddata.com/report/demonstrations-and-political-vi...

67. thrance ◴[] No.45781083{4}[source]
Do list these "alternative perspectives" that Wikipedia is allegedly unfairly silencing.
replies(1): >>45781675 #
68. thrance ◴[] No.45781138{4}[source]
I fail to imagine how putting Wikipedia in the hands of an ideologically captured mega-billionaire will help the fight against bias. The owner of Grokipedia has shown time and time again that he has no regard for truth, and likes to advertise the many false things he believes in.

The technology behind it doesn't matter. Show me the incentives and I'll tell you the results: Wikipedia is decentralized, Grokipedia has a single owner.

replies(1): >>45783658 #
69. physarum_salad ◴[] No.45781675{5}[source]
The fact you think there are none is really hilarious!
replies(1): >>45782788 #
70. epistasis ◴[] No.45782587{4}[source]
> The perception of bias in Wikipedia remains,

If there's a perception of bias, where is it coming from? It's clearly a perception born from the extreme political bias of the people perceiving it. Addressing that sort of perception by changing the content means increasing bias.

Therefore the only logical route forward is to hash out instances of perceived bias and address them, exposing the perception itself as the bias.

71. thrance ◴[] No.45782788{6}[source]
The fact you're still evading isn't very funny though. Please share just one, that we may discuss it.
replies(1): >>45790821 #
72. smitty1e ◴[] No.45783658{5}[source]
To use your terminology, the perception that Wikipedia is "ideologically captured" stands.
replies(1): >>45783800 #
73. thrance ◴[] No.45783800{6}[source]
How so? Because the community collectively refuses to host antivax or climate denialism propaganda? You can find these subjects on there btw, just with a mention correctly labelling them as falsehoods.

I've yet to see conservatives bring up a single subject that Wikipedia allegedly silences out of ideology that is not an obviously false conspiracy theory. In this, Wikipedia may appear to have a left-wing bias, but only because the modern right has gotten so divorced from reality that not relaying their propaganda feels like bias against them.

replies(1): >>45792074 #
74. mensetmanusman ◴[] No.45784637{4}[source]
This is true; I'm surprised how well Grok and community votes have worked (much better than silencing and shadow banning).
75. mensetmanusman ◴[] No.45784645{3}[source]
“ The problem of debunking right-wing misinformation is that it doesn't seem to matter.”

The problem with nihilism is that it’s wrong.

76. habinero ◴[] No.45787896{6}[source]
I've heard right wing people claim 2 and 5, and I wouldn't call 1 or 4 "popular" by any stretch of the imagination.

3 is just a weirdly-phrased version of a real problem.

77. physarum_salad ◴[] No.45790821{7}[source]
Omission, by design or by accident/lack of resources, can be found on most pages of Wikipedia.
replies(1): >>45793112 #
78. cowboylowrez ◴[] No.45791234{9}[source]
Sure sure, but what happens if someone isn't 100 percent behind their opinions? Initial assessments for instance could very well attempt to see problems or anticipate arguments for or against particular viewpoints.
replies(1): >>45791428 #
79. lapcat ◴[] No.45791428{10}[source]
That's fine, but you should attribute your certainty or uncertainty, as the case may be, to yourself and not to "the devil."

It's vastly more honest to say, "I'm not sure about this" than "Devil's advocate: blah blah blah". Besides hiding behind the devil, the devil's advocate makes the devil look more confident than he should be.

replies(1): >>45791628 #
80. cowboylowrez ◴[] No.45791628{11}[source]
yeah but you're sort of attributing dishonesty to someone's post when I don't think it merits it.

>There's a big difference between listening to other perspectives and inventing other perspectives.

while there's a big difference, the difference doesn't invalidate thinking through issues and searching for the actual conflicting views. "Devil's advocate" is a common enough term, what's the big deal? Is it the word "devil"? Do you think someone is calling you Satan?

replies(1): >>45791985 #
81. lapcat ◴[] No.45791985{12}[source]
> yeah but you're sort of attributing dishonesty to someones post when I don't think it merits it.

There's a potential for dishonesty, but lack of honesty can also mean just opacity or reticence. Either way, openness and honesty are superior.

I do think that sometimes people say "devil's advocate" when it's their own opinion but an "unpopular" opinion that they may be embarrassed to admit, so they hide behind the devil, pretending they're not the devil themselves.

> "Devil's advocate" is a common enough term, whats the big deal? Is it the word "devil"? Do you think someone is calling you Satan?

No. The issue is not the term. A different term would not help. But the term is instructive about its own usage. In the Catholic Church, nobody wanted to argue against a potential saint, so someone had to be specifically appointed by the Church to argue the other side, a position the arguer didn't necessarily believe. The problem with devil's advocates online is that they're self-appointed for some reason, despite the fact that usually there are already people who sincerely believe that opinion and would argue for it, without the need for a devil's advocate. The Catholic Church canonization process is completely different from online arguments, and there's no need for the special role of the devil's advocate.

replies(1): >>45792232 #
82. smitty1e ◴[] No.45792074{7}[source]
> "climate denialism propaganda"

Q.E.D.

replies(1): >>45793111 #
83. cowboylowrez ◴[] No.45792232{13}[source]
I actually like the role of devils advocate and can appreciate it. This fondness is not decreased by your assertion that there is no need. I do like your history on the terms origin, but again I don't think it follows that there is "no need" for the role, but maybe the role can exist without the appropriation of the historical term.
84. thrance ◴[] No.45793111{8}[source]
Oh, you don't believe in climate change. Well, there we go. This explains that. Conservative propaganda has made you unable to distinguish truth from obvious lies, hence why you think Wikipedia is so biased. Have you considered your own biases?
replies(1): >>45793869 #
85. thrance ◴[] No.45793112{8}[source]
One example, just one. Please.