I replied to the original tweet too ("what would you do if you were Jack Dorsey?"). I said I'd shut the whole thing down.
Unfortunately, these extremely contradictory subjective images of HN seem to be a consequence of its non-siloed structure: https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que.... This creates a paradox: precisely because the site is less divisive, it feels more divisive—in the sense that it feels to people like it is dominated by their enemies, whoever their enemies may be. That's extremely bad for community, and I don't know what to do about it, other than post a version of this comment every time it comes up.
Thanks for caring about level-headedness, in any case.
I mean, I agree with you that we all have biases and blind spots in our perception. Which means... so do the mods. I comment because I want HN to continue to be a site that people like me want to comment on. The site that "people whose comments dang likes" want to comment on surely looks different.
But I think your explanation of why this is so is much too simplistic. The difference seems to be that you aren't being bombarded every day with utterly contradictory extremely strong feelings about how awful it is. If you were, you wouldn't be able to write what you just posted. Your judgment that the perception "isn't symmetric" is wildly out of line with what I encounter here, so one of us must be dealing with an extremely skewed sample. Perhaps you read more HN posts and talk to a wider variety of people about HN than I do. From my perspective, the links below are typical—and there are countless more where these came from. Of course, there are also countless links claiming exactly the opposite, but since you already believe that, they aren't the medicine in this case. I sample that list when responding to commenters who see things this way:
https://news.ycombinator.com/item?id=23729568
https://news.ycombinator.com/item?id=17197581
https://news.ycombinator.com/item?id=23429442
https://news.ycombinator.com/item?id=20438487
https://news.ycombinator.com/item?id=15032682
https://news.ycombinator.com/item?id=19471335
https://news.ycombinator.com/item?id=15937781
https://news.ycombinator.com/item?id=21627676
https://news.ycombinator.com/item?id=15388778
https://news.ycombinator.com/item?id=20956287
https://news.ycombinator.com/item?id=15585780
https://news.ycombinator.com/user?id=BetterThanSlave
https://news.ycombinator.com/user?id=slamdance
https://news.ycombinator.com/item?id=15307915
A sample email, for a change of pace: "It's clear ycombinator is clearly culling right-wing opinions and thoughts. The only opinions allowed to remain on the site are left wing [...] What a fucking joke your site has become."
https://news.ycombinator.com/item?id=20202305
https://news.ycombinator.com/item?id=18664482
https://news.ycombinator.com/item?id=16397133
https://news.ycombinator.com/item?id=15546533
https://news.ycombinator.com/item?id=15752730
https://news.ycombinator.com/item?id=20645202
Just the other day I noticed HN take a heavy hand in removing an article that hit the homepage, about a virologist publishing a paper that suggested the only logical explanation for COVID is that it’s manufactured.
There are absolutely topics and perspectives that are not welcome on HN. As the lead moderator, it would be unwise, in my opinion, for you to think otherwise (given that your biases would be the most threatening to an open forum), and you would naturally have a tough time identifying the absence of a perspective you don’t share.
As an example, I would challenge you to pick five articles that discuss unions that hit the homepage on HN and see what % of threads (and how much of the upvotes they accounted for) were inherently anti-union. I would also be sure to give only partial credit for threads that added boilerplate sentences along the lines of “while I believe in the value of XYZ,” because that’s basically a requirement for taking any contrarian (to liberal / Silicon Valley ideology) or conservative view on this site. I can give you a laundry list of topics that will show this trend.
From my (biased) perspective (and from someone outside of the valley reading this site religiously for 13 years) HN is increasingly hostile to certain perspectives (and I’m not talking about social issues here). I don’t care much about it - I just opt out - which is the point.
Why not run a poll about it?
It’s the unwillingness to even allow the topic on HN which intrigued me. Believing what I believe (i.e. my bias), I was surprised it got as far as it did - because I know that’s not the HN community party line - and then noticed it disappeared within seconds of me seeing it on the home page. I was the first to comment on it, so it couldn’t have been because the thread descended into a flame war, and I was unaware of which other guidelines it could have broken, given that it was a legit site referencing a legit paper in a legit journal (but taking a perspective the broader HN community won’t tolerate).
https://news.ycombinator.com/item?id=23725966 (flagged, 18 points, 8 days ago, 4 comments)
https://news.ycombinator.com/item?id=23727763 (10 points, 8 days ago, 3 comments)
[Personal opinion: The evidence is not enough to prove that it was created/improved/selected in a lab. It has a few "lucky" features, but normal coronaviruses don't cause pandemics, so we already know it is a "lucky" case.]
If you're talking about the covid submissions that gus_massa came up with, they were flagged by users. In one case we lessened the penalty and the other looks like one we didn't see. I do think that it's unlikely that HN can have a curious conversation about that theme, much as we might both prefer otherwise. I don't think you can validly draw significant general conclusions from that.
A poll wouldn't convince anybody. It would just reconstitute the same disagreement at a meta level. I wrote about something similar here: https://news.ycombinator.com/item?id=23239793. (Edit: and https://news.ycombinator.com/item?id=23808089 in this thread, as it turns out.)
I'm talking about the skew of the people in the actual threads, and what opinions become acceptable consensus. And that's not the same population. Frankly, most of us never see garbage like that, because you flag it.
But what remains isn't "balanced" just because you find jerks on both sides. It's just as likely that one side is working the refs better.
Now again, I don't expect you to agree. But I see what I see. And I post to make sure that others don't see the same thing.
Here's what I hear: "A winger conspiracy theory goes to the top of the HN front page before being taken down! That means that the voting population of HN is horrifically skewed."
See the difference in perspective? I'm happy you take conspiracy stuff down, I really am. But I'm not happy about the population that pushes it finding a home here, which they clearly have.
If you see a comment complaining about "(Apple|Google|Microsoft) fanboys", that's much the same thing. The only actual information in such a comment is about the commenter—specifically, what they dislike (in this case, (Apple|Google|Microsoft)) and therefore what they notice and assign greater weight to.
Such commenters routinely produce entirely opposite outputs about the exact same input set. Indeed their complaints are interchangeable except for the direction of bias they're complaining about. This phenomenon is so reliable that I'm not sure I've seen any more reliable phenomenon on HN. There is clearly a deep cognitive bias underlying it. I've done my best to try to explain what that is. I'd be interested in hearing other explanations, but so far most responses seem to deny the phenomenon, which from my perspective can't possibly be correct.
Where I think being a moderator makes a big difference is that we get bombarded with these contradictory complaints every day, often in personally abusive ways. You can't help but notice the contradictions when you're getting bashed for one reason one minute and then bashed for exactly the opposite reason the next. When one side calls you Hitler and the other side calls you Stalin, and each side complains bitterly how you ban everyone they agree with and moderate in the other's favor, the only sane response is to become curious about how the exact same thing can result in such an extreme variance in perception.
I think there is contention right now because moderator decisions are opaque, so people come up with their own narratives. Without actual data there is no way to tell what type of bias exists and why, so it's easy to make up a personal narrative that isn't backed by any actual data.
User flagging is also currently opaque, and a similar argument applies. If I had to provide a reason for flagging something, and knew that my name would be publicly associated with the items I've flagged, then I would be much more careful. Right now, flagging anything is consequence-free because it is opaque.
You're drawing extremely skewed conclusions about the "voting population" of HN. One of those posts made it to #16 before being flagged, the other did not make the front page at all. It takes only a handful of votes to make the front page (much of the time, anyhow—it's complicated), and #16 is not high—any sensational story can easily get that far before being flagged down (and by the way, it was users who flagged it down, not us).
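For a rough sense of the mechanics, here is a minimal sketch of the old, publicly posted gravity-style approximation of the ranking (illustrative only; the real ranking has many more factors and penalties that I'm leaving out):

    # Rough sketch of the classic gravity-style ranking approximation.
    # The production ranking includes many additional penalties (flags,
    # flamewar detection, moderation) that are deliberately omitted here.

    def rank_score(points: int, age_hours: float, gravity: float = 1.8) -> float:
        """Score decays with age, so a new story with few votes can outrank
        an older story with many."""
        return (points - 1) / ((age_hours + 2) ** gravity)

    # A 12-minute-old story with 15 points outranks a 6-hour-old story with 100:
    print(rank_score(points=15, age_hours=0.2))   # ~3.39
    print(rank_score(points=100, age_hours=6.0))  # ~2.34

With numbers like that, a dozen early upvotes on anything sensational will briefly beat much bigger stories that have been up for a few hours.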
HN is a large enough population sample that you'll find that scale of upvoters upvoting anything, some of the time. You can't conclude anything significant about the "voting population" of HN from that, and the fact that you're doing so strikes me as an indication of what I'm arguing—that your generalizations about HN are determined by your own ideological priors, just as people with opposite ideological commitments arrive at the opposite generalizations, and by exactly the same mechanism.
I think you'd arrive at the same conclusion that I have, if you were forced as I have been to look at all sides of this under unrelenting personal pressure. I also think that I'd be arriving at your conclusion if I hadn't been forced to have this experience. That's a pessimistic conclusion—it means that rational discussion of this is probably not possible. (I mean that structurally—not a personal swipe but exactly the opposite, and I hope that's clear.)
To make an unflattering comparison, Gab and the like are also non-siloed. But that doesn't stop a site from having an innate level of bias to the userbase. The nature of any forum on the internet is that it will cultivate a certain type of userbase over time. Obviously I'm not the greatest one to speak here because I definitely trend more towards the asshole side of the spectrum but I think it's something to be aware of.
I also don't think it's possible to have any forum without bias, so I'm certain the data will indicate bias, but at least it will be transparent and obvious, so people can point to actual data to make their case one way or the other. It's hard to improve a situation if there is no data to point to and argue about. Without data, people just tell stories about whatever makes the most sense from whatever sparse data they have managed to reverse engineer from personal observations.
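To make the proposal concrete, here is a minimal sketch (with entirely hypothetical field names and behavior; nothing like this exists on HN today) of what a single entry in a public flag log might look like:

    # Hypothetical sketch only: HN exposes nothing like this today.
    # The point is to show what "non-opaque" flagging could mean: a named
    # flagger, a required reason, and an append-only public record.
    from dataclasses import dataclass, asdict
    from datetime import datetime, timezone
    import json

    @dataclass
    class FlagRecord:
        item_id: int      # story or comment that was flagged
        flagged_by: str   # username, published so flagging isn't consequence-free
        reason: str       # required public justification
        flagged_at: str   # ISO-8601 timestamp

    def flag_item(item_id: int, user: str, reason: str) -> FlagRecord:
        if not reason.strip():
            raise ValueError("a public reason is required to flag")
        return FlagRecord(item_id, user, reason,
                          datetime.now(timezone.utc).isoformat())

    # The record that would land in the public log:
    print(json.dumps(asdict(flag_item(23738545, "some_user", "dupe / flamebait"))))

Whether naming flaggers would actually reduce abuse or just chill legitimate flagging is a real trade-off, but at least there would be data to argue about.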
Making this mistake would lead to more argument, not less—the opposite of what was intended. It would simply reproduce the same old arguments at a meta level, giving the fire a whole new dimension of fuel to burn. Worse, it would skew more of HN into flamewars and meta fixation on the site itself, which are the two biggest counterfactors to its intended use.
Such lists would be most attractive to the litigious and bureaucratic sort of user, the kind that produces two or more new objections to every answer you give [1]. That's a kind of DoS attack on moderation resources. Since there are always more of them than of us, it's a thundering herd problem too.
This would turn moderation into even more of a double bind [2] and might even make it impossible, since we function on the edge of the impossible already. Worst of all, it would starve HN of time and energy for making the site better—something that unfortunately is happening already. This is a well-known hard problem with systems like this: a minority of the community consumes a majority of the resources. Really we should be spending that time and energy making the site better for its intended use by the majority of its users.
So forgive me, but I think publishing a full moderation log would be a mistake. I'll probably be having nightmares about it tonight.
[1] https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...
It wasn't. The article explained how they couldn't find a publisher for it. That's why I flagged it, because that's a red flag.
It was also based on the statistical fallacy that Feynman once summed up with "There is a car with license plate GH02B [a sequence with no meaning] outside. What are the odds?!"
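To spell out the fallacy with toy numbers (using a hypothetical plate format purely for illustration): the probability of the specific sequence you happened to see is tiny, but the probability of seeing some sequence is 1.

    # Toy illustration of the post-hoc probability fallacy: any *specific*
    # outcome looks astronomically unlikely after the fact, even though it was
    # certain that *some* outcome would occur. The plate format is hypothetical.
    import random
    import string

    def random_plate() -> str:
        # two letters, two digits, one letter -- e.g. "GH02B"
        return ("".join(random.choices(string.ascii_uppercase, k=2))
                + "".join(random.choices(string.digits, k=2))
                + random.choice(string.ascii_uppercase))

    n_possible = 26 ** 2 * 10 ** 2 * 26   # 1,757,600 equally likely plates
    observed = random_plate()             # whatever happened to be in the parking lot

    p_this_exact_plate = 1 / n_possible   # ~5.7e-7: "what are the odds?!"
    p_some_plate = 1.0                    # certain: the car had to have some plate
    print(observed, p_this_exact_plate, p_some_plate)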
I wonder if you've got a plan for the knowledge and information you're accumulating related to the function and moderation of HN, and sites like it. Have you written about your experience moderating HN?
As another commentator has said, being a moderator means you only see a certain side of the equation. Users don't see the amount of abuse or nonsense that gets thrown at moderators because a lot of that is invisible or removed, that's just the unfortunate nature of running a forum. Even worse when it devolves into threats or actions against moderators.
But it also blinds you to the smaller shifts in the userbase because the larger conflicting voices are the main thing you hear. It becomes harder to notice when women feel less comfortable posting here because other posters chase them off. Or when minorities have trouble sharing their experiences because any mention of their race triggers a flamewar.
That ends up cultivating a certain level of bias on the forum, where only individuals who either silently agree or add fuel to the fire rotate in and other users rotate out. I mean, I've fought with you before because you had to remove the word 'Black' from a story, since it caused some users to lash out at the fact that Black people were sharing their story.
You did so in order to stop a flamewar, but why did you need to do so in the first place? If a small subset of users can poison a discussion as a result, then you have a problem with the overall bias of the forum.
In this respect HN is about ten years behind the trend of the rest of the web, culturally; even r/libertarian comments in 2020 are probably less reliably libertarian than comments on not-explicitly-political forums anywhere on the web in 2000.
Any complaint without data to back it up would be thrown in the trash pile.
In any case, it's a worthwhile experiment to try because it can't make your life worse. I can't really imagine anything worse than being compared to Hitler and Stalin, especially if all that person is doing is venting their anger. I'd want to avoid being the target of that anger, and I would require mathematical analysis from anyone who claimed to be justifiably angry, to show the actual justification for their anger. Without data you will continue to get hate mail that's nothing more than people making up a story to justify their own anger. And you have already noticed the personal-narrative angle, so I'm not telling you anything new here. The data takes away the "personal" part of the narrative, which I think is an improvement.
I think this has significant effects, which I wrote about here: https://news.ycombinator.com/item?id=23308098
Re 23743914: I've banned that account. We didn't see that post at the time. I don't think it's typical for such a comment to get upvoted. From what I see, the vast majority get downvoted and/or flagged and/or moderated.
Well, presuming we're talking about the minervanett.no article, there's also this submission that made it to #2 with 24 votes and 18 comments in the 10 minutes before it was flagged to death:
[flagged] The most logical explanation is that it comes from a laboratory (minervanett.no)
https://news.ycombinator.com/item?id=23738545
http://hnrankings.info/23738545/
Then there was also a version a week earlier that got to #3 in 5 minutes before it was killed:
[flagged] [dead] Norwegian virologists suggest Coronavirus originated in a laboratory (minervanett.no)
https://news.ycombinator.com/item?id=23738264
http://hnrankings.info/23738264/
> You'll find that scale of upvoters upvoting anything, some of the time.
I found that to be a surprisingly high number of votes in a short period of time, likely indicating that there is a substantial population on HN who would have liked to discuss that story but were prevented from doing so by a (presumably) much smaller number of flags. Is it actually common for stories without a broad degree of interest to get that many votes in their first few minutes?
> it was users who flagged it down, not us
Technically, but I'm hoping that you try to review the flagged stories and recover the ones where you disagree with the flagging? Otherwise this would seem to allow a "tyranny of the minority" where a small number of people who want to prevent a discussion from taking place are able to enforce their beliefs on the rest of the larger group. I'm a lot more comfortable with you using your expert judgement than with trusting that flags will always be used appropriately without review.
I haven't written much about my experience moderating HN. Occasionally little blobs squeeze out under pressure. Also, I'm not sure I could describe it very well. It might take more of a novelist's skill to explain the experience. I'd probably just use a lot of words like "surreal" that don't say anything, or come up with metaphors that are good for venting but again, don't really explain much.
What I do want to do is distill the moderation explanations I've posted over the years into a sort of expanded FAQ or commentary. If anyone has noticed how often I post HN Search links to past explanations (which I hope is not too annoying), that's because the explanations have converged over the years on, I'd guess, at least a couple dozen different significant issues. Things like how we moderate politics on HN, how it's not ok to insinuate astroturfing, how we handle reposts, and so on.
It's not true that what I'm arguing for or perceiving is based only on extreme comments. I tried to explain this in the very comment you replied to. It's based on massive numbers of comments, some extreme and most not. I probably read more of this forum than anyone, for the simple reason that it's my job. I've also spent thousands of hours working on evaluating it as objectively as I possibly can. That does not mean my perceptions are correct or that I'm immune from bias; au contraire. But it's not nothing either.
Based on feedback I've gotten and posts I see, I don't believe that women feel less comfortable posting here than they used to. I believe there has been a slow trend in a better direction, though not everyone agrees. Race is a harder issue to assess because that issue has flared up so massively in society at large lately that the macro trends simply dominate whatever is specific to HN. We can't expect this site to be immune from that.
Here's one to add to your list from today (now flagged dead) that I thought was particularly surreal in its logic: https://news.ycombinator.com/item?id=23808107. It's short, so I'll just quote it in full:
Hacker News pays close attention to the content on the main page. They purge anything that doesn't trend left leaning or counters the standard left's corporate interests.
For example, I posted a link to Michael Moore's film in which he eviscerates bio-fuels. This post was mysteriously removed. It was also removed the second time I posted it. This was despite the link the "A year wearing shorts to work" as another HN article link at the time continued to exist.
A link to a Michael Moore (the quintessential 90's leftist documentarian) film "eviscerating bio-fuels" is "mysteriously removed", but a fluff piece about shorts remains: Collect underpants, ?, profit!
There are 2 mods running HN. Responding to people is TAXING - as in it's hugely costly. And it has some terrible edge cases which destroy the process:
The costly occasions are when you meet people who are:
a) Angry
b) Rule lawyers
c) Malignantly motivated
At this point their goal is to get attention or to apply coercive force to the moderation process.
These guys are an existential threat to the conversational process and one of the win conditions is to get people to turn against the moderators.
Social media is a topic that HN gets wrong so regularly, and so often without recourse to research or analysis, that I would avoid discussing moderation in general here.
The fact is that if people are arguing in good faith, we can have some amount of peace, and even deal with inadvertent faux pas and ignorance, provided you never reach an Eternal September scenario.
But bad faith actors make even this scenario impossible.
I've found that spending some time just keeping a timeline of notable events on a forum is crucial for any forum that lasts a long time and has political discussions on it.
This should probably be a feature for all subs/forums.
Sure HN isn't as bad as some places on the web, but tech still has its subcultures, blind spots and tribes - not to mention politics.
The biggest strain is mod burnout in many places, or even mods getting influenced by the content they police.
Do you guys have any plans for issues like that?
If we are serious about avoiding echochamber effects, then we shouldn't take voting so seriously. What is wrong with having a polite disagreement? Why should we value popularity?
Have we reached a point where we cannot discuss certain topics as adults? If so, can those individuals not simply choose to opt-out of the discussions?
I didn't read the discussion in question, but I don't understand what stopping a discussion solves. From my perspective discussions should be stopped when they are needlessly toxic, when participants can no longer advance their ideas politely. Humans have limitations, sometimes emotions become too hot. I appreciate when dang closes these types of discussions.
An illegitimate reason to censor would be to remove ideas which cannot be countered, but which the reader disagrees with. Some people find disagreements and discussions disturbing. Others enjoy the opportunity to challenge their ideas, as a matter of 'intellectual curiosity'. Some simply relish the tactics of formulating arguments, regardless of the underlying position.
There's a deeper issue though. Such an analysis would depend on labeling the data accurately in the first place, and opposing sides would never agree on how to label it. Indeed, they would adjust the labels until the analysis produced what they already 'know' to be the right answer—not because of conscious fraud but simply because the situation seems so obvious to them to begin with. As I said above, the only people motivated enough to work on this would be ones who would never accept any result that didn't reproduce what they already know, or feel they know.
Two simpler factors are (1) I'm paid to do it and (2) I have creative freedom, which is important to me. Just remembering those two things reminds me that I'm choosing to do this. That sounds so trivial but psychologically it's a big thing.
I think that might be very interesting, and potentially pretty valuable to people working in a similar position.
> If anyone has noticed how often I post HN Search links to past explanations (which I hope is not too annoying)
Those are not annoying. (although I do not doubt you could introduce me to someone who says they are...)
If I were to start a site for online discussion, I would probably not even try to foster a sense of community in the participants.
Although it is quite unhealthy for most people not to belong to a community, the human need for belonging is not so pressing that the average person cannot afford to participate a few hours a week on a site that has no hope of ever providing belongingness.
Plus, it's HN, so the mission is matched by a positive history within the community.
Part of the reason I ask is because the handling of the mental costs of such a job is not something covered in the content/research on moderation. We know that employees at content-moderation firms get PTSD, for example, but that's also from staring at the highest levels of radioactive content. Those people need therapy.
For something much milder (hopefully), what do mods do to make peace with things and not lose their minds?
For people who have NEVER thought about social networks and conversations online, I find this site discusses some of the blander but more game-theoretic elements of networks/trust, and therefore of online conversations:
-----------------
For you guys (HN Mods) I'd bet that you in particular are abreast of stuff.
- I'd ask if you have heard of/seen Civil Servant, by Nathan Matias - it's a system to do experiments on forums and test the results (see if there is a measurable change in user behavior)
https://natematias.com/ - Civil Servant; professor at Cornell. He probably has an account here
https://civilservant.io/moderation_experiment_r_science_rule...
- Books: Custodians of the Internet.
------
Going through some of the papers I have stocked away, sadly in no sane order. I can't say if they are classic papers, you may have better.
- Policy/law Paper: Georgetown law, Regulating Online Content Moderation. https://www.law.georgetown.edu/georgetown-law-journal/wp-con...
- NBER paper on polarization - https://www.nber.org/papers/w23258, I disagreed/was surprised by the conclusion. America centric.
- Homophily and minority-group size explain perception biases in social networks, https://www.nature.com/articles/s41562-019-0677-4
- The spreading of misinformation online: https://www.pnas.org/content/113/3/554.full
- The University of Alabama has a Reddit research group - https://arrg.ua.edu/research.html - they have two papers, one of which explores the effect of a sudden influx of new users on r/2xchromosomes. https://firstmonday.org/ojs/index.php/fm/article/view/10143/...
- Policy: Ofcom (UK) has a policy paper on using AI for moderation https://www.ofcom.org.uk/__data/assets/pdf_file/0028/157249/...
- Algorithmic content moderation: Technical and political challenges in the automation of platform governance - https://journals.sagepub.com/doi/10.1177/2053951719897945
- The Web Centipede: Understanding How Web Communities Influence Each Other Through the Lens of Mainstream and Alternative News Sources
- Community Interaction and Conflict on the Web,
- You Can’t Stay Here: The Efficacy of Reddit’s 2015 Ban Examined Through Hate Speech
Papers I have to read myself,
- Does Transparency in Moderation Really Matter?: User Behavior After Content Removal Explanations on Reddit. https://shagunjhaver.com/files/research/jhaver-2019-transpar...
- Censored, suspended, shadowbanned: User interpretations of content moderation on social media platforms: https://journals.sagepub.com/doi/abs/10.1177/146144481877305... (I need to read that paper, but I expect it to be a good foundation of knowledge and examples)
Other stuff:
- The Turing Institute talked about moderators being key workers during COVID - https://www.turing.ac.uk/blog/why-content-moderators-should-...
The FullFact check is rather good: https://fullfact.org/health/richard-dearlove-coronavirus-cla...
Maybe it's just impossible to discuss a deeply politicised topic like this usefully here though.
On top of all that the forum is skittish about embracing controversial political topics in general, because many people would prefer to just talk about tech.
Why wouldn't HN users flag some Michael Moore content to oblivion?
What is wrong with this? I'm a far cry from a Silicon Valley liberal, and nothing's wrong with it that I can see. There is no One Forum To Rule Them All, and there shouldn't be. Let a thousand forums bloom their own way.
The one thing I'd like to see is a franchise model based on HN - this place pays just enough attention to civility and topicality to promote good discussion without feeling Orwellian. If only that magic could somehow be replicated.
I hear this repeatedly, and I'm not sure this placement of most American politicians on the same dimension as European politicians (if there's even a single dimension there) works very well. I would certainly agree on fiscal issues, for example, but on social issues, at least compared to the western European country in which I live, I suspect Sanders or other far-left US politicians would be on the far-left here as well.
The typical ones are horrific, with many having been started explicitly for political battle.
The common tactic for something like "virus from a lab" would be to move the goalposts and hope the reader doesn't notice. Breeding coronaviruses in a lab actually happened, with scientific papers published about how genetically engineered cells with both human and bat traits were used to help the bat coronaviruses adapt to growing well in human cells. A typical fact checker tactic would be to purposely confuse that fact with the claim that the virus was created from scratch, modeled in a computer and assembled by a machine. Supposed experts say that this is impossible, and so the fact checker can claim that the fact was proven false... but it was a straw man.
The URL you gave is not quite so directly misdirecting, but still vague and IMHO just wrong. I've looked over the scientific papers, and I think the evidence is clear.
There is a silent upvote majority here who will un-down almost anything that isn't clearly toxic or counterfactual.
So I certainly do appreciate the level-headedness here.
I find that a good rule/compromise is to almost never talk about HN per se, i.e. to avoid almost all meta discussions (I think that rule would have avoided many of the comments present in your list of links), but I am aware that that sort of thing is less and less easy to accomplish in this day and age.
Later edit: Saw your comment below about the rationale for collecting those links; glad that it works for you and that it helps you, sincerely.
I mostly agree with you, there's nothing wrong with it. What's happening is the same thing that goes on everywhere in social media. People projecting their worldviews, opinions, and believing that their opinions are special and The One view that should prevail. That they should have a right to be heard everywhere, that their opinions should carry weight everywhere (regardless of the forum).
It's like we're dealing with a very emotionally & intellectually spoiled generation, brats. The everyone-gets-a-trophy generation. People that can't accept that their opinions don't govern the universe and are not important everywhere; nor are all opinions equally important everywhere. Social media has warped all of that severely. It's almost like it has deluded a mass of people into thinking their opinion broadcast inside of their home/space has (or should have) the same weight when broadcast outside to the world, and that the two should be given the same kind of consideration.
I think this is a very common logic failure. They're unable to mentally separate concepts effectively. In my observation very few people actually invest into thinking, how to think, how to use logic, how to reason. It takes a lot of effort to get good at it.
Critically people need to learn that their opinions do not always matter and are not valuable all the time. It's a concept that the woke, cancel culture generation can't tolerate. I think the US culture needs a hefty dose of this right now: your opinion is not as important as you think it is; your feelings are not that important; feelings are not more important than facts.
It really is the fragile, coddled generation. They can't live with the notion of the lack of their importance. It makes perfect sense though, it's also the hyper narcissistic selfie / influencer generation. It all goes together.
I'm curious if given all that you've shared, you think it's even _possible_ to scale a "healthy" discussion site any larger than HN currently is? It's clear that HN's success is in no small part due to the commitment, passion, and active participation of the few moderators. Contrast that with some of the top comments, which describe how toxic Twitter is, and I wonder if there's some sort of limit to effective moderation, or if we just haven't found more scalable solutions to manage millions of humans talking openly online sans toxicity? cheers
And boy it does get too repetitive sometimes, but it's not all things to all men, it's a US tech forum.
> For me, it exacerbates my long standing worries that abuse of flagging (or the counter-reaction thereto) may be the eventual downfall of HN.
It doesn't really worry me, but for topics that I might have wanted to discuss on HN long ago and still try to on some posts today, I usually spend more time discussing them elsewhere and less time on HN, because they usually have a very short shelf life here for the reasons above. That's fine for me, but maybe not for HN.
[1] Don't thee thou me, thee thou thissen, and 'ow tha likes thee thouing.
Also, in the old days, T/V was asymmetric. If A V'd B, B T'd A. These days we use symmetric address, so either A and B T each other or they V each other. (If A is a business advertising, the choice of V or T implicitly segments their market. Some businesses wimp out and advertise in English to avoid making any decision.)
https://www.airbnb.nl/rooms/38836139?source_impression_id=p3...
https://www.vieffetrade.eu/sale/bathroom-accessories/ibb/acr...
Most sites its size are far, far worse, I think.
I personally believe that is due to human nature.
I think that is what dang has observed and is trying to articulate - no matter how smart or rigorous or mathematical you are, you still are human and thus subject to the human condition.
One way that manifests is this persuasion that the Other is winning the war (and that there is a war, for that matter).
I take it as almost axiomatic that a site with Twitter's volume cannot be anything but the cesspool it is.
It's too big for a single person to even begin to read a statistically-significant fraction of the content.
That means moderation is a hilariously-stupid concept at that scale. Any team of moderators large enough to do the job will itself suffer the fragmentation and conflicts that online forums do, and find itself unable to agree on what the policies should be, let alone how they should be adapted in contentious cases (and by definition, you only need moderation in contentious cases).
PS: Thanks for keeping the trains running on time!
That's the point, in my opinion at least.
Above you say:
> This site may feel like a "consensus echo chamber" but in reality it is nothing remotely close to that. I think you may be running into the notice-dislike bias...
If a certain class of articles gets flagged by a large number of people who have a strong dislike for the topic, and you as the moderator are OK with it being removed because, given HN's culture, it won't generate "intellectually curious discussion", then how can you say that HN is not a "consensus echo chamber" when it comes to these particular topics?
It seems to me there are some very obvious errors in your explanation above. You can run this forum however you want, but being aware of and transparent about what topics are and are not allowed seems like a better way to do it than disingenuously explaining away flaws. No person is omniscient, however much it may seem that way. Just as you can observe flaws in other people's evaluations that they themselves cannot see, is it not possible that you too may have some flaws that you cannot see?
The explanation dang gives above is that it prevents discussion that is not "intellectually curious". This type of discussion occurs on HN on a daily basis, but some topics seem to have an extra layer of moderation filters to go through, presumably because they are nearly guaranteed to create significant disharmony. Which is fine - if optimizing for community harmony takes precedence over free discussion of particularly controversial topics, so be it. I just don't like that combined with a claim that HN is in no way an echo chamber.
But of course, transparency and honesty are simply my personal preferences, and HN can't cater to every individual's personal preferences. Surely there are some people here that would not enjoy having a list of blacklisted topics explicitly published, perhaps because it would give the impression of false equivalency or some other perception like that.
There's nothing necessarily wrong with the flagging itself (well, except in this case I believe a lack of awareness of the content of that documentary likely contributes to ongoing destruction of the earth's ecosystem)...but there is a problem where the moderator of HN claims that what you say above is outright false. Purely a misperception on your part.
Another way I can see it being harmful: a topic never discussed here (or anywhere else that I know of), which I believe may be a key issue in the growing polarization of the world (in turn increasing danger across multiple dimensions), is that there seem to be certain topics that render the human mind unable to sustain consciousness and rational, unbiased thought. Of course Reddit and Facebook are full of this sort of behavior, but there is no shortage of it here on HN either. If solutions to existential threats like climate change require public consensus (do they not?), and even we here on HN are unable to behave in a conscious, logical manner (or even try), then how do we expect the general public to do so? And if no one here is even willing to consider the potential importance of this idea, then those same people shouldn't be too surprised if people like me (and I'm far from unique in this respect) have about as much respect for them as they have for Trump supporters, and roll our eyes at the low-dimensional thinking behind climate change hysteria. If it were really that important to people as intelligent as HN'ers, they should be willing to think - or at the very least, consider the notion of thinking.
As for how this general phenomenon may be dangerous: as a mental experiment, let's assume that it is a very real phenomenon, that does occur in objective physical shared reality. That's bad enough. But now imagine if one or more powerful entities were able to realize that certain things are virtually guaranteed to sink humans into subconscious, unthinking, tribal, non-cooperative behaviour. Could this knowledge be used for nefarious means, and what might the techniques look like? Now, look around the modern world - do we see any new (in the last decade or so) phenomena that have become quite common that may plausibly be invocations of these techniques, to achieve certain goals? Might that perhaps go a little ways to explain the inconceivably irrational behaviour of people on certain topics?
NateEag makes some good points in the sibling comment. You'd have to create the culture at the level of the moderation team, and that's not easy. The way we approach this work on HN has aspects that reach deep into personal life, in a way that I would not feel comfortable requiring of anybody—nor would it work anyhow. If you tried to build such an organization using any standard corporate approach it would likely be a disaster. But maybe it could be done in a different way, or maybe there is an approach that doesn't resemble how we do it on HN.
Would it be possible with the economics of a startup, where the priority has to be growth and/or monetization? Probably less.
For example, the human nature you're talking about is by far the strongest force on HN, and the scale (though tiny compared to Twitter or Facebook or Reddit) is already beyond what one would suppose possible for a forum like this.
An obvious example is where there's a collective disbelief, or perhaps just avoidance, of the bad side of the gig economy since it improves the upper middle-class lifestyle so much that defending it comes naturally even if it's not grounded in social justice.
Is it possible that I may have some flaws that I cannot see? That is beyond possible, it is certain. The trouble with these arguments though is that operating this place is a lot more complicated than people assume it is, and so they say oversimplified things like "aha, you are suppressing topic X so HN is an echo chamber after all" and I have to try to fill in the information gap before we can have a sensible conversation about what the actual flaws might be. I'm super interested in the flaws—but first we have to be talking about the same world, which unfortunately is already not so easy.
That's mistaken. Users in SV are about 10% of the population here, last I checked, but that was for a very wide definition of SV, and by any measure some chunk would not be "tech liberals", so the number is significantly less than 10%.
This site is far more geographically and culturally distributed than people assume it is. I've written about this in several places; one is https://news.ycombinator.com/item?id=23308098 if anyone is interested.
Mises.org and their lengthy critiques of MMT/UBI are shadowbanned. In my reading of what you've explained, it sounds like this is banned because users are incapable of discussing it in good faith?
For me that is a poor reason; we should strive to be better. Moderators should be able to handle it. After all, if the goal is intellectual curiosity, but the community can't accept critical articles of a heterodox economic theory...
Issues surrounding the CCP are another divisive topic. These threads are usually invaded by pro-CCP trolls and whataboutists.
I agree that it would be nice to have an admission of topics or domains which moderators feel HN is incapable of discussing. I expect that this would challenge users to discuss these topics civilly.
"Aus so krummem Holze, als woraus der Mensch gemacht ist, kann nichts ganz Gerades gezimmert werden."
(I'm trying to think of how one might translate the T/V lines in "Küss die Hand schöne Frau" into English, and failing...)
I’ve found that clear, vivid examples from people are crucial torchlights which can be shared around to give people a snapshot of what mods feel or witness. This then allows the conversation with non-mods to progress faster, since this type of storytelling is what people are best optimized to consume.
No.
We hadn’t really found it before the internet either (these problems are endemic to human/sentient nature).
The internet only industrializes things.
There are things you can do that reduce the number of friction points, thus making it possible to self-govern:
1) narrow topics/purpose - the closer to an objective science the better.
2) no politics, no religion - as far as possible.
3) The topic should not be static / largely opinion-oriented. It should be goal-driven, with progress milestones easily discussed and queried (lose weight, get healthy, ask artists, learn Photoshop).
4) clear and shareable tests to weed out posers - r/badeconomics, askhistory
5) strong moderation.
6) Little to no meta discussions
7) directed paths for self promotion
8) get lucky and have a topic that attracts polite good faith debaters who can identify and eject bad faith actors (the holy grail.)
Each of these options removes or modulates a source of drama. With enough of them removed, you can still get flame wars, but it will be better than never having done these things at all.
I would agree that HN is far too big for moderation alone to save it, though I hadn't quite put that together when I wrote my first post.
I think pg's original guidelines managed to capture enough of a cultural ideal that much of the original culture has been preserved organically by the users themselves (though I'm not qualified to speak to the culture of the early years, or how much it has changed since then).
You and the other mod(s?) have done a great job of being a guiding hand, and of understanding that it's too big for anything other than a loose guiding hand to be relevant, from a moderation perspective. You can remove things that shouldn't be discussed, show egregious repeat offenders the door, and encourage people to behave well and be restrained (in large part by example).
Twitter is so much vaster, and grew so fast, that even a guiding hand and good founding culture could not hope to save it. I suspect the way its design encourages rapid-fire back-and-forth also really hurts the nature of interaction on the site.
E.g.:
- For operating systems: only Apple is good, the rest is bad.
- For politics: choose a side.
Also the "if you are not for me, then you are against me" kind of trope.
That's just how it is, I don't think you can change that.
Some people have made their mind up and are only willing to make you change your mind and not listen to any reason from another point of view.
Then there's the abuse.
Where somebody is no longer debating a topic, but starts getting personal. That's almost always a sign of acting in bad faith. Those are the type of posts that have a likelihood of needing some moderation. Once somebody like that pops up, they are likely to have crossed a limit that makes them show up on your radar. Sometimes it is an accident, but other times it is a pattern.
It can be interesting to know why somebody acts like that, but perhaps it is better to not know that. Especially when they are not willing to change their behavior.
I wrote about this a bit here: https://news.ycombinator.com/item?id=23727261. Shirky's famous 2003 essay about internet communities was talking in terms of a few hundred people, and argued that groups can't scale beyond that. HN has scaled far beyond that, and though it is not a group in every sense of that essay, it has retained, let's say, some groupiness. It's not a war of all against all—or at least, not only that.
As we learn more about how to operate it, I'm hoping that we can do more things to encourage positive group dynamics. We shall see. The public tides are very much against it right now, but those can change.
But at the end of the day, in the aggregate, either there is zero slant (by topic) whatsoever, or there is greater than zero. Based on my anecdotal observations over a long period of time, my perception is that there are indeed certain topics that are less welcome than others, and the assurances I've read, while plausible, do not seem adequate. If we were able to see a log of removed topics it may be more reassuring.
I'd rather HN had more freedom of topic discussion at least occasionally as an experiment, and then perhaps we could see if some modifications to guidelines (perhaps just on those threads) could keep things a bit more civilized. If no site is willing to put some effort into finding a workable approach to this problem, it seems reasonable that the world is just going to keep becoming more polarized as people spend more time at sites that are designed from scratch to be information bubbles.