I don't like the mob thing either but it's how large group dynamics on the internet work (by default). We try to mitigate it where we can but there's not a lot of knowledge about how to do that.
Are there people whose upvotes count for more than others? Or are these actively suppressed? Either way, it makes it hard to have important/robust conversations when the number of people who see them gets suppressed.
Re the second bit: there aren't any accounts whose upvotes count for more, but if accounts upvote too many bad* comments and/or get involved in voting rings, we sometimes make their votes not count anymore.
* By "bad" I mean bad relative to HN's intended purpose as defined here: https://news.ycombinator.com/newsguidelines.html. Relative to that, "bad" means snark, flamewar, ideological battle, etc. — all the things that zap intellectual curiosity.
In terms of moderator action: we might downweight ChatGPT topics (for or against) if they seem repetitive rather than adding significant new information (https://hn.algolia.com/?dateRange=all&page=0&prefix=false&so...). But we don't downweight posts that are critical of YC companies—or rather, we do so less than we would downweight similar threads on other topics. See https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu....
Are you sure there aren't abuses by your portfolio companies' managers/employees to flag negative stories? I imagine Sam, for example, knows exactly what he has to do to get ChatGPT criticism guided off the stage.
Edit: for example, do you know what happened with this story? https://news.ycombinator.com/item?id=35245626
This is a very interesting/important topic, and it was a new one. It was really hot in the first hour, and then just got smashed off the front page.
Quite sure. That is, there may be managers/employees of $companies trying to flag things, but being a YC portfolio company doesn't make that any easier. And yes, I'm sure that Sam can't do that. (I also know that he wouldn't try, but that's a separate point.)
Re the FAQ: it doesn't give a detailed explanation (we can't do that without publishing our code), but it summarizes the factors comprehensively. If you want to know more, I need to see a specific link. Speaking of which:
Re https://news.ycombinator.com/item?id=35245626: it was on HN's front page for 4 hours, and at some point was downweighted by a mod. I haven't checked why, but I think most likely it was just our general approach of downweighting opinion pieces on popular topics. Keep in mind that the LLM tsunami is an insanely popular topic—by far the biggest in years—and if we weren't downweighting follow-ups a la https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que..., it would saturate the front page every day.
Actually, we tend not to do that moderation on randomwalker posts (https://news.ycombinator.com/user?id=randomwalker), because they're basically always excellent. But a certain amount of randomness is inescapable, and randomwalker posts do great on HN most of the time. If we made the wrong call in this case, so much the worse for us, and I'm genuinely sorry.
It makes everyone wonder: was this a 'mistake'? Or was it that once-in-a-rare-occasion when YC chooses to cash in its good reputation to suppress a discussion that would be costly to its friends? It sounds like all they need to do is ask one mod to take care of it, and it goes away pretty quickly.
Can you link me to that?
> The world is filled with people/organizations who do the right thing almost all the time, but then use that clout to do a bad thing when it really matters.
That's a good point! But it's also an unfalsifiable charge. In fact, someone who behaved perfectly forever would be no less accusable of this. Btw I'm certainly not saying we behave perfectly—but we do take care to moderate HN less, not more, when YC-related interests are part of a story (https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...). That's for reasons of self-interest as much as anything else. It wouldn't make sense to risk the global optimum for local gains.
> It sounds like all they need to do is ask one mod to take care of it, and it goes away pretty quickly.
People are going to feel like that's happening no matter what we do, but FWIW, we don't do that. We do downweight submissions as part of moderation practices that have been established for years, but a YC person doesn't have any more clout over that than you do, if you happen to email us and ask us to take a look at a particular thread (pro or con). And we always answer questions about what happened when people ask.
Btw if you feel like that randomwalker article is still relevant and can support a discussion of something specific and interesting—that is, not yet-another-generic-AI thread—go ahead and repost it and let me know, and I'll put it in the second-chance pool (https://news.ycombinator.com/pool, explained at https://news.ycombinator.com/item?id=26998308), so it will get a random placement on HN's front page at least for a while (how long depends on how the community reacts).
I can't, but really? Every major announcement from them has been top of page. And I don't disagree with that being the case. ChatGPT is THE story of 2023 tech, and their announcements are important to the tech industry. I'd just like all the discussions around this hugely important topic to be given the same freedom to succeed.
Thanks for the discussion.
As far as I know that's not accurate, or even close.
We're not playing favorites; all we care about is that the most interesting stories get front-page time, since there are many more submissions than space on the front page.