286 points saikatsg | 70 comments
1. ktosobcy ◴[] No.45137736[source]
EU should do the same (FB & X).

In general, anything that has "algorithmic content ordering" that pushes content triggering strong emotional reactions should be banned and burned to the ground.

replies(16): >>45137822 #>>45138118 #>>45138157 #>>45138164 #>>45138363 #>>45138553 #>>45138596 #>>45138634 #>>45138850 #>>45139915 #>>45141003 #>>45141109 #>>45141695 #>>45141755 #>>45143915 #>>45147812 #
2. thinkingtoilet ◴[] No.45137822[source]
It's such an obvious poison. Social media is responsible for the destruction of civility on so many levels. It has destroyed a generation's attention span. It is a drug more powerful and addictive than something like weed. It seems like people here are too young to remember life before it. It has transformed society negatively in just a decade. It absolutely should go. I'm glad you did something positive on it, or found a community. You can still do that without social media. It needs to go.
replies(3): >>45138000 #>>45138421 #>>45148447 #
3. ktosobcy ◴[] No.45138000[source]
IMHO there were better communities on old forums...
replies(1): >>45138338 #
4. eviks ◴[] No.45138118[source]
Only with all the censors as kindling!
5. ◴[] No.45138157[source]
6. thinkingtoilet ◴[] No.45138338{3}[source]
And it was contained. If you have a small group, you can manage an asshole or two, sometimes it can even be endearing ("he's an asshole, but he's our asshole"). Once the numbers start going up the toxicity increases by orders of magnitude. It's impossible to moderate. The benefits nearly all fall away and the negatives are amplified. Add on the smartest people in the world working very hard to get everyone, including children, addicted to social media and it's fucking nefarious.
replies(2): >>45138564 #>>45139176 #
7. nradov ◴[] No.45138363[source]
Fortunately the US federal government is standing up for the interests of US tech companies, and for the principle of free speech. They won't let the EU get away with such an extreme authoritarian move.
replies(8): >>45138488 #>>45138661 #>>45138788 #>>45138937 #>>45138965 #>>45139184 #>>45139234 #>>45144482 #
8. daoboy ◴[] No.45138368[source]
I strongly agree that free speech is crucial, but the first part of your statement is in direct opposition to the second.

People stating their perspectives and arguing against others' with complete disregard for civility (or being 'mean', as you said) makes it far more difficult for people to respect opposing viewpoints.

replies(1): >>45138670 #
9. alistairSH ◴[] No.45138447[source]
The problem, to me anyway, is that FB etc. don't serve me the opinions of people I know and want to engage with. Instead, they serve me a stream of content specifically tailored to annoy me, deliver the dopamine hit, and make me react.

Of course, my solution was to stop using those services. But, I wouldn't be surprised if certain personality types are unable to do that (same as they can't quit smoking or porn or whatever else).

10. miltonlost ◴[] No.45138488[source]
Lol a content algorithm is not free speech
replies(1): >>45138748 #
11. plopilop ◴[] No.45138553[source]
Sooo... should we ban Google too? It also orders its search results with algorithms. Similarly, HN and Reddit order the contents of their front pages with algorithms, and in the case of Google and Reddit, the algorithm is personalized to the user's preferences.

Or do we only ban websites that design their algorithms to trigger strong emotional reactions? How do you define that? Even Musk doesn't go around saying that the algorithm is modified to promote the alt-right; instead he pretends it is all about "bringing balance back". Furthermore, I would argue that vote-based systems such as Reddit or HN are much more likely than other systems to push such content. We could issue a regulation banning specific platforms or websites (TikTok, X...) by naming them individually, but that would probably go against many rules of free competition, and would be quite easily circumvented.

Not that I disagree on the effect of social media on society, but regulating this is not as easy as "let's ban the algorithm".

replies(1): >>45138601 #
12. threetonesun ◴[] No.45138564{4}[source]
Ah, this reminds me of the one asshole on the old car forum I used to heavily participate in, who would tell new users how dumb all their ideas for modifying their cars were. And yes, some would argue back, and then someone else would step in and point out all the threads from the cranky asshole where he'd already tried everything they were suggesting.
13. richwater ◴[] No.45138596[source]
[flagged]
replies(2): >>45139663 #>>45143304 #
14. ktosobcy ◴[] No.45138601[source]
Erm, FB itself admitted they did research on emotional responses to the content they show.

FB/X's modus operandi is to keep as many people glued to the screen for as long as possible. The most triggering content will awaken all those "keyboard warriors" to fight.

So instead of seeing your friends and the people you follow on there, you would mostly see something that would affect you one way or another (hence the proliferation of more and more extreme stuff).

Google is going downhill for different reasons - they also care only about the investors' bottom line, but being the biggest ad provider, they don't care all that much whether people spend time on the google.com page or not.

replies(1): >>45139262 #
15. ktosobcy ◴[] No.45138629[source]
How do you learn to "deal with and tolerate" people who constantly spit in your face? Especially if you try to avoid them but the platform does everything in its power to steer them your way?

It's basically a dark entity that cranks up football hooligans and then pushes them onto a collision course.

There is no civility there.

16. wmeredith ◴[] No.45138634[source]
I saw a really good analogy the other day (on X, natch) that said subscribing to modern social media is like inviting a clown to come in your house every 10 minutes and scream, "It's gotten worse". I think about that a lot. Curation goes a long way, but it takes work.
replies(5): >>45139109 #>>45139153 #>>45139692 #>>45140015 #>>45144032 #
17. ktosobcy ◴[] No.45138661[source]
Can the US ef-off and keep this civil and social enshittification to itself? The rest of the world would be very happy if the US finally put a wall around itself and stopped meddling with every darn scrap of the world...
18. spacebanana7 ◴[] No.45138670{3}[source]
On the contrary, I think a meaningful part of the population is incapable of digesting ideas without them being coupled to conflict. You don't need to respect opposing viewpoints in order to engage with them.

For such people everything must be framed in a good versus evil, us vs them or generally sensationalist manner to sustain any kind of attention.

19. krapp ◴[] No.45138748{3}[source]
All software is free speech, end of.

It's insane that the same community that rails against attempts to police encryption, that believes in the ethos of free software, that "piracy isn't theft" and "you can't make math illegal" and that champions crypto/blockchain to prevent censorship is so sympathetic to banning "content ordering algorithms."

The problem is not the algorithms, the problem is the content, and the way people curate that content. Platforms choosing to push harmful content and not police it is a policy issue.

Is the content also free speech? Yes. But like most people I don't subscribe to an absolutist definition of free speech nor do I believe free speech means speech without consequences (absent government censorship) or that it compels a platform.

So I think it's perfectly legitimate for platforms to ban or moderate content even beyond what's strictly legal, and far less dangerous than having governments use their monopoly on violence to control what sorting algorithms you're allowed to use, or to forcibly nationalize and regulate any platform that has over some arbitrary number of users (which is something else a lot of people seem to want.)

We should be very careful about the degree of regulation we want governments to apply to what is in essence the only free mass communications medium in existence. Yes, the narrative is that the internet is entirely centralized and controlled by Google/Facebook/Twitter now but that isn't really true. It would absolutely become true if the government regulated the internet like the FCC regulates over the air broadcasts. Just look at the chaos that age verification laws are creating. Do we really want more of that?

replies(1): >>45140858 #
20. a_ba ◴[] No.45138788[source]
This administration is not standing up for the principles of free speech. It has violated this principle numerous times in action and in spirit.
21. sniffers ◴[] No.45138794[source]
"We should kill/imprison peoples who have (immutable characteristic)" is hardly just a "mean thing people post".

There's mean content ("I think you are an asshole") and there's content that's going to cause actual harm because it either goads others to violence or because it creates a constant cortisol increase from fear and dehumanization.

22. tomp ◴[] No.45138850[source]
Why would "algorithmic" outrage-porn content (X, Meta) be any worse than human-ordered outrage-porn content (news websites)?
replies(1): >>45139007 #
23. myvoiceismypass ◴[] No.45138937[source]
> for the principle of free speech

Indeed. You are free to praise the president or face the consequences. Some freedom.

24. maleldil ◴[] No.45138965[source]
> standing up for the interests of US tech companies

Imagine if they stood up for the interests of citizens instead.

25. ktosobcy ◴[] No.45139007[source]
News websites are regulated…
replies(1): >>45139820 #
26. mrcwinn ◴[] No.45139109[source]
Not to the same degree, but I'd argue HN has the same tendencies. Cynical, skeptical, assuming the worst intentions, a bogeyman tech giant hoping to destroy its own customers. Skepticism is, of course, healthy, but the default behavior in this community completely misses the reality that had we frozen progress, say, right around the Apple II launch, we would never have gotten HackerNews itself. :)

And if you accept my premise, it's probably not the websites, but rather the humans themselves.

replies(2): >>45139134 #>>45144064 #
27. hshshshshsh ◴[] No.45139134{3}[source]
Have you worked in a fortune 500?
28. fluoridation ◴[] No.45139153[source]
It just comes down to how you use it. I use Twitter and BlueSky exclusively to follow artists, and all I see is art. If I didn't come to HN, I don't think I'd hear about any news.
replies(1): >>45140920 #
29. diggan ◴[] No.45139176{4}[source]
> Once the numbers start going up the toxicity increases by orders of magnitude. It's impossible to moderate.

As someone who spent an embarrassingly long time on what lots of people claim to be the most toxic forum in the world (not sure about that, it's the biggest in the Nordics though, that's for sure), and even moderated some categories on that forum that many people wouldn't touch with a ten-foot pole, it really isn't that hard to moderate even when the topics are sensitive and most users are assholes.

I'd argue that moderation is difficult today on lots of platforms because it's happening too much "on the fly" so you end up with moderators working with the rules differently and applying them differently, depending on mood/topic/whatever.

If you instead make a hard list of explicit rules, with examples, and also establish internal precedents that moderators can follow, a lot of the hard work around moderation basically disappears, regardless of how divisive the topic is. But it's hard and time-consuming work, and requires careful deliberation and transparent ruling.

replies(1): >>45140725 #
30. pessimizer ◴[] No.45139184[source]
> for the principle of free speech

This administration is taking a newly-formed censorship regime that was largely operated by the nepo babies of politicians running do-nothing tax-supported nonprofits, but implemented and operated by Mossad agents, and removing the nepo babies from the loop.

You can say "retard" now, but if you call somebody who executes Palestinian children a retard, you're going on a government blacklist.

edit: This post has been classified and filed, and associated with me for the rest of my life.

31. jajko ◴[] No.45139234[source]
Interest of tech companies (or more specifically their stockholders), for sure. Not so much for the long term interests of its citizens though.
32. plopilop ◴[] No.45139262{3}[source]
Oh, I know that strong emotions increase engagement, outrage being a prime candidate. I have also no issue believing that FB/TikTok/X etc aggressively engage in such tactics, e.g. [0]. But I am not aware of FB publicly acknowledging that they deliberately tune the algorithm to this effect, even though they carried some research on the effects of emotions on engagement (I would love to be proven wrong though).

But suppose FB did publicly say they manipulate their users' emotions for engagement, and a law is passed preventing that. How do you assess that the new FB algorithm is not manipulating emotions for engagement? How do you enforce your law? If you are not allowed to create outrage, are you allowed to promote posts that expose politicians' corruption? Where is the limit?

Once again, I hate these algorithms. But we cannot regulate by saying "stop being evil", we need specific metrics, targets, objectives. A law too broad will ban Google as much as Facebook, and a law too narrow can be circumvented in many ways.

[0] https://www.wsj.com/tech/facebook-algorithm-change-zuckerber...

replies(1): >>45141537 #
33. woodpanel ◴[] No.45139663[source]
[flagged]
replies(1): >>45143311 #
34. socalgal2 ◴[] No.45139692[source]
Exactly why I often think I should stop reading HN
35. socalgal2 ◴[] No.45139820{3}[source]
They are? As far as I can tell they are no more regulated than anyone else.

There is the issue that a news site generally has a limited number of contributors, whereas a social media site has an infinite number of contributors.

In either case, it seems like the same laws (defamation, fraud, etc.) apply to the authors of the posts, which might be easier to target when it's a news site, as the site itself takes responsibility for the content.

replies(1): >>45141152 #
36. godshatter ◴[] No.45139915[source]
I'm not a big fan of banning things like this. There's good mixed in with the bad and banning things will only lead to new social media sites rising in their place. I don't expect them to be any better.

This is basically a fight against human nature. If I could get one wish, it would be legislation that forces social media sites to explain in detail how their algorithms work. I have to believe that a company could make a profitable social media site that doesn't try all the tricks in the book to hook users to the site and rile them up. It may not be Meta-sized, but I would think there would be a living in it.

replies(3): >>45139984 #>>45142219 #>>45147984 #
37. op00to ◴[] No.45139984[source]
I don’t think people want to understand how algorithms manipulate them.
38. op00to ◴[] No.45140015[source]
The clown also shows you pictures of how awesome everyone else is doing and asks why you are so fat and ugly and boring in comparison.
39. __s ◴[] No.45140725{5}[source]
I think part of that was volunteer moderation. Were you paid to moderate those boards? Most moderation was a form of community involvement

Recent social media (& maybe "recent" no longer applies) doesn't have this kind of community run tooling

replies(1): >>45142203 #
40. op00to ◴[] No.45140858{4}[source]
End of what?
41. ◴[] No.45140920{3}[source]
42. Karrot_Kream ◴[] No.45141003[source]
> pushes content triggering strong emotional reactions should be banned

Aren't you describing your own comment? Aren't upvotes pushing that to the top? So isn't HN the thing that needs to be banned according to your comment?

replies(3): >>45141072 #>>45143034 #>>45143090 #
43. abdullahkhalids ◴[] No.45141072[source]
No. Facebook's algorithm produces different outputs for every user. HN's algorithm produces one output for all users.

They are qualitatively distinct. Facebook's algorithm is demonstrably harmful; HN's, not so much.

replies(1): >>45141092 #
44. Karrot_Kream ◴[] No.45141092{3}[source]
Do you have proof that demonstrates that FB's algorithm is more harmful than upvotes on HN or Reddit? Not that it's harmful compared to a world before FB, that it's more harmful than an upvote based algorithm.
45. rasmus-kirk ◴[] No.45141109[source]
I like this, but it also leaves the door wide open to censorship. It would also include YouTube, which would be a marked deterioration in learning.
replies(1): >>45141682 #
46. ktosobcy ◴[] No.45141152{4}[source]
Yes, they are (not sure about the US).

In general, the mere fact that there is a limited number of contributors, who are known and have indicated authorship, goes a long way. Also, all publishers have to register, indicating who is behind a particular "medium".

With social media, by contrast, there is no accountability. Anyone can publish anything and there is basically no information about who published it. You can sue, but the publishing platform has no information about the author, so the process is long and convoluted.

Making social media what it started as (a network of close friends) where you only see the content they publish, plus a requirement for actual details of who is behind a particular profile (say, for pages/profiles with more than 10k followers, at which point - let's be honest - it's not a "friend" anymore), would go a long way.

47. mschuster91 ◴[] No.45141537{4}[source]
> But we cannot regulate by saying "stop being evil", we need specific metrics, targets, objectives.

Ban any kind of provider-defined feed that is not chronological, or that includes content from users the user does not follow, with an exception for clearly-marked advertising. Easy to write as a law, even easier to verify compliance.
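
A minimal sketch of what a compliant feed might look like under such a rule (the types and names here are hypothetical, purely for illustration):

```python
from datetime import datetime
from typing import NamedTuple

class Post(NamedTuple):
    author: str
    created: datetime
    text: str

def compliant_feed(posts: list[Post], follows: set[str]) -> list[Post]:
    # Only content from accounts the user follows, strictly
    # reverse-chronological - no engagement-based reordering.
    visible = [p for p in posts if p.author in follows]
    return sorted(visible, key=lambda p: p.created, reverse=True)
```

Verifying compliance would then amount to checking that the rendered feed is exactly this function of its inputs.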

replies(1): >>45144178 #
48. Krssst ◴[] No.45141682[source]
We can have freedom of expression with a regular chronological feed from selected followed users. There's no need for a smart feed that optimises whatever the entity owning the network wants.
49. rdm_blackhole ◴[] No.45141695[source]
Yes, let's give more power to the EU, the entity that's been trying to ban encryption within the EU for the last 3 years and wants to read all your messages, scan all your pictures, but pinky promise, it won't use the data to hunt down political dissidents or silence opposing views.

I am sure it's going to be swell.

Let's also require tech companies to only allow content that has been approved by the central committee for peace and tolerance (TM) while we are at it!

No risk of censorship there.

replies(1): >>45143107 #
50. amelius ◴[] No.45141755[source]
Let's start with banning the monetization model.
51. diggan ◴[] No.45142203{6}[source]
> Were you paid to moderate those boards?

No, none of the moderators were paid, but I do think the ~2/3 admins were paid. But yeah, I did it purely out of the want for the forum to remain high-quality, as did most of the other moderators AFAIK.

> Recent social media (& maybe "recent" no longer applies) doesn't have this kind of community run tooling

Agree, although reddit with its "every subreddit is basically its own forum but not really" (admins still delete stuff you wouldn't + vice-versa) kind of did an extreme version of community run tooling, with the obvious end result that moderation is super unequal across reddit, and opaque.

Bluesky is worth mentioning as well, with their self-proclaimed "stackable" moderation, which is kind of some fresh air in the space. https://bsky.social/about/blog/03-12-2024-stackable-moderati...

52. strbean ◴[] No.45142219[source]
> I'm not a big fan of banning things like this.

I think this is a pretty perfect use case for banning. The harms are mostly derived from the business model. If the social media companies were banned from operating them, and the bans were evaded by DIYers, Mastodon and the like, most of the problems disappear.

When there's still money in the black market alternative, banning doesn't work well (see: narcotics).

53. jerrycruncher ◴[] No.45143034[source]
This is a really canonical example of a "Yet you participate in a society. Curious!" post. Well done.

[0] https://imgur.com/we-should-improve-society-somewhat-T6abwxn

replies(1): >>45143495 #
54. blargey ◴[] No.45143090[source]
The opposite, actually - I remember reading that HN downranks posts that have a low favorability:engagement ratio - in its case, high comment count and comparatively low votes. The reasoning being that flamebait topics inspire a disproportionate number of angry/low-substance/pile-on comments and retort-chains compared to normal topics, without garnering a corresponding increase in top-level votes.

It's imperfect, but afaik most social media does the opposite (all "engagement" is good engagement), and I imagine, say, Twitter would be much nicer if it tuned its algo to not propagate posts with an unusually high view/retweet count relative to likes.
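
A toy sketch of that kind of heuristic (this is not HN's actual code; the constants and the penalty rule are made up for illustration):

```python
def rank_score(votes: int, comments: int, age_hours: float) -> float:
    # Gravity-style decay: score falls off as the story ages.
    base = (votes - 1) / (age_hours + 2) ** 1.8
    # Flamebait signal: far more comments than upvotes.
    heat = comments / max(votes, 1)
    penalty = 1.0 / heat if heat > 2.0 else 1.0
    return base * penalty
```

Under this toy rule, a thread with 50 votes and 300 comments ranks below one with 50 votes and 40 comments at the same age, while threads with ordinary comment-to-vote ratios are untouched.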

replies(2): >>45143333 #>>45143439 #
55. Jon_Lowtek ◴[] No.45143107[source]
The EU is not a single mind; it is a multi-party democracy. Yes, there are forces in it that have been pushing for "lawful interception" for some time now. And they have always failed to ban E2E encryption.

In the USA there exist similar forces who also introduced bills with similar ideas multiple times in the last decade. One of those is currently in congress.

56. dang ◴[] No.45143304[source]
Could you please stop posting flamebait, ideological battle comments, etc? You've unfortunately been doing it repeatedly. It's not what this site is for, and destroys what it is for.

If you wouldn't mind reviewing https://news.ycombinator.com/newsguidelines.html and taking the intended spirit of the site more to heart, we'd be grateful.

57. dang ◴[] No.45143311{3}[source]
Please don't respond to a bad comment by breaking the site guidelines yourself. That only makes things worse.

https://news.ycombinator.com/newsguidelines.html

58. setsewerd ◴[] No.45143333{3}[source]
That's interesting; it seems like it would accidentally penalize a lot of "good" posts too, like people asking questions to better understand a topic/perspective.
59. Karrot_Kream ◴[] No.45143495{3}[source]
Thanks. I try really hard. Wait was that supposed to be a backhanded compliment? No way, can't be, HN is above that kind of behavior (:

My point, overall, is that all the criticism of social media that excludes HN is based on vibes. And if we're about to ban social media in the EU, then hopefully we have more than vibes to go on.

60. bsder ◴[] No.45143915[source]
Agreed. If anyone in the medical community tried the stuff that Facebook and Google do, it would fail immediately at an ethics review board and/or the person would lose their medical license.
61. Mistletoe ◴[] No.45144032[source]
Beautiful description of our current life.
62. yannyu ◴[] No.45144064{3}[source]
It's one thing to have a community that has tendencies towards cynicism, skepticism, and assuming the worst. It's another thing to build an algorithm optimized for "engagement" which prioritizes polarizing content above all others because it keeps people addicted to the platform.

Maybe the problem is the websites that amplify the most controversial and problematic content because they get the most clicks, so these companies can report better DAUs and MAUs.

replies(2): >>45144444 #>>45145712 #
63. plopilop ◴[] No.45144178{5}[source]
You have banned Google, Reddit, and HN.
replies(1): >>45147101 #
64. int_19h ◴[] No.45144444{4}[source]
The problem is that humans themselves will amplify the most controversial and problematic content because anger is one of the strongest emotions.

https://www.youtube.com/watch?v=rE3j_RHkqJc

65. int_19h ◴[] No.45144482[source]
I'm very skeptical of EU censorship, but EU citizens can and should figure it out for themselves. There's no reason why we Americans should be telling them how to run their economies, nor do we have some intrinsic right for our companies to operate in any random market.
66. realz ◴[] No.45145712{4}[source]
Might I add that Facebook has also proven time and time again that they have zero ethics. They will happily boost a dictator's post. They'll happily help a rapist win an election. They'll happily let you sell addictive content to kids. Heck, they'll even give you easy ways to target ads at "depressed 14-year-old girls" specifically.
67. mschuster91 ◴[] No.45147101{6}[source]
Google is not social media, Reddit and HN offer ranking based on karma instead of addiction algorithms.
replies(1): >>45153961 #
68. txrx0000 ◴[] No.45147812[source]
There is immense value in the ability to share realtime events with the rest of the world. If the curation algorithm is the problem, then the solution should target only that, not "BLOW IT ALL UP". There are a few ways:

1) We can build open-source clients with user-configurable client-side recommendation algorithms.

2) We can shame the people actively working to make this problem worse, especially if they make 1) or 3) harder.

3) We can build decentralized protocols like Nostr to pry social media from the hands of tech giants altogether.

These solutions are not mutually exclusive, so we should pursue all of them.
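
Idea 1) could be as simple as a scoring function whose weights live in the user's own config; the field names here are hypothetical, standing in for whatever signals a client on a protocol like Nostr would compute locally:

```python
def score(post: dict, weights: dict[str, float]) -> float:
    # Weighted sum over whatever signals the client computes locally.
    return sum(w * post.get(k, 0.0) for k, w in weights.items())

def rank(posts: list[dict], weights: dict[str, float]) -> list[dict]:
    return sorted(posts, key=lambda p: score(p, weights), reverse=True)

# One user's config might reward recency and punish rage-bait...
calm = {"recency": 1.0, "from_followed": 2.0, "outrage": -5.0}
# ...while another's is pure chronology.
chrono = {"recency": 1.0}
```

The point is that the weights belong to the user, not the platform, so "the algorithm" stops being something done to you.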

69. __MatrixMan__ ◴[] No.45148447[source]
What needs to go is advertising.

The evils of social media are not consequences of people using the internet to connect with other people, they're consequences of people using platforms where you can buy a following instead of having to earn it.

70. plopilop ◴[] No.45153961{7}[source]
None of these were in your initial law. Furthermore, karma is also addictive.