
286 points by saikatsg | 3 comments
ktosobcy:
The EU should do the same (FB & X).

In general, anything with "algorithmic content ordering" that pushes content triggering strong emotional reactions should be banned and burned to the ground.
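
For concreteness, here is a minimal, purely illustrative sketch of what that kind of ordering means next to a plain chronological feed; the Post fields, the predicted_reactions signal, and the function names are hypothetical, not any platform's actual ranking system:

    # Illustrative only: "algorithmic content ordering" as engagement-ranked
    # sorting, versus a plain reverse-chronological feed. All names hypothetical.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass
    class Post:
        author: str
        text: str
        posted_at: datetime
        predicted_reactions: float  # model's guess at likes/angry reacts/replies

    def chronological(posts: list[Post]) -> list[Post]:
        # Newest first; no engagement signal involved.
        return sorted(posts, key=lambda p: p.posted_at, reverse=True)

    def engagement_ranked(posts: list[Post]) -> list[Post]:
        # Posts expected to provoke the strongest reactions float to the top,
        # regardless of when they were posted.
        return sorted(posts, key=lambda p: p.predicted_reactions, reverse=True)

    if __name__ == "__main__":
        now = datetime.now(timezone.utc)
        feed = [
            Post("a", "calm family update", now, predicted_reactions=2.0),
            Post("b", "outrage bait", now, predicted_reactions=95.0),
        ]
        print([p.text for p in engagement_ranked(feed)])  # outrage bait first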

godshatter:
I'm not a big fan of banning things like this. There's good mixed in with the bad, and banning things will only lead to new social media sites rising in their place. I don't expect them to be any better.

This is basically a fight against human nature. If I could get one wish, it would be legislation that forces social media sites to explain in detail how their algorithms work. I have to believe that a company could make a profitable social media site that doesn't try every trick in the book to hook users to their site and rile them up. It may not be Meta-sized, but I would think there would be a living in it.

op00to:
I don’t think people want to understand how algorithms manipulate them.
strbean:
> I'm not a big fan of banning things like this.

I think this is a near-perfect use case for banning. The harms mostly derive from the business model. If the social media companies were banned from operating these platforms, and the bans were evaded only by DIYers, Mastodon, and the like, most of the problems would disappear.

When there's still money in the black market alternative, banning doesn't work well (see: narcotics).