
668 points by wildmusings | 3 comments
soup10 No.13027227
Reddit has large radical left and radical right communities that are constantly fighting and overwhelming the site with political drama and propaganda. If they don't do something about it, they are going to lose the users who come there to look at cats.
replies(5): >>13027266 #>>13027301 #>>13027422 #>>13027619 #>>13027681 #
ng12 No.13027619
That's the problem -- there should be no "Reddit". The only interesting parts of Reddit are the communities, and there's no reason why the_donald and SRS can't exist under the same reddit.com domain. I really wish /u/spez et al. would take a hands-off approach except for cases where users are breaking the law.
replies(3): >>13027709 #>>13027740 #>>13027841 #
skybrian No.13027740
Yep, that's the libertarian dream. It makes me nostalgic for the '90s, when the web was new.

Now we know that spam and abuse make any large Internet forum suck. Your choice: moderation or cesspool.

Also, getting rid of the really extreme filth on the Internet is no fun, and people generally have to be paid to do it -- which is one of the things that keeps the larger social networks in business.

replies(4): >>13027753 #>>13027865 #>>13028447 #>>13036741 #
1. grzm No.13027753
"no fun" is an understatement. It can be psychologically damaging. I found this article really revealing:

http://www.theverge.com/2016/4/13/11387934/internet-moderato...

replies(2): >>13028100 #>>13028101 #
2. trymas No.13028100
That's very long. Maybe a TL;DR?
replies(1): >>13028160 #
3. grzm No.13028160
Here's an excerpt from the article:

In an October 2014 Wired story, Adrian Chen documented the work of front line moderators operating in modern-day sweatshops. In Manila, Chen witnessed a secret "army of workers employed to soak up the worst of humanity in order to protect the rest of us." Media coverage and researchers have compared their work to garbage collection, but the work they perform is critical to preserving any sense of decency and safety online, and literally saves lives — often those of children. For front-line moderators, these jobs can be crippling. Beth Medina, who runs a program called SHIFT (Supporting Heroes in Mental Health Foundational Training), which has provided resilience training to Internet Crimes Against Children teams since 2009, details the severe health costs of sustained exposure to toxic images: isolation, relational difficulties, burnout, depression, substance abuse, and anxiety. "There are inherent difficulties doing this kind of work," Chen said, "because the material is so traumatic."

The whole thing is worth a read, in a couple of sessions if necessary.