
139 points stubish | 4 comments
jackvalentine No.44439355
Australians are broadly supportive of these kinds of actions - there is a view that foreign internet behemoths have failed to moderate themselves and will therefore have moderation imposed on them, however imperfect.

Can’t say I blame them.

replies(3): >>44439415 #>>44439676 #>>44439817 #
AnthonyMouse No.44439817
> there is a view that foreign internet behemoths have failed to moderate for themselves and will therefore have moderation imposed on them however imperfect.

This view is manufactured. The premise is that better moderation is available and that, despite this, literally no one is choosing to do it. The fact is that moderation is hard, and in particular, excluding all actually bad things without also having a catastrophically high false positive rate is infeasible.

But the people who are the primary victims of the false positives and the people who want the bad stuff fully censored aren't the same people, and the second group likes to pretend that there is a magic solution that doesn't throw the first group under the bus - so that they can throw the first group under the bus.

replies(5): >>44439891 #>>44439944 #>>44440013 #>>44440547 #>>44441786 #
Nursie No.44440547
> The fact is that moderation is hard

Moderation is hard when you prioritise growth and ad revenue over moderation, certainly.

We know a good solution - throw a lot of manpower at it. That may not be feasible for the giant platforms...

Oh no.

replies(2): >>44440705 #>>44442149 #
AnthonyMouse No.44440705
This is the weirdest theory. The premise is that you admit the huge corporations with billions of dollars don't have the resources to pay moderators to contend with professional-grade malicious content from profitable criminal syndicates, but some tiny forum is supposed to be able to get it perfect so they don't go to jail?
replies(2): >>44440792 #>>44440867 #
1. Nursie No.44440792
> The premise is that you admit the huge corporations with billions of dollars don't have the resources to pay moderators

My contention is more that they don't have the will, because it would impact profits - and that it's possible that, if they did implement effective moderation at scale, it would hurt their bottom line so much they'd be unable to keep operating.

Further, that I would not lament such a passing.

I’m not saying tiny forums are some sort of panacea, merely that huge operations should not be able to get away with (for example) blatantly fraudulent advertising on their platforms on the basis that “we can’t possibly look at all of it”.

Find a way, or stop operating that service.

replies(1): >>44441248 #
2. AnthonyMouse No.44441248
> My contention is more that they don't have the will, because it would impact profits - and that it's possible that, if they did implement effective moderation at scale, it would hurt their bottom line so much they'd be unable to keep operating.

Is the theory supposed to be that the moderation would cost them users, or that the cost of paying for the moderation would cut too much into their profits?

Because the first one doesn't make a lot of sense: the perpetrators of these crimes are a trivial minority of the user base who inherently cost more in trouble than they're worth in revenue.

And the problem with the second one is that the cost of doing it properly would not only cut into the bottom line but put them deep into the red on a permanent basis - and then it's not so much a matter of unwillingness as of inability.

> I’m not saying tiny forums are some sort of panacea, merely that huge operations should not be able to get away with (for example) blatantly fraudulent advertising on their platforms on the basis that “we can’t possibly look at all of it”.

Should the small forums be able to get away with it, though? They're the ones far more likely to be operating with a third-party ad network they neither have visibility into nor the leverage to influence.

> Further, that I would not lament such a passing.

If Facebook was vaporized and replaced with some kind of large non-profit or decentralized system or just a less invasive corporation, would I cheer? Probably.

But if every social network was eliminated and replaced with nothing... not so much.

replies(2): >>44441403 #>>44441554 #
3. Nursie No.44441403
> the cost of paying for the moderation would cut too much into their profits?

This one. Not just in terms of needing to take on staff: it would also cut into their bottom line by forcing them to stop taking money from bad-faith operators.

> And the problem with the second one is that the cost of doing it properly would not only cut into the bottom line but put them deep into the red on a permanent basis, and then it's not so much a matter of unwillingness but inability.

Inability to do something properly and make a commercial success of it is a 'you' problem.

Take Meta and their ads - they've built a system in which it's possible to register, upload ads, and show them to users more or less instantly, with more or less zero human oversight. There are various filters to try to catch stuff, but they're imperfect, so they serve fraudulent ads to their users all the time - fake celebrity endorsements, various things that fall foul of advertising standards, some just outright scams. (Local family store you never heard of is closing down! So sad! Buy our dropshipped crap from AliExpress at 8x the price!)

To properly, fully fix this they would need to verify advertisers and review ads before they go live. That would slow down delivery, require a moderate-sized army of reviewers, and lose them revenue from the scammers. So many disincentives. So they say "This is impossible", but what they mean is "It is impossible to comply with the law and continue to rake in the huge profits we're used to". They may even mean "It is impossible to comply with the law and continue to run Facebook".

OK, that's a classic 'you' problem (or it should be). It's not really any different from "My chemical plant can't afford to continue to operate unless I'm allowed to dump toxic byproducts in the river". OK, then you can't afford to operate, and if you keep doing it anyway, we're going to sanction you. So... bye then?

> Should the small forums be able to get away with it though?

This is not really part of my argument. I don't think they should, no. But again - if they can't control what's being delivered through their site and there's evidence it contravenes the law, that's a them problem, and they should stop using those third-party networks until the networks can demonstrate proper compliance.

> if every social network was eliminated and replaced with nothing... not so much.

Maybe it's time to find a new funding model. It's bad enough having a funding model based on advertising. It's worse having one based on throwing ad messages at people cheap and fast without even checking that they meet basic legal standards. But here we are.

I realise this whole thing is a bit off-topic, as the discussion is about age verification and content moderation, and I've strayed heavily into ad models...

4. pferde No.44441554
Smaller forums are more likely to handle moderation effectively and in a timely manner. I frequent a few such forums and have seen consistently good moderation for many years.