
1525 points garyclarke27 | 9 comments
square_usual ◴[] No.23219602[source]
This is like Google banning the YouTube app from the Play Store for having videos about COVID that aren't from government sources. Insane stuff.
replies(6): >>23219632 #>>23219725 #>>23219733 #>>23219792 #>>23219901 #>>23224664 #
1. laumars ◴[] No.23219733[source]
That’s an interesting example because YouTube are also removing COVID-19 content from some disreputable sources.

To be clear: I’m not defending Google, I also don’t agree with the podcast takedown.

replies(2): >>23220259 #>>23221227 #
2. jeltz ◴[] No.23220259[source]
I found this very quickly: https://www.youtube.com/watch?v=bgln8rjuIMU&feature=share
replies(1): >>23220864 #
3. laumars ◴[] No.23220864[source]
I didn't say their algorithm was flawless, just that they are removing content.

Here's a few citations:

https://www.bbc.com/news/technology-52388586

https://fee.org/articles/youtube-to-ban-content-that-contrad...

https://www.theblaze.com/news/youtube-will-remove-any-corona...

https://www.socialmediatoday.com/news/youtube-ramps-up-actio...

These haven't been cherry-picked; they're the top 4 results on a DDG search.

Your example only demonstrates that you don't understand the difficulty of censoring content on a social platform as enormous as YouTube, not that Google don't have a policy that prohibits such content.

replies(1): >>23224654 #
4. sp332 ◴[] No.23221227[source]
But they're not asking the podcast hosts to remove the content. They're penalizing a podcast player app, which has no ability to remove bad content.
replies(1): >>23221293 #
5. laumars ◴[] No.23221293[source]
Again, I didn’t say I agree with the app’s removal, just that Google were at least being internally consistent.

With regards to the app removal, do we know it is a Google management decision and not the work of an overzealous app reviewer or an algorithm (the latter being the way Google usually operate)?

The reason I talk about internal consistency is because it is hard getting the balance right between removing stuff that should be removed vs stuff that shouldn’t, and that problem is only magnified when doing so at scale. So if this were a management decision then I totally understand the pitchforks, but if it’s a false positive from an algorithm then hopefully Google will rectify it and we can all go back to moaning about Electron or whatever the next meme is.
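
To make that balance problem concrete, here’s a toy sketch in Python (scores and thresholds are invented for illustration; this is not how Google’s systems actually work):

    # Toy illustration: a moderation model scores each item from 0 to 1
    # for "policy violation"; anything at or above a threshold is removed.
    items = [
        ("legit podcast app",      0.35),
        ("actual misinformation",  0.50),  # the model under-scores it
        ("news clip about corona", 0.55),  # the model over-scores it
        ("obvious spam",           0.90),
    ]

    for threshold in (0.40, 0.52, 0.60):
        removed = [name for name, score in items if score >= threshold]
        print(threshold, removed)

    # 0.40 also removes the news clip (false positive); 0.60 keeps the
    # misinformation (false negative); 0.52 manages to do both wrong.
    # No threshold gets all four right, because the model ranked the
    # news clip above the misinformation.

The point isn’t the specific numbers, it’s that once the model mis-ranks two items, no amount of threshold tuning fixes both mistakes.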

replies(1): >>23221483 #
6. sp332 ◴[] No.23221483{3}[source]
I disagree that they are being consistent. An almost-equivalent move would be suspending the entire YouTube app and making a new entry in the Play Store after they have removed all offending content. But at least YouTube has some ability to remove content, so that would actually make more sense than suspending Podcast Addict, which plays third-party-hosted media.
replies(1): >>23222167 #
7. laumars ◴[] No.23222167{4}[source]
They’re suspending accounts on YouTube and pulling videos, just like they’re doing with apps on their Play Store. They can’t moderate the podcasts posted but they can moderate the apps and videos posted. So of course it’s internally consistent.

The real question is whether it should be consistently upheld without exceptions, and the answer to that is obviously “no” because some apps are hubs for user-curated content, like that podcast app is.

As for your comment about them suspending their entire YouTube app, you talk as if this was a management exec decision rather than a rogue judgement that will inevitably get overturned.

Google isn’t a single entity. It’s a collection of people and algorithms all making their own judgements based on Google’s policies. Sometimes they miss stuff they should moderate and sometimes they get overzealous and remove content they shouldn’t. There is such a large grey area and scope for personal judgement that you have to expect some unpopular verdicts from time to time. It’s shit but no two situations are identical so it’s a problem that’s impossible to avoid. The real tell is whether Google reverse the decision once the complaint gets escalated.

8. jeltz ◴[] No.23224654{3}[source]
The one I linked is from a Swedish conspiracy theory channel with 16k subscribers. It has been up for 2 months, has gotten 40k views (huge for Swedish-language content), and does not try to hide itself at all (the channel is called "WakeUpGlobe SE" and uses "corona" in the title). It would be trivial for a human to find this.
replies(1): >>23225041 #
9. laumars ◴[] No.23225041{4}[source]
You can't blanket-ban terms, otherwise you'd end up with far more false positives than a few podcast apps. For example, "corona" is a pretty broad term -- it would be like banning Nazi content but using the word "German" from "National Socialist German Workers' Party" as your identifier, then wondering why half the German-language videos disappear.
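
Here's a minimal sketch of how naive term matching misfires (the titles and the banned-term list are made up for illustration):

    # Naive substring matching against a banned-term list.
    BANNED_TERMS = {"corona"}

    titles = [
        "Corona beer taste test",                  # harmless
        "Fixing corona discharge on power lines",  # electronics
        "Toyota Corona restoration, part 3",       # classic cars
        "Corona-viruset: senaste nytt",            # legit Swedish news
        "The TRUTH about corona THEY won't show",  # the actual target
    ]

    flagged = [t for t in titles
               if any(term in t.lower() for term in BANNED_TERMS)]
    print(flagged)  # all five match: one real hit, four false positives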

> It would be trivial for a human to find this.

Someone literate in Swedish maybe (to gather the context of those key words), but it isn't humans who do this.

Google are big into automation, to the extent that they have machines doing their review. You might consider that wrong, but then you have to ask yourself how many humans it would take to moderate a platform as large as YouTube. I bet you that whatever number you come up with wouldn't be enough, and someone else would say "I found another video that was trivial to find, Google don't hire enough platform moderators!"
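
To put rough numbers on it (the ~500 hours uploaded per minute is the oft-quoted public figure; reviewer speed and shift length are my own assumptions):

    # Back-of-envelope: ~500 hours of video uploaded to YouTube per
    # minute is the commonly cited figure; the rest are guesses.
    upload_hours_per_day = 500 * 60 * 24      # ~720,000 hours/day

    review_speed = 2.0   # assume screening at 2x playback speed
    shift_hours  = 6.0   # effective review hours in an 8-hour shift

    reviewers_per_day = upload_hours_per_day / (review_speed * shift_hours)
    print(round(reviewers_per_day))           # ~60,000 reviewers, daily

And that's just watching everything once at double speed, with no appeals, no second opinions, and no language specialisation.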

Plus it's a pretty horrible job being a professional moderator, spending your whole day reviewing the dregs of society. I've read reports where people who've done it have said it had a very real negative impact on their mental health.

As I said earlier, fixing problems like this at scale is insanely hard. It's one of those things that might seem easy at a superficial level but it's fraught with errors and you can guarantee that whatever decision the moderator makes (be that human or algorithm) someone will be unhappy and claim it's not fair.