1525 points garyclarke27 | 30 comments
1. lord_erasmus ◴[] No.23219890[source]
In most of these stories featuring Google abusing their power to remove apps, it's usually a matter of some automated tool gone wrong, and the problem is solved a couple of days later. But this time it's different: they are actually asking developers to censor themselves if they are not affiliated with a government.
replies(5): >>23220004 #>>23220611 #>>23220957 #>>23220962 #>>23221130 #
2. gundmc ◴[] No.23220004[source]
What makes you think this is something other than another (awful) high profile case of automation gone wrong?
replies(3): >>23220108 #>>23220199 #>>23221080 #
3. fauigerzigerk ◴[] No.23220108[source]
Because their action seems consistent with their stated intention of banning all non-official speech on Covid-19?

It could still be reversed if they feel public opinion swings the other way. That wouldn't mean it's automation gone wrong.

replies(2): >>23220303 #>>23221604 #
4. thu2111 ◴[] No.23220199[source]
The tool is clearly implementing the policy as described. It's just a policy that's stupid, arrogant and naive.
5. random32840 ◴[] No.23220303{3}[source]
It may be automated based on frequency of reports, but either way this is unlikely to be company policy. The people who make these decisions are relatively low-level employees following a company guidebook. The guidebook says it has to go? It has to go. The employee doesn't want to get fired.

I doubt it'll stick.

replies(1): >>23220339 #
6. fauigerzigerk ◴[] No.23220339{4}[source]
>...but either way this is unlikely to be company policy

Perhaps you haven't seen the article because it's behind an Apple News link. There's a screenshot of a message stating company policy as follows:

"Pursuant to Section 8.3 of the Developer Agreement and the Enforcement policy, apps referencing Covid-19, or related terms, in any form will only be approved for distribution on Google Play if they are published, commissioned or authorized by official government entities or public health organizations"

replies(3): >>23220810 #>>23221168 #>>23222515 #
7. imron ◴[] No.23220611[source]
> and the problem is solved a couple of days later.

If the developer is lucky enough to hit the front page of HN.

replies(1): >>23223518 #
8. random32840 ◴[] No.23220810{5}[source]
That's not what I meant. I mean, it's unlikely that someone high up in the company decided to snipe this app. It's probably a low-level employee following the formal rulebook a little too literally.
9. swiley ◴[] No.23220957[source]
Meh. If I left an automation roaming the streets and it ate someone's kid, I would get in trouble.

If Google's automations are broken and they harm the world, then Google needs to take responsibility and fix them, or replace them with people.

replies(2): >>23222115 #>>23222761 #
10. ◴[] No.23220962[source]
11. gitgud ◴[] No.23221080[source]
Surely Google's automated tools wouldn't automatically suspend an app that has 5 stars and over 500,000 reviews

.... surely

replies(2): >>23221210 #>>23223654 #
12. specialist ◴[] No.23221130[source]
Automated jurisprudence and adjudication seems to be suboptimal.

Isn't the elimination of human judgement one reason we all hate bureaucracies?

13. mc32 ◴[] No.23221168{5}[source]
We’re stuck between a rock and a hard place. With new diseases, or any new phenomenon, knowledge changes frequently. Oftentimes official organs are lagging, while independent voices are at the forefront and have the freshest information.

Unfortunately, both of the above can have counter-agendas (possibly with good intentions, such as calming hoarding). Even worse, you can have kooks, active disinformation, lulz, etc. There is no good answer to this.

replies(1): >>23221541 #
14. sp332 ◴[] No.23221210{3}[source]
There have been too many cases of a popular app being bought by scammers and repurposed. They can't exempt an app from being suspended just because it's popular.
replies(1): >>23221365 #
15. Dylan16807 ◴[] No.23221365{4}[source]
Exempting it from automatic suspension is not exempting it from suspension.
replies(1): >>23227872 #
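The distinction being drawn here - popularity exempting an app from *automatic* suspension but not from suspension as such - can be sketched as a simple triage rule. This is a hypothetical illustration, not Google's actual pipeline; the threshold and names are made up.

```python
# Hypothetical triage sketch: a flagged popular app is routed to human
# review rather than being auto-suspended; a flagged small app may be
# suspended automatically. The cutoff is illustrative only.
REVIEW_THRESHOLD = 100_000  # installs; not a real Google value

def triage(flagged: bool, installs: int) -> str:
    """Route a policy-flagged app to the appropriate action."""
    if not flagged:
        return "no_action"
    if installs >= REVIEW_THRESHOLD:
        return "human_review"  # review may still end in suspension
    return "auto_suspend"
```

Under a rule like this, buying a popular app doesn't buy immunity; it only buys a human in the loop before the takedown.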
16. fauigerzigerk ◴[] No.23221541{6}[source]
I think our traditional answer was good enough. Trust the people to weigh a wide range of information and opinions. Delegate more detailed analysis and decision making to political representatives and independent institutions.

It's not like there wasn't any anti-scientific rubbish, hate speech and manipulation in book shops, newspapers and on TV. Entire countries used to be run by media barons (literally or effectively).

Regarding Covid specifically, I find it pretty ironic that free speech should be restricted to government officials by an oligopoly of US corporations while the head of the US government routinely spreads anti-scientific disinformation on that very subject (and on others).

replies(1): >>23221782 #
17. cwhiz ◴[] No.23221604{3}[source]
Except for the Covid-19 podcast on the Google podcast app.
18. mc32 ◴[] No.23221782{7}[source]
I agree. Not only the US but even the WHO is very susceptible to manipulation (it’s not human transmissible, then yes it is, masks don’t work, then yes they do). We also see politics dictate policy (viz Taiwan). Then Gates alarms almost everyone with his “disease free certificates”.
19. buttersbrian ◴[] No.23222115[source]
But automating something that might eat a kid versus inadvertently blocking access to an app are very, very different things. This is why no one is tackling a nanny bot --- the repercussions.

Agreed on the last part.

replies(1): >>23226998 #
20. vageli ◴[] No.23222515{5}[source]
> >...but either way this is unlikely to be company policy

> Perhaps you haven't seen the article because it's behind an Apple News link. There's a screenshot of a message stating company policy as follows:

> "Pursuant to Section 8.3 of the Developer Agreement and the Enforcement policy, apps referencing Covid-19, or related terms, in any form will only be approved for distribution on Google Play if they are published, commissioned or authorized by official government entities or public health organizations"

How does their own browser not run afoul of this policy?

replies(2): >>23223577 #>>23224296 #
21. carapace ◴[] No.23222761[source]
Uber killed Elaine Herzberg and didn't get in trouble.
22. Spivak ◴[] No.23223518[source]
This isn't as weird as it seems in this new weird world. The "most official" support channel for Spotify is their Twitter handle while their email form is the secondary option.

Letting internet outrage drive the support queue is oddly pragmatic.

replies(1): >>23225002 #
23. Spivak ◴[] No.23223577{6}[source]
Because policies are written and interpreted by humans that have the common sense to understand that there is no connection between the browser app and the content it displays.

This entire thread is a discussion about Google's automatic tool flagging a podcast browser erroneously. If Google's response was to uphold the suspension then there would be an actual story, but they won't, and so it isn't.

24. Spivak ◴[] No.23223654{3}[source]
That kind of "popularity content" policy will never work, because then scammers will just buy apps with a good reputation. This describes the entire browser extension marketplace: any extension with a good reputation and a broad permission set will sell for a good bit of money.
replies(1): >>23240115 #
25. fauigerzigerk ◴[] No.23224296{6}[source]
Podcast Addict builds an index of podcasts that it makes available to users whereas Chrome does not influence in any way what content users may want to view.

But your question is of course very apt when it comes to the Google Search app or Google's own podcast app.

There used to be this idea (a good idea in my view) that building a search index is a neutral activity that does not come with any editorial responsibility for the content.

Google used to fight for that idea but unfortunately lawmakers (and I think the majority of the population) have very firmly taken the opposite view.

I think that's what's ultimately at the core of this defensive "when in doubt, ban it!" attitude that was built into automatic content filtering tools and hammered into the heads of reviewers.

There are still gaps - the most glaring one being Google Search - but I think Google has largely given up that struggle in favour of avoiding billions in fines.

26. X6S1x6Okd1st ◴[] No.23225002{3}[source]
It's not pragmatic, it's just the bare minimum. It turns out that if you want good support, you have to scale your support to the number of users.
27. swiley ◴[] No.23226998{3}[source]
Ok maybe an automation that blocked a parking lot is a better analogy.
28. Technetium ◴[] No.23227872{5}[source]
I would rather apps be automatically suspended and the install marked as potentially dangerous than have my device goatse'd to a malicious developer for any amount of time.
replies(1): >>23228759 #
29. Dylan16807 ◴[] No.23228759{6}[source]
That can be accommodated. Automatic suspension that lasts 20-30 minutes while someone on the team looks into the case and makes a human judgement.

But a full suspension, on a popular app, without rapid human review? That shouldn't happen.

Also, this wasn't up to the "install marked as dangerous" level. They just prevented new installs. In a situation like that, there's no need to act instantly.

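The compromise described above - an immediate but provisional suspension that lapses unless a human reviewer upholds it within a short window - can be sketched as follows. All names and the 30-minute window are illustrative assumptions, not a description of any real Play Store mechanism.

```python
from datetime import datetime, timedelta

# A provisional suspension takes effect immediately but auto-expires
# unless a human reviewer upholds it within the review window.
PROVISIONAL_WINDOW = timedelta(minutes=30)  # illustrative value

class ProvisionalSuspension:
    def __init__(self, app_id: str, flagged_at: datetime):
        self.app_id = app_id
        self.flagged_at = flagged_at
        self.upheld = None  # None means human review is still pending

    def review(self, uphold: bool) -> None:
        """Record the human reviewer's decision."""
        self.upheld = uphold

    def is_active(self, now: datetime) -> bool:
        """Is the app currently suspended?"""
        if self.upheld is not None:
            return self.upheld
        # Pending review: the suspension only holds inside the window.
        return now - self.flagged_at <= PROVISIONAL_WINDOW
```

This gives the automated system its fast reaction to genuinely dangerous apps while guaranteeing that an erroneous flag on a popular app resolves itself in minutes, not days.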
30. saagarjha ◴[] No.23240115{4}[source]
That sounds like a spot where a human can step in?