
52 points layer8 | 1 comment | source
1. w10-1 | No.43687396 | source
This sounds pretty bland and meaningless, but is it?

tl;dr: the privacy protections seem personal, but not collective:

- For short genmoji prompts, have devices respond with deliberate false positives, so large numbers of reports are required to recover any real signal (sketched below)

- For longer writing, generate synthetic texts and match their embedding signatures against samples on opted-in devices
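
The first mechanism is basically randomized response (local differential privacy). Here's a minimal sketch of the idea, not Apple's actual protocol, and with a made-up flip probability:

    import random

    def noisy_report(saw_prompt: bool, p_flip: float = 0.25) -> bool:
        # Each device flips its true answer with probability p_flip,
        # so any single report is deniable.
        return (not saw_prompt) if random.random() < p_flip else saw_prompt

    def estimate_true_rate(reports: list[bool], p_flip: float = 0.25) -> float:
        # Invert the expected noise: E[observed] = true*(1-p) + (1-true)*p,
        # so true = (observed - p) / (1 - 2p). Only meaningful in aggregate.
        observed = sum(reports) / len(reports)
        return (observed - p_flip) / (1 - 2 * p_flip)

    # Simulate 1,000,000 devices where 1% actually used a given prompt.
    reports = [noisy_report(random.random() < 0.01) for _ in range(1_000_000)]
    print(estimate_true_rate(reports))  # ~0.01, recoverable only across many users

Each individual report is mostly noise; only the aggregate is informative, which is exactly why this protects persons but still characterizes populations.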

i.e., personal privacy is preserved, but one could likely still distinguish populations, if not industries and use cases: social media users vs. students vs. marketers, conservatives vs. progressives, etc. These categories themselves have meaning because they carry useful associations: marketers are more likely to do x, conservatives y, etc. And that information is very valuable, unless it's widely known.

No one likes being personally targeted: it's weird to get ads for something you just searched for. But it might also be problematic for society to have groups characterized, particularly to the extent that the facts are non-obvious (e.g., if marketers decide within a minute vs. developers taking days). To the extent the information is valuable, it's more valuable still if it's private and limited (i.e., it preserves the information asymmetry), which means the collectors of that information have an incentive to keep it private.

So even if Apple broadly has the best of intentions, this data collection creates a moral hazard: a valuable resource that enterprising people can tap. It adds nothing to Apple's bottom line, but it could be someone's life's work and salary.

Could it be mitigated by a commitment to publish all their conclusions? (hmm: but the analyses are often borderline insignificant) Not clear.

Bottom line for me: I'm now less worried about losing personal privacy than about technologies for characterizing and manipulating groups of consumers or voters. But it's impossible for Apple to characterize users at scale for their own quality assessment -- and thus to maintain their product excellence -- without doing exactly that.

Oy!