
1957 points apokryptein | 2 comments
theptip ◴[] No.42910331[source]
> Why do they need to know my screen brightness, memory amount, current volume and if I'm wearing headphones?

This is clearly about adding entropy to the fingerprint so users can be de-anonymized across apps, rather than adding specificity to ad bids.
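
A back-of-envelope sketch of why those readings matter (the per-signal bit counts and field names below are my own illustrative guesses, not measured values): individually each reading is nearly useless, but a stable combination of them narrows the anonymity set fast.

    import hashlib

    # Illustrative per-signal entropy estimates in bits (assumed, not measured).
    signal_bits = {
        "screen_brightness": 4.0,   # ~16 observable brightness buckets
        "memory_gb": 3.0,           # a handful of common RAM sizes
        "volume_level": 4.0,
        "headphones_plugged": 1.0,
    }
    total = sum(signal_bits.values())
    print(f"~{total:.0f} bits combined -> roughly 1 in {int(2 ** total):,} devices")

    # Hashing the readings into a composite gives a value that can soft-link the
    # same device across apps, to the extent the readings are stable.
    def fingerprint(signals: dict) -> str:
        canonical = "|".join(f"{k}={signals[k]}" for k in sorted(signals))
        return hashlib.sha256(canonical.encode()).hexdigest()[:16]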

replies(9): >>42910433 #>>42910476 #>>42910497 #>>42910702 #>>42914420 #>>42915971 #>>42916080 #>>42919652 #>>42937487 #
jmward01 ◴[] No.42910476[source]
It would be amazing if you could build and send fake profiles of this information to create fake browser fingerprints and help track the trackers. Similarly, creating a lot of random noise here may help hide the true signal, or at least make their job a lot harder.
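
Roughly what that could look like, sketched in Python; the field names and value ranges here are invented for illustration, not taken from any real SDK:

    import random

    # Hypothetical spoofing layer: report plausible fake readings instead of real
    # ones, drawing fresh values each session so successive reports don't correlate.
    def fake_signals() -> dict:
        return {
            "screen_brightness": round(random.uniform(0.2, 1.0), 2),
            "memory_gb": random.choice([3, 4, 6, 8, 12]),
            "volume_level": round(random.uniform(0.0, 1.0), 2),
            "headphones_plugged": random.random() < 0.3,
        }

    print(fake_signals())  # a different "device" every call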
replies(1): >>42910540 #
nickburns ◴[] No.42910540[source]
Unfortunately, fingerprinting prevention/resistance tactics become a readily identifiable signal unto themselves, i.e., the 'random noise' itself becomes fingerprintable if not widely utilized.

Everyone would need to be generating the same 'random noise' for any such tactics to be truly effective.
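
The same point in bits, with made-up population shares: the rarer a trait, including 'uses a spoofing tool', the more identifying it is all by itself.

    import math

    # Surprisal in bits: -log2(probability). The probabilities below are assumed.
    def bits(p: float) -> float:
        return -math.log2(p)

    print(bits(0.001))                          # ~10 bits from a disguise seen in 0.1% of traffic
    print(bits(0.5) + bits(0.25) + bits(0.1))   # ~6.3 bits from three fairly common traits combined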

replies(2): >>42910643 #>>42913074 #
jmward01 ◴[] No.42910643[source]
A sufficient number of people would need to, not everyone. And if I were the only one, tracking companies wouldn't adjust for just me. Basically, if this were to catch on, ad trackers wouldn't adjust until there was enough traffic for it to work. Also, that doesn't negate the ability to use this to create fake credentials that aid in tracking ads back to their source.
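
The 'fake credentials' half of that is essentially a canary scheme. A minimal sketch with invented names: each tracker gets its own fake identity, so any ad, spam, or broker record that later surfaces the token points back at whoever leaked or sold it.

    import secrets

    # Hypothetical canary identities, one per tracker or app.
    def canary_profile(tracker: str) -> dict:
        token = secrets.token_hex(4)
        return {
            "tracker": tracker,
            "fake_email": f"user+{token}@example.com",
            "fake_device_name": f"Phone-{token[:4]}",
        }

    canaries = {t: canary_profile(t) for t in ["adnet_a", "adnet_b"]}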
replies(1): >>42912335 #
sebastiennight ◴[] No.42912335[source]
They don't need to adjust.

Here's a real-life example: You show up alone at the airport with a full-face mask and gray coveralls. You are perfectly hidden. But you are the only such hidden person, and there is still old cam footage of you in the airport parking lot, putting on the clothes. The surveillance team can let you act anonymous all you want. They still know who you are, because your disguise IS the unique fingerprint.

Now the scenario you're shooting for here is:

10 people are now walking around the airport in full-face masks and gray coveralls. You think, "well now they DO NOT know if it's ME, or some terrorist, or some random other guy from HN!"

But really, they still have this super-specific fingerprint (still fewer than 1 in a million people wear this disguise), and all they need is ONE identifying characteristic (you're taller than the other masked people, maybe) to know who's who.

They didn't need to adjust their system one bit.
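
The same failure mode, sketched with invented fields: randomize most readings, leave one stable, and two 'anonymous' sessions still join, with the rare spoofing marker shrinking the candidate pool before the stable field is even consulted.

    # Two "anonymous" sessions: most fields randomized, one left stable.
    session_a = {"spoofing": True, "screen": "1170x2532", "brightness": 0.41}
    session_b = {"spoofing": True, "screen": "1170x2532", "brightness": 0.87}

    def likely_same_user(a: dict, b: dict) -> bool:
        # Being one of the few spoofing users narrows the set; any surviving
        # stable attribute (here, screen size) does the rest.
        return a["spoofing"] and b["spoofing"] and a["screen"] == b["screen"]

    print(likely_same_user(session_a, session_b))  # True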

replies(3): >>42912878 #>>42913991 #>>42914311 #
1. theptip ◴[] No.42913991[source]
I think this is a slightly different case, no? If the ad network is using a very high-precision variable to soft-link anonymized accounts, then randomizing the values between apps should break that.

Your analogy applies more to things like trying to anonymize your traffic with Tor, where using such an anonymizer flags your IP as doing something weird relative to other users. I'm not convinced simply fuzzing the values would be detectable, assuming you pick values that other real users could pick.
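
That assumption is the crux. One way to honor it, sketched with made-up population frequencies: draw the fakes from realistic distributions rather than uniform noise, so the spoofed combination isn't itself an outlier.

    import random

    # Assumed population frequencies, invented for illustration.
    MEMORY_GB = ([3, 4, 6, 8, 12], [0.10, 0.30, 0.30, 0.25, 0.05])
    HEADPHONES = ([True, False], [0.25, 0.75])

    def plausible_fake() -> dict:
        return {
            "memory_gb": random.choices(*MEMORY_GB)[0],
            "headphones_plugged": random.choices(*HEADPHONES)[0],
            "volume_level": round(random.betavariate(2, 2), 2),  # clusters mid-range, like real users
        }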

replies(1): >>42918852 #
2. amanda99 ◴[] No.42918852[source]
I'm sure the ad networks do a lot more than use high-precision variables for soft-linking.

These are professional networks with a ton of capital thrown behind them. They have pretty decent algorithms, heuristics, etc., and you don't make money (compared to the other data-correlation teams) if you do simple, dumb stuff. I'm certain they take into account users trying to be privacy-conscious, if only to increase their match rates and stay competitive.
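
For a sense of what 'more than high-precision variables' can mean in practice, here is a toy probabilistic-linkage score with invented weights: no single field is decisive, agreement on many weak signals adds up, and randomizing one field barely moves the total.

    # Hypothetical match weights in bits: how surprising agreement is by chance.
    WEIGHTS = {"timezone": 3.0, "language": 2.0, "screen": 5.0,
               "ip_prefix": 8.0, "memory_gb": 3.0}

    def match_score(a: dict, b: dict) -> float:
        return sum(w for k, w in WEIGHTS.items() if a.get(k) == b.get(k))

    a = {"timezone": "UTC-5", "language": "en-US", "screen": "1170x2532",
         "ip_prefix": "203.0.113", "memory_gb": 6}
    b = dict(a, language="en-GB")    # fuzz one field and...
    print(match_score(a, b))         # 19.0 of a possible 21.0: still a confident match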