582 points SweetSoftPillow | 3 comments
jraph No.45668493
Browsers have no way to determine which code or cookies are used for tracking and which aren't, and if websites are not targeted by such a law, they have no incentive to tell browsers "this is for tracking, and this one isn't".

The best we have are the heuristics content blockers currently use. But heuristics are not good enough for complying with such laws, because there's no guarantee they work in 100% of cases.
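
To make "heuristics" concrete, the core of a content blocker is roughly this kind of pattern matching (a sketch; the patterns are made up, in the spirit of lists like EasyPrivacy):

    # Sketch of a filter-list heuristic; the patterns are illustrative,
    # not taken from a real list such as EasyPrivacy.
    import re

    FILTER_PATTERNS = [
        re.compile(r"//[^/]*analytics"),   # hostnames containing "analytics"
        re.compile(r"/pixel\.gif(\?|$)"),  # classic 1x1 tracking pixels
        re.compile(r"[?&]utm_[a-z]+="),    # campaign parameters in the query
    ]

    def looks_like_tracking(url: str) -> bool:
        # A match means "probably tracking": a guess, not a guarantee.
        return any(p.search(url) for p in FILTER_PATTERNS)

    print(looks_like_tracking("https://analytics.example/collect"))  # True
    print(looks_like_tracking("https://example.com/a.js"))           # False,
    # yet nothing stops example.com/a.js from doing all the tracking itself.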

It follows that such laws can't target only browsers while leaving websites untouched.

replies(1): >>45668650 #
skeezyjefferson No.45668650
Wasn't this a benefit of the semantic web we were pushing for? Standardized tags exactly for stuff like this? Just another example of the mess that web dev is - trying to coerce a markup language into a fully-fledged programming language.

OP has a nice idea, but he's short on technical details, which in this case is where the devil resides.

replies(1): >>45668795 #
1. jraph No.45668795
As much as I like the semantic web, you can embed tracking parameters in images and links placed in a perfectly semantic HTML structure :-)
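
For instance (the URLs and identifier are made up), a page can be impeccably semantic and still tag every visitor:

    # Sketch: perfectly semantic HTML that still tracks.
    # The uid value and URLs are made up for illustration.
    user_id = "u-12345"  # assigned server-side, unique per visitor

    html = f"""
    <article>
      <h1>An ordinary post</h1>
      <p>Read the <a href="https://example.com/next?uid={user_id}">next part</a>.</p>
      <img src="https://example.com/hero.jpg?uid={user_id}" alt="A hero image">
    </article>
    """
    # <article>, <h1>, <p>, <img alt="...">: all semantically correct,
    # and the uid parameter does the tracking regardless.
    print(html)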

I think we need strong privacy laws, to remove the incentive to track, or both; I don't see a technical way around it.

replies(1): >>45671018 #
2. skeezyjefferson No.45671018
Is there a standard for those tracking parameters?
replies(1): >>45671451 #
3. jraph No.45671451
There are some usual suspects like the utm_* parameters, but a website could be using whatever it wants.
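
The best a URL cleaner can do is strip the conventional ones, roughly like this (the set below is partial and pure convention, nothing normative):

    # Sketch: remove well-known campaign parameters from a URL.
    # The set is conventional and incomplete; sites can rename at will.
    from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

    KNOWN_TRACKING = {"utm_source", "utm_medium", "utm_campaign",
                      "utm_term", "utm_content", "gclid", "fbclid"}

    def strip_tracking(url: str) -> str:
        parts = urlsplit(url)
        kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
                if k not in KNOWN_TRACKING]
        return urlunsplit(parts._replace(query=urlencode(kept)))

    print(strip_tracking("https://example.com/p?id=7&utm_source=news&gclid=x"))
    # https://example.com/p?id=7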

Actually, you don't even need parameters to track: you could just use the requester's IP and, for instance, do some IP geolocation.
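
Server-side, that needs nothing from the page at all. A sketch, with geolocate() as a stub standing in for a real lookup against something like MaxMind's GeoLite2 database:

    # Sketch: tracking with no cookies and no URL parameters at all.
    # Everything here comes from the TCP connection and HTTP headers.
    from datetime import datetime, timezone

    def geolocate(ip: str) -> str:
        # Stub: a real implementation would query a GeoIP database
        # (e.g. MaxMind GeoLite2) for country/city.
        return "unknown"

    def log_request(ip: str, path: str, user_agent: str) -> None:
        # The browser sent nothing "extra"; the connection itself suffices.
        print(datetime.now(timezone.utc).isoformat(), ip, geolocate(ip),
              path, user_agent)

    log_request("203.0.113.42", "/article", "Mozilla/5.0")  # documentation IP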