
347 points by iamnothere | 1 comment

Also: We built a resource hub to fight back against age verification https://www.eff.org/deeplinks/2025/12/age-verification-comin...
rlpb
I'd be OK with an "I am a child" header mandated by law and respected by service providers (e.g. "adult sites" must not let a client that sets the header proceed). On the client side, mandate that consumer devices that might reasonably be expected to be used by children (every smartphone, tablet, smart TV, etc.) have parental controls that set the header. Leave it to parents to set the controls. Perhaps even hold parents culpable for not doing so, as a minimum supervision requirement, just as we hold parents culpable for neglecting their children in other ways.

Forcing providers to divine the age of the user, or requiring an adult's identity to verify that they are not a child, is backwards, for all the reasons pointed out. But that's not the only way to "protect the children". Relying on a very minimal level of parental supervision of device use should be fine; we already expect far more than that in non-technology areas.
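Server-side, the check could be as simple as refusing restricted paths whenever the client declares the header. Here's a minimal Python sketch, assuming a made-up header name ("X-Client-Is-Minor") and a provider-defined set of adult-only paths; neither exists as a standard today:

```python
from wsgiref.simple_server import make_server

# Paths the provider itself classifies as adult-only (illustrative).
RESTRICTED_PREFIXES = ("/adult/",)

def block_minors(app):
    """WSGI middleware: refuse restricted paths when the client declares itself a minor."""
    def wrapper(environ, start_response):
        # "X-Client-Is-Minor" is a hypothetical header, set by device parental controls.
        is_minor = environ.get("HTTP_X_CLIENT_IS_MINOR") == "1"
        path = environ.get("PATH_INFO", "/")
        if is_minor and path.startswith(RESTRICTED_PREFIXES):
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Not available on child-configured devices.\n"]
        return app(environ, start_response)
    return wrapper

def demo_app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"ok\n"]

if __name__ == "__main__":
    make_server("127.0.0.1", 8000, block_minors(demo_app)).serve_forever()
```

The burden on providers is one if-statement; all the age judgment stays with the parents who configure the device.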

iamnothere
If we must do something like this, I think a good solution would be an optional server header describing the types of objectionable content that may be present (including “none”). Browsers on child devices from mainstream vendors would refuse to display any resource lacking the header (i.e. “unrated”), and would block any resource whose categories parents deem age-inappropriate, with strict but fair defaults that can be overridden. Adult browsers would be unaffected. Legislatures could then craft laws against deliberately miscategorized sites, since mislabeling would amount to intentionally targeting kids with adult content.

There is no perfect solution that avoids destroying the internet, but this would be a pretty good one: it shelters kids from accidentally entering adult areas and doesn't harm adult internet users. It also avoids sending out any information about the user's age, since filtering happens on the client device.
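On the client side, the browser (or a filtering proxy on the device) would just check the rating before rendering. A minimal Python sketch, assuming a hypothetical "Content-Rating" response header carrying comma-separated category names; the header name and categories are illustrative, not an existing standard:

```python
from urllib.request import urlopen

# Categories the parent has chosen to block on this device (illustrative).
BLOCKED_CATEGORIES = {"nudity", "violence", "gambling"}

def allowed_for_child(url: str) -> bool:
    """Allow a resource only if the server rates it and no blocked category applies."""
    with urlopen(url) as resp:
        # Hypothetical header, e.g. "none" or "violence, gambling".
        rating = resp.headers.get("Content-Rating")
    if rating is None:
        return False  # unrated content is refused by default on a child device
    categories = {c.strip().lower() for c in rating.split(",")}
    return categories == {"none"} or not (categories & BLOCKED_CATEGORIES)

if __name__ == "__main__":
    # example.com sends no such header, so this prints False under the default policy.
    print(allowed_for_child("https://example.com/"))
```

Unrated resources are refused by default here, matching the "strict but fair defaults" above; parents could relax the blocked-category set per device, and adult browsers would simply never run this check.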

ars
This exists: https://en.wikipedia.org/wiki/Platform_for_Internet_Content_...

It was derided as a "system for mass censorship" and got shot down. In hindsight that was a mistake, and it should have been implemented; it was completely voluntary on the user's side.

iamnothere
It’s close, but I see why it failed. There’s no need to include licensing/rights management in there. Also, this was before pervasive HTTPS, so governments and ISPs could have snooped on the info and possibly blocked it. If it were limited to just content ratings, and kept private behind SSL, it wouldn’t be a bad approach.

But this also needs some kind of guarantee that lawmakers won’t try to force it on FOSS projects that want to operate outside the system, and that companies like Google won’t use embrace-extend-extinguish tactics to gradually expand the header into other areas and eventually cut off consenting adults who want to stay outside it. I’m not sure it is possible to get those guarantees.