
347 points by iamnothere | 38 comments

Also: We built a resource hub to fight back against age verification https://www.eff.org/deeplinks/2025/12/age-verification-comin...
1. rlpb ◴[] No.46224574[source]
I'd be OK with an "I am a child" header mandated by law to be respected by service providers (e.g., "adult sites" must not permit a client setting the header to proceed). On the client side, mandate that consumer devices that might reasonably be expected to be used by children (every smartphone, tablet, smart TV, etc.) have parental controls that set the header. Leave it to parents to set the controls. Perhaps even hold parents culpable for not doing so, as a minimum supervision requirement, just as one may hold parents culpable for neglecting their children in other ways.

Forcing providers to divine the age of the user, or requiring an adult's identity to verify that they are not a child, is backwards, for all the reasons pointed out. But that's not the only way to "protect the children". Relying on a very minimal level of parental supervision of device use should be fine; we already expect far more than that in non-technology areas.
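To make the shape of this concrete, here is a minimal sketch of the server side of such a scheme. The header name ("X-Child") is hypothetical; no standard defines one today. An "adult site" under this proposal would simply refuse any request carrying the flag:

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class AdultSiteHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Parental controls on the child's device would add this header
            # to every outgoing request; the header name is made up here.
            if self.headers.get("X-Child") == "1":
                self.send_response(403)
                self.end_headers()
                self.wfile.write(b"Not available to child-flagged clients.\n")
                return
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"Age-restricted content would go here.\n")

    if __name__ == "__main__":
        HTTPServer(("localhost", 8080), AdultSiteHandler).serve_forever()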

replies(8): >>46224965 #>>46225003 #>>46225048 #>>46225061 #>>46225433 #>>46236425 #>>46236866 #>>46241419 #
2. ProjectArcturis ◴[] No.46224965[source]
I'm not sure that making parents legally culpable for their kids being smart enough to download a new browser is LESS government intrusion.
replies(3): >>46225123 #>>46225538 #>>46238266 #
3. hypeatei ◴[] No.46225003[source]
Okay, so the HTTP header idea seems like it would have two issues:

1) Given that it just says you're a "child", how does that work across jurisdictions where the adult age may not be 18?

2) It seems like it could be abused by fingerprinters, ad services, and even hostile websites that want to show inappropriate content to children.

replies(2): >>46225057 #>>46225393 #
4. bena ◴[] No.46225048[source]
I am a Russian proxy site, I make requests for you without the header. I serve you the content because I don't care about following American laws.

Alternatively, just use an older browser that doesn't serve the header.

If anything, you'd want the reverse. A header that serves as a disclaimer saying "I'm an adult, you can serve me anything" and then the host would only serve if the browser sends that header. And you'd have to turn it on through the settings/parental controls.

Now, this doesn't handle the proxy situation. You could still have a proxy site that served the request with the header for you, but there's not much you can do about that regardless.
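A rough sketch of that inversion, again with a made-up header name ("X-Adult-Content-OK"): the check fails closed, so an old browser or a stripped request gets the restricted behaviour by default.

    def may_serve_restricted_content(headers: dict) -> bool:
        # Only serve if the client explicitly opted in via settings or
        # parental controls; absence of the header means "refuse".
        return headers.get("X-Adult-Content-OK") == "1"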

replies(1): >>46225354 #
5. phantasmish ◴[] No.46225057[source]
> 1) Given that it just says you're a "child", how does that work across jurisdictions where the adult age may not be 18?

It's a client-side flag saying "treat this request as coming from a child (whatever that means to you)". I don't follow what the jurisdiction concern is.

[EDIT] Oooooh you mean if a child is legally 18 where the server is, but 16 where the client is. But the header could be un-set for a 5-year-old, too, so I don't think that much matters. The idea would be to empower parents to set a policy that flags requests from their kids as coming from a child. If they fail to do that, I suppose that'd be on them.

replies(1): >>46225109 #
6. pembrook ◴[] No.46225061[source]
> Perhaps even hold parents culpable for not doing so, as a minimum supervision requirement

Even the idea of prosecuting parents for allowing their child to access 'information,' no matter what that information is, just sounds like asking for 1984-style insanity.

A good rule of thumb when creating laws: imagine someone with opposite political views from yours applying said law at their discretion (because it will happen at some point!).

Another good question to ask yourself: is this really a severe enough problem that government needs to apply authoritarian control via its monopoly on violence to try to solve? Or is it just something I'm abstractly worried about because some pseudo-intellectuals are doing media tours to try to sell books by inciting moral panic?

As with every generation who is constantly worried about what "kids these days" are up to, it's highly highly likely the kids will be fine.

The worrying is a good instinct, but when it becomes an irrational media hysteria (the phase we're in for the millennial generation who've had kids and are becoming their parents), it creates perverse incentives and leads to dumb outcomes.

The truth is the young are more adaptable than the old. It's the adults we need to worry about.

replies(1): >>46225439 #
7. hypeatei ◴[] No.46225109{3}[source]
The concern is that websites have no way to tell the actual age in this scenario, so you'd potentially be inconveniencing and/or blocking legitimate users (according to the server jurisdiction's rules).

It doesn't seem sufficient, and would probably lead to age verification laws anyway.

replies(1): >>46225245 #
8. e40 ◴[] No.46225123[source]
It could be added at the router? The child's computer could be identified and this header added, in a MITM situation... but, maybe that would be easy to defeat, by replacing the cert on the client? Not my area of expertise... really just asking...
9. embedding-shape ◴[] No.46225245{4}[source]
No, it doesn't seem like that would be a problem.

Say you're a parent with a child, living in country A where someone becomes an adult when they're 18. Once the child is 18, they'll use their own devices/browsers/whatever, and the flag is no longer set. But before that, the flag is set.

Now it doesn't matter that the age of becoming an adult is 15 in country B and 30 in country C. Because the flag is set locally on the client's device, all a service needs to do is block requests carrying the flag and assume it's set faithfully. Parents in country B or country C then set/unset the flag on their devices when it's appropriate.

No need to tell actual ages, and a way for services to say "this is not for children", and parents are still responsible for their own children. Sounds actually pretty OK to me.

replies(1): >>46227341 #
10. rlpb ◴[] No.46225354[source]
> I am a Russian proxy site, I make requests for you without the header. I serve you the content because I don't care about following American laws.

That's no different to a law mandating identification-based age verification though. A site in a different jurisdiction can ignore that just the same.

replies(1): >>46232037 #
11. rlpb ◴[] No.46225393[source]
> 1) Given that it just says you're a "child", how does that work across jurisdictions where the adult age may not be 18?

So namespace it then. "I'm a child as defined by the $country_code government". It's no more of a challenge than what identity-based age verification already needs to do.

> 2) It seems like it could be abused by fingerprinters, ad services, and even hostile websites that want to show inappropriate content to children.

This is still strictly better than identity-based age verification. Hostile or illegal sites can already do this anyway. Adding a single boolean flag which a large proportion of users are expected to have set isn't adding any significant fingerprinting information.
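A hypothetical namespaced form might look like "X-Child: 1; jurisdiction=DE" (header name and syntax invented for illustration), letting the server apply whichever legal definition the client asserts:

    def parse_child_flag(value: str) -> tuple[bool, str | None]:
        # e.g. "1; jurisdiction=DE" -> (True, "DE")
        parts = [p.strip() for p in value.split(";")]
        is_child = parts[0] == "1"
        jurisdiction = None
        for part in parts[1:]:
            if part.startswith("jurisdiction="):
                jurisdiction = part.removeprefix("jurisdiction=").upper()
        return is_child, jurisdiction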

12. iamnothere ◴[] No.46225433[source]
If we must do something like this, I think a good solution would be an optional server header that describes the types of objectionable content that may be present (including “none”). Browsers on child devices from mainstream vendors would refuse to display any “unrated” resources without the header, and would block any resources that parents deem age-inappropriate, with strict but fair default settings that can be overridden. Adult browsers would be unaffected. Legislatures could attempt to craft laws against intentionally miscategorized sites, as doing this would be intentionally targeting kids with adult content.

There is no perfect solution that avoids destroying the internet, but this would be a pretty good solution that shelters kids from accidentally entering adult areas, and it doesn’t harm adult internet users. It also avoids sending out information about the user’s age since filtering happens on the client device.
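One way the client-side filtering could look under this proposal, with an invented response header name ("Content-Rating") and invented category names; only the child-mode browser changes behaviour:

    # Categories the parent has chosen to block; names are illustrative.
    BLOCKED_CATEGORIES = {"nudity", "violence", "gambling"}

    def allow_resource(response_headers: dict, child_mode: bool) -> bool:
        if not child_mode:
            return True                     # adult browsers are unaffected
        rating = response_headers.get("Content-Rating")
        if rating is None:
            return False                    # unrated resources are refused
        categories = {c.strip().lower() for c in rating.split(",")}
        if categories == {"none"}:
            return True
        return not (categories & BLOCKED_CATEGORIES)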

replies(1): >>46236031 #
13. rlpb ◴[] No.46225439[source]
> Even the idea of prosecuting parents for allowing their child to access 'information,' no matter what that information is, just sounds like asking for 1984-style insanity.

This assumes an absolutist approach to enforcement, which I did not advocate and is not a fundamental part of my proposed solution. In any case, the law already has to make a subjective decision in non-technology areas. It would be no different here. Courts would be able to consider the surrounding context, and over time set precedents for what does and does not cross the bar in a way that society considers acceptable.

replies(2): >>46225532 #>>46237553 #
14. pembrook ◴[] No.46225532{3}[source]
But what if we didn't collectively spend $billions of dollars and hundreds of thousands of hours battling with money, lobbyists, lawyers, judges and political campaigns over what is largely a moral panic?

What could humanity do instead with all that time and resources?

I know the US is a nation built by lawyers, for lawyers, but this is both its best strength and worst weakness. Sometimes it's in everyone's best interest to accept the additional risks individually, as opposed to bubble-wrapping everything in legislation and expanding the scope of the corrupt lawyer-industrial complex.

Maybe the lawyers could use the extra time to fix something actually important, like healthcare or education, instead.

15. rlpb ◴[] No.46225538[source]
There's no reason to hold the parents culpable. It would be up to the device manufacturer to ensure that this isn't possible on a system that has parental controls enabled. This is already a solved problem - see how MDM solutions do it, and see Apple's ban on alternative browsers.

It's not even necessary to block parents from giving their children Linux desktops or whatever. It'll largely solve the problem if parents are merely expected to enable parental controls on devices that have the capability.

16. addaon ◴[] No.46227341{5}[source]
Except that if you're in country B, which has a law that says "you may not make information available to children that discloses that Santa Claus is made up," and the age of becoming an adult in your country is 18 -- knowing that a person accessing your site from country A is an adult in country A (which means, say, ≥ 16) is not sufficient to comply with the law.
replies(1): >>46229133 #
17. quailfarmer ◴[] No.46229133{6}[source]
I’m not sure why the age of majority in the region of the server would be relevant. The user is not traveling to that region, the laws protecting them should be the laws in their own region.
replies(1): >>46234942 #
18. bena ◴[] No.46232037{3}[source]
Right. This isn't something we can completely solve with legislation or technology.
19. addaon ◴[] No.46234942{7}[source]
> why

> should

I don't know if "should" is intended as a moral statement or a regulatory statement, but it's not at all unusual for server operators to need to comply with laws in the country in which they are operating…

20. ars ◴[] No.46236031[source]
This exists: https://en.wikipedia.org/wiki/Platform_for_Internet_Content_...

It was derided as a "system for mass censorship" and got shot down. In hindsight that was a mistake, and it should have been implemented; it was completely voluntary on the user's part.

replies(1): >>46238061 #
21. taeric ◴[] No.46236425[source]
My only gripe here is the idea of "perhaps hold the parents culpable." I'm not opposed to the idea, but what sucks is that we are ultimately all paying the cost of it going wrong. The idea that we can shunt that cost onto a few irresponsible people is just demonstrably not the case.

Worse, it leads to situations where society seems to want to be flat-out kid-free in many ways, with families reportedly afraid to let their kids walk to and from school unsupervised.

I don't know an answer, mind. So this is where I have a gripe with no real answer. :(

replies(3): >>46236539 #>>46237163 #>>46239935 #
22. awesome_dude ◴[] No.46236539[source]
Add to that, clearly those "bad parents" are the result of bad parenting in the first place, so really it's the grandparents that are to blame...

Wait, those grandparents also had bad models to work with, so really it's the great-grandparents that were to blame...

No, wait, it was the society that they grew up in that encouraged poor behaviour toward them, and forced them to react by taking on toxic behaviours. We all should pay because we all actively contribute to the world around us, and that includes being silent when we see bad things happening.

23. Bender ◴[] No.46236866[source]
A server header already exists to say something is adult, and it could be used for user-generated content as well. [1] It just needs legislation and an afternoon from interns at assorted companies. It's not perfect, nothing is, but it could easily trigger existing parental controls and parental controls that could be added back into user agents. No third parties required. I think I've beaten this horse into dust [2], so I should just hire kvetchers to politely remind Congress at this point.

[1] - https://news.ycombinator.com/item?id=46152074

[2] - https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...
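For illustration, a rough sketch of how a user agent or filtering proxy might check for the existing RTA label ("RTA-5042-1996-1400-1577-RTA"); exactly where the label appears (an HTTP response header vs. an HTML meta tag) varies by site, so this checks both crudely:

    import urllib.request

    RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"

    def is_rta_labelled(url: str) -> bool:
        with urllib.request.urlopen(url, timeout=10) as resp:
            headers = {k.lower(): v for k, v in resp.getheaders()}
            if RTA_LABEL in headers.get("rating", ""):
                return True
            # Fall back to scanning the start of the body for the meta tag.
            body = resp.read(65536).decode("utf-8", errors="replace")
            return RTA_LABEL in body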

replies(1): >>46237209 #
24. no_wizard ◴[] No.46237163[source]
>Worse, it leads to situations where society seems to want to flat out be kid free in many ways. With families reportedly afraid to let their kids walk to and from school unsupervised.

I'm not seeing the correlation / causation here.

replies(2): >>46237344 #>>46237526 #
25. no_wizard ◴[] No.46237209[source]
I like the first part of the idea, which is the header. Heck, even enable it by default. As long as the tracking of the toggle isn't a thing, it's a perfect compromise. While we're at it, respecting Do Not Track headers would also be nice.

This completely leaves it up to the families / parents to control, and gives some level of compliance to make the effort worthwhile.

There may even be a way to generate enough noise with the request to prevent any form of tracking. This sort of thing should really be isolated in that way to prevent potential abuses by data brokers via sale of the information.

replies(2): >>46237262 #>>46241686 #
26. Bender ◴[] No.46237262{3}[source]
> As long as the tracking of the toggle isn't a thing, it's a perfect compromise.

This concept does not involve any tracking if implemented as designed. The user agent detects the RTA header and triggers parental controls if enabled. Many sites already voluntarily self-label. [1] Careful how far you drill down, as these sites are NSFW and some may be malicious.

[1] - https://www.shodan.io/search?query=RTA-5042-1996-1400-1577-R...

27. saltcured ◴[] No.46237344{3}[source]
Not sure, but I think the earlier post is implying a (false) dichotomy between:

A. "Your kid is not my problem"

B. "Your kid is everyone's problem"

replies(1): >>46237767 #
28. taeric ◴[] No.46237526{3}[source]
I was referencing the towns that have called the cops because there were some unsupervised kids in a park. I comfort myself by saying this isn't nearly as common as the fearmongers online would have you think. That there are cases where it happens still worries me.

Note that I'm not even necessarily worried about cops getting called. Quite the contrary, I am fine with the idea of cops having a more constant presence around parks and such. I do worry about people that get up in arms about how things are too unsafe for kids to be let outside. If that is the case, what can we do to make it safe?

29. raw_anon_1111 ◴[] No.46237553{3}[source]
And surprisingly when the law makes such decisions, it seems to affect little Jerome more than little Johnny.

You have way too much faith in the fairness of the court system.

30. taeric ◴[] No.46237767{4}[source]
Less the false dichotomy, and more the stickiness of each of those options. To your point (I think), those aren't the only options available, but people do seem to be attracted quite heavily to them.
31. iamnothere ◴[] No.46238061{3}[source]
It’s close, but I see why it failed. There’s no need to include licensing/rights management in there. Also this was before pervasive HTTPS, so it would have been possible for governments and ISPs to snoop the info and possibly block it. If it could be limited to just content ratings, and kept private behind SSL, this isn’t a bad approach.

But this also needs some kind of guarantee that lawmakers won’t try to force it on FOSS projects that want to operate outside the system. And that companies like Google won’t use EEE to gradually expand this header into other areas and eventually cut off consenting adults who want to operate outside this system. I’m not sure if it is possible to get those guarantees.

32. BobaFloutist ◴[] No.46238266[source]
I think the idea is that the manufacturers are culpable for making a parental restriction mode that's set-and-forget and not easily thwarted from inside the mode and parents are culpable for declining to set it.

Which I still don't love, but is at least more fair.

33. Gormo ◴[] No.46239935[source]
The presumption that it's not a matter of the parents' prerogative to decide whether the child's access should be restricted -- and treating the parents as accountable to someone else's standards of what is or is not appropriate for their own children -- is itself objectionable.

What content is appropriate for children is properly up to their parents themselves, not to the government or to some nebulous concept of "society". If parents choose not to set such a flag on their children's devices, then that means they're choosing to allow their children to access content without restriction, and that's what defines what is OK for their children to access.

34. Epa095 ◴[] No.46241419[source]
It would be possible to make a website which proxies other sites, but strips this header, right (maybe with some added ads)?

If so I would expect such sites to appear, and the only way to secure a child device is to have a whitelist of webpages (to avoid the proxies), putting us back close to where we are today.

replies(1): >>46241628 #
35. cocoto ◴[] No.46241628[source]
Such sites would be illegal if they didn't pass the header along to the source website, and they would be banned just like adult websites that set this header incorrectly. It's not a real problem.
replies(1): >>46241810 #
36. yardstick ◴[] No.46241686{3}[source]
They tried this years ago with ICRA and others. Long gone, but it worked by the webmaster adding metadata to self classify a site. Browsers and other agents could then allow/deny after inspecting the tags.

https://sk.sagepub.com/ency/edvol/childmedia/chpt/internet-c...

37. Epa095 ◴[] No.46241810{3}[source]
Making them illegal does not fix it. There will be an indefinite whack-a-mole game which is very hard to win without draconian control over the Internet.

The problem is that it's easy to make, easy to deploy, easy to make money on, and a single site opens up the whole Internet. It will happen even if it's illegal.

Compare this to adult webpages setting the header. They will probably be quite willing to do so, since they want to make their money legally, and there is probably little money in serving kids anyway. And even if one out of a thousand adult webpages refuses, it still only opens up that single site.

replies(1): >>46242402 #
38. imtringued ◴[] No.46242402{4}[source]
It's actually hard to tell which side you're on, but a charitable interpretation is that you're arguing that there are no perfect solutions, hence a simple, minimal, non-invasive method will probably have the same effect as a complex and invasive one. That is, both methods will add enough friction that children who don't know what they're missing won't bother, and the ones who can't do without will choose every conceivable method to get around the restrictions.

Worrying about the latter makes no sense, because they are sort of like organized crime. People still take drugs even though they are illegal.