361 points by gloxkiqcza
torginus:
I genuinely do not understand how the idea of building a total surveillance police state, where all speech is monitored, can even be seriously considered by an allegedly pro-democracy, pro-human-rights government, much less make it into law.

Also:

Step 1: Build mass surveillance to prevent the 'bad guys' from coming into political power (it's OK, we're the good guys).

Step 2: Your political opponents capitalize on your genuinely horrific overreach and legitimize themselves in the eyes of the public as fighting against tyranny (unfortunately for you, they do have a point). They promise to dismantle the system if they come to power.

Step 3: They get elected.

Step 4: They don't dismantle the system, now the people you planned to use the system against are using it against you.

Sounds brilliant, let's do this.

IanCal:
I'm not a fan of the OSA but proponents of it will *keep winning* if you *keep misrepresenting it*.

You can, and should, argue about the effects, but the core of the OSA, and how it can be sold, is this, at several different levels:

One, most detailed.

Sites that provide user to user services have some level of duty of care to their users, like physical sites and events.

They should do risk assessments to see if their users are at risk of getting harmed, like physical sites and events.

They should implement mitigations based on those risk assessments. Not to completely remove all possibility of harm, but to lower it.

For example, sites where kids can talk to each other in private chats should have ways for kids to report adults, and moderators to review those reports. Sites where you can share pictures should check for people sharing child porn (if you have a way for a userbase to share encrypted images with each other anonymously, you're going to get child porn on there). Sites aimed at adults with public conversations, like some hobby site with no history of issues and someone checking for spam etc., don't need to do much.

You should re-check things once a year.

That's the selling point - and as much as we can argue about second-order effects (like having a list of IDs and what you've watched, overhead, etc.), those statements don't on the face of it seem objectionable.

Two, shorter.

Sites should be responsible about what they do, just like shops and other physical spaces: risk assessments, and more focus when there are kids involved.

Three, shortest.

Facebook should make sure people aren't grooming your kids.

Now, the problem with talking about "a total surveillance police state, where all speech is monitored" is: where does that fit into the explanations above? How do you explain that to even me, a highly technical, terminally online nerd who has read at least a decent chunk of the actual OFCOM guidelines?

DonaldFisk:
This isn't a new issue, and it predates the internet. There were publishers of magazines containing pornography (or anything else unsuitable for children). These were sold in shops. A publisher had to ensure that the material in the magazines was legal to print, but it wasn't their responsibility to prevent children from looking at their magazines, and it's difficult to see how that would even be possible. That was the responsibility of the people working in the shops: they had to put the magazines on the top shelf, and weren't allowed to sell them to children.

On the internet, people don't get porn videos directly from pornographic web sites, just as in the past they didn't buy porn directly from the publishers. The videos are split up into packets and transmitted through an ad hoc chain of servers until they arrive, via their ISP, on their computer. The web sites are the equivalent of the publishers, and ISPs are the equivalent of the shops. So it would make a lot more sense to apply controls at the ISPs. And British ISPs are within the UK's jurisdiction.

And before anyone points out that there are workarounds that children could use to bypass controls, this was also the case with printed magazines.

IanCal:
> but it wasn't their responsibility to prevent children from looking at their magazines

They weren't made to guarantee no child could peek at them, no, but they did have age restrictions that were enforced (a child who picked one up couldn't buy it), and they were often on the top shelf. That's exactly the kind of thing a basic risk assessment would flag: "hey, we keep the hardcore porn in front of the Pokémon magazines...".

> The videos are split up into packets, and transmitted through an ad hoc chain of servers until it arrives, via their ISP, on their computer. The web sites are the equivalent of the publishers, and ISPs are the equivalent of the shops

The pictures emit photons which fly through the air to the child. The air is the shop.

Or for websites your computer is the shop.

The ISP is not the shop. Nor in the OSA is it viewed as such. The company who makes the service has some responsibility.

> So it would make a lot more sense to apply controls at the ISPs.

This fundamentally cannot work for what is in the OSA, and if you cannot see why almost immediately, then you do not know what is in the OSA and cannot effectively argue against it. It is not merely a requirement to add age checks to porn sites.