Stop Breaking TLS

(www.markround.com)
170 points todsacerdoti | 22 comments
samuel ◴[] No.46215799[source]
I agree with the sentiment, but I think it's a pretty naive view of the issue. Companies will want all info they can in case some of their workers does something illegal-inappropiate to deflect the blame. That's a much more palpable risk than "local CA certificates being compromised or something like that.

And some of the arguments are just very easily dismissed. You don't want your employer to see your medical records? Why were you browsing them during work hours, using your employer's device, in the first place?

replies(3): >>46215855 #>>46216169 #>>46216703 #
1. immibis ◴[] No.46215855[source]
In Europe they prefer not to go to jail for privacy violations. It turns out most of these "communist" regulations are actually pretty great.
replies(1): >>46215994 #
2. johncolanduoni ◴[] No.46215994[source]
Does GDPR (or similar) establish privacy rights to an employee’s use of a company-owned machine against snooping by their employer? Honest question, I hadn’t heard of that angle. Can employers not install EDR on company-owned machines for EU employees?
replies(5): >>46216082 #>>46216180 #>>46216380 #>>46216557 #>>46218221 #
3. apexalpha ◴[] No.46216082[source]
Yes, at least in the Netherlands it is generally accepted that employees can use a company device for personal purposes, too.

Using a device owned by your company to access your personal GMail account does NOT void your legal right to privacy.

replies(1): >>46216551 #
4. zeeZ ◴[] No.46216180[source]
They can, but the list of "if..." and "it depends..." is much longer and more complicated, especially when it comes to how the obtained information may be used.
5. Msurrow ◴[] No.46216380[source]
Yes. GDPR covers all handling of PII that a company does. And it's sort of default-deny, meaning that a company is not allowed to handle (process and/or store) your data UNLESS it has a reason that makes it legal. This is where it becomes more blurry: figuring out if the company has a valid reason. Some are simple, e.g. if required by law => valid reason.

GDPR does not care how the data got “in the hands of” the company; the same rules apply. Another important thing is the principles of GDPR. They sort of underlie everything. One principle to consider here is that of data minimization. This basically means that IF you have a valid reason to handle an individual's PII, you must limit the data points you handle to exactly what you need and not more.

So - company proxy breaking TLS and logging everything? Well, the company obviously has a valid reason to handle some employee data. But if I use my work laptop to access private health records, then that is very much outside the scope of what my company is allowed to handle. And logging (storing) my health data without a valid reason is not GDPR compliant.

Could the company fire me for doing private stuff on a work laptop? Yes probably. Does it matter in terms of GDPR? Nope.

Edit: Also, “automatic” or “implicit” consent is not valid. So the company cannot say something like “if you access private info on your work PC then you automatically consent to $company handling your data”. All consent must be specific, explicit and retractable.
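To make the data-minimization principle concrete, here is a rough sketch of what a proxy's log pipeline could do before anything is stored. This is purely illustrative; all field names are made up, and whether a given field set is actually compliant depends on the stated purpose:

```python
# Hypothetical log-minimization step for a TLS-inspecting proxy: keep only
# the fields the stated security purpose (malware blocking) needs, and drop
# everything that could expose private browsing, before anything is stored.
# All field names here are made up for illustration.

KEEP = ("domain", "verdict", "timestamp")

def minimize(event: dict) -> dict:
    """Reduce a full proxy event to the minimum needed for security stats."""
    return {k: event[k] for k in KEEP if k in event}

full_event = {
    "domain": "patient-portal.example",
    "url": "/records/12345/lab-results",  # reveals *which* records: dropped
    "user": "jdoe",                        # identity not needed: dropped
    "verdict": "clean",
    "timestamp": "2024-06-01T09:30:00Z",
}
print(minimize(full_event))
# {'domain': 'patient-portal.example', 'verdict': 'clean', 'timestamp': '2024-06-01T09:30:00Z'}
```

The point is that minimization happens structurally, at write time, rather than relying on someone promising not to look at the full logs later.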

replies(1): >>46216537 #
6. johncolanduoni ◴[] No.46216537{3}[source]
What if your employer says “don’t access your health records on our machine”? If you put private health information in your Twitter bio, Twitter is not obligated to suddenly treat it as if they were collecting private health information. Otherwise every single user-provided field would be maximally radioactive under GDPR.
replies(3): >>46216618 #>>46218243 #>>46222987 #
7. johncolanduoni ◴[] No.46216551{3}[source]
So does nobody in Europe use an EDR or intercepting proxy since GDPR went into force?
replies(2): >>46217001 #>>46229843 #
8. samuel ◴[] No.46216557[source]
(IANAL) I don't think there is a simple answer to that, but I'd guess that given that the employer:

- has established a detailed policy about personal use of corporate devices

- makes a fair attempt to block work unrelated services (hotmail, gmail, netflix)

- ensures the security of the monitored data and deletes it after a reasonable period (such as 6–12 months)

- and uses it only to apply cybersecurity-related measures like virus detection, UNLESS there is a legitimate reason to target a particular employee (legal inquiry, misconduct, etc.)

I would say that it's very much doable.

Edit: More info from the Dutch regulator https://english.ncsc.nl/publications/factsheets/2019/juni/01...

9. Msurrow ◴[] No.46216618{4}[source]
If the employer says so and I do so anyway, then that's an employment issue. I still have to follow company rules. But the point is that the company needs to delete the collected data as soon as possible. They are still not allowed to store it.
replies(1): >>46222518 #
10. samuel ◴[] No.46217001{4}[source]
I have found a definite answer from the Dutch Protection Agency (although it could be out of date).

https://english.ncsc.nl/binaries/ncsc-en/documenten/factshee...

replies(1): >>46222315 #
11. immibis ◴[] No.46218221[source]
It has to have a good purpose. Obviously there are a lot of words written about what constitutes a good purpose. Antivirus is probably one. Wanting to intimidate your employees is not. The same thing applies to security cameras.

Privacy laws are about the end-to-end process, not technical implementation. It's not "You can't MITM TLS" - it's more like "You can't spy on your employees". Blocking viruses is not spying on your employees. If you take the logs from the virus blocker and use them to spy on your employees, then you are spying on your employees. (Virus blockers aiming to be sold in the EU would do well not to keep unnecessary logs that could be used to spy on employees.)
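
As an aside, the interception itself is not even hidden from the client: the machine sees the certificate chain the proxy serves. A minimal sketch (hostname is a placeholder, and this assumes the corporate CA is already in the machine's trust store, as it would be on a managed laptop):

```python
# Sketch: a TLS-intercepting proxy is visible client-side, because the
# certificate the client receives is signed by the corporate CA rather
# than a public CA. Hostnames and issuer names below are placeholders.
import socket
import ssl

def issuer_org(cert: dict) -> str:
    """Pull the issuer organization out of ssl's getpeercert() structure."""
    # getpeercert() returns the issuer as a tuple of RDN tuples,
    # e.g. ((('organizationName', "Let's Encrypt"),), ...)
    issuer = dict(rdn[0] for rdn in cert.get("issuer", ()))
    return issuer.get("organizationName") or issuer.get("commonName") or "unknown"

def fetch_cert(host: str, port: int = 443) -> dict:
    """Fetch the leaf certificate the client actually sees for `host`."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert()

# On an intercepted connection, issuer_org(fetch_cert("example.com"))
# would name the corporate CA instead of a public one.
```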

12. immibis ◴[] No.46218243{4}[source]
Many programmers tend to treat the legal system as if it were a computer program: if (form.is_public && form.contains(private_health_records)) move(form.owner, get_nearest_jail()); - but this is not how the legal system actually works. Not even in excessively-bureaucratic-and-wording-of-rules-based Germany.
replies(1): >>46222388 #
13. johncolanduoni ◴[] No.46222315{5}[source]
What’s the definitive answer? From what I can tell that document is mostly about security risks and only mentions privacy compliance in a single paragraph (with no specific guidance). It definitely doesn’t say you can or can’t use one.
replies(2): >>46222484 #>>46228209 #
14. johncolanduoni ◴[] No.46222388{5}[source]
Yeah, that’s my point. I don’t understand why the fact that you could access a bunch of personal data via your work laptop in express violation of the laptop owner’s wishes would mean that your company has the same responsibilities to protect it that your doctor’s office does. That’s definitely not how it works in general.
replies(1): >>46222952 #
15. immibis ◴[] No.46222484{6}[source]
That's probably because there is no answer. Many laws apply to the total thing you are creating end-to-end.

Even the most basic law like "do not murder" is not "do not pull gun triggers" and a gun's technical reference manual would only be able to give you a vague statement like "Be aware of local laws before activating the device."

Legal privacy is not about whether you intercept TLS or not; it's about whether someone is spying on you, which is an end-to-end operation. Should someone be found to be spying on you, then you can go to court and they will decide who has to pay the price for that. And that decision can be based on things like whether some intermediary network has made poor security decisions.

This is why corporations do bullshit security by the way. When we on HN say "it's for liability reasons" this is what it means - it means when a court is looking at who caused a data breach, your company will have plausible deniability. "Your Honour, we use the latest security system from CrowdStrike" sounds better than "Your Honour, we run an unpatched Unix system from 1995 and don't connect it to the Internet" even though us engineers know the latter is probably more secure against today's most common attacks.

replies(1): >>46222638 #
16. johncolanduoni ◴[] No.46222518{5}[source]
I’ll give an example I'm more familiar with. In the US, HIPAA has a bunch of rules about how private health information can be handled by everyone in the supply chain, from doctor’s offices to medical record SaaS systems. But if I’m running a SaaS note-taking app and some doctor’s office puts PHI in there without an express contract with me saying they could, I’m not suddenly subject to enforcement. It all falls on them.

I’m trying to understand the GDPR equivalent of this, which seems to exist since every text field in a database does not appear to require the full PII treatment in practice (and that would be kind of insane).

17. johncolanduoni ◴[] No.46222638{7}[source]
Okay, thanks for explaining the general concept of law to me, but this provides literally no information to figure out the conditions under which an employer using a TLS-intercepting proxy to snoop on the internet traffic of a work laptop violates GDPR. I never asked for a definitive answer, just, you know, an answer that is remotely relevant to the question.

I don’t really need to know, but a bunch of people seemed really confident they knew the answer and then provided no actual information except vague gesticulation about PII.

replies(1): >>46236465 #
18. immibis ◴[] No.46222952{6}[source]
The legal default assumption seems to be that you can use your work laptop for personal things that don't interfere with your work. Because that's a normal thing people do.
19. immibis ◴[] No.46222987{4}[source]
I suspect they should say "this machine is not confidential" and have good reasons for that - you can't impose extra restrictions on your employees just because you want to.

The law (as executed) will weigh the normal interest in employee privacy, versus your legitimate interest in doing whatever you want to do on their computers. Antivirus is probably okay, even if it involves TLS interception. Having a human watch all the traffic is probably not, even if you didn't have to intercept TLS. Unless you work for the BND (German Mossad) maybe? They'd have a good reason to watch traffic like a hawk. It's all about balancing and the law is never as clear-cut as programmers want, so we might as well get used to it being this way.

20. samuel ◴[] No.46228209{6}[source]
Your question was: "So does nobody in Europe use an EDR or intercepting proxy since GDPR went into force?"

Given that a regulator publishes a document with guidelines about DPI, I think it rules out the impossibility of implementing it. If that were the case, it would simply say "it's not legal". It's true that it doesn't explicitly list all the conditions you should meet, but that wasn't your question.

21. apexalpha ◴[] No.46229843{4}[source]
You can do it but you'd have to have a good case for it to trump the right to privacy.

It's not as simple as in the US, where companies consider everything on a company device their property even if employees use it privately.

22. immibis ◴[] No.46236465{8}[source]
Are they using it to snoop on the traffic, or are they merely using it to block viruses? Lack of encryption is not a guarantee of snooping. I know in the USA it can be assumed that you can do whatever you want with unencrypted traffic, which guarantees that if your traffic is unencrypted, someone is snooping on it. In Europe, this might not fly outside of three-letter agencies (who you should still be scared of, but they are not your employer).