https://docs.github.com/en/authentication?query=public+key+u...
From your quotes around "public", I presume you think there is some sense in which they're not really public? They are, and should ALWAYS be considered PUBLIC. If you ever find yourself crafting a security solution where public keys somehow need to be private or secret, go back to the drawing board or reach out to someone with serious expertise.
There are cases where information on a certificate (which is associated with a public key) may indeed need to be protected. In that case you need to implement an information mask (via hashing) that can protect the private information; we had to do something similar with Certisfy.com certificates. But public keys should be considered public, without exceptions.
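To make that concrete, masking via hashing usually amounts to publishing only a salted hash of the sensitive field and letting the holder reveal the salt and value to whoever genuinely needs to verify it. A rough Python sketch (the function names and the salt-then-hash layout are just illustrative, not necessarily how Certisfy or any particular product does it):

    import hashlib
    import os

    def mask(value: str, salt: bytes | None = None) -> tuple[bytes, str]:
        # Only the salted digest goes on the public record; the holder
        # keeps (salt, value) and discloses them to verifiers as needed.
        salt = salt or os.urandom(16)
        return salt, hashlib.sha256(salt + value.encode()).hexdigest()

    def verify(value: str, salt: bytes, digest: str) -> bool:
        # A verifier who is handed the real value and the salt can check
        # it against the published digest without anyone else learning it.
        return hashlib.sha256(salt + value.encode()).hexdigest() == digest

The point is that the private field never has to sit in the clear next to the public key.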
I know you’re taking the “strict teacher” approach with your comment, but you’re totally wrong. And the reason you’re wrong is, security doesn’t equal privacy. But for the “average person,” security does equal privacy, or should, so they find systems that could potentially expose their identity to be “insecure.”
In this particular case, there have been past examples of using keys to fingerprint users without their consent. Yes, it’s been super edge-case and proof-of-concept, but for a lot of people — and perhaps more importantly, in a lot of jurisdictions — leaving a personal identifier sitting around like this (without ever informing the user!) is the very opposite of a best practice.
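For anyone who hasn't seen why that works: GitHub serves every account's registered keys at https://github.com/<username>.keys, so matching a key observed anywhere else back to an account takes a few lines. A rough sketch (the username and observed fingerprint below are just placeholders):

    import base64
    import hashlib
    import urllib.request

    def github_keys(username: str) -> list[str]:
        # GitHub publishes each account's registered SSH public keys here
        url = f"https://github.com/{username}.keys"
        with urllib.request.urlopen(url) as resp:
            return [line for line in resp.read().decode().splitlines() if line.strip()]

    def fingerprint(pubkey_line: str) -> str:
        # Same SHA256 fingerprint that `ssh-keygen -lf` prints:
        # base64(sha256(decoded key blob)), with the padding stripped
        blob = base64.b64decode(pubkey_line.split()[1])
        return "SHA256:" + base64.b64encode(hashlib.sha256(blob).digest()).decode().rstrip("=")

    observed = "SHA256:..."              # fingerprint seen on some other service (placeholder)
    for key in github_keys("octocat"):   # example username
        if fingerprint(key) == observed:
            print("same key, very likely the same person")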
The end result is, you should only have a key on GitHub that isn't used anywhere else. That's what I do, and I'm sure lots of us on this comment thread do, but there are definitely lots of My First Coding Bootcamp people who were walked through their GitHub account setup and might not have been aware that these are keys that shouldn't be reused elsewhere.
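If it helps anyone, "a key that isn't used anywhere else" just means generating a fresh key and telling ssh to offer only that one to github.com. A sketch of the setup, driven from Python purely for illustration (the filename and comment string are arbitrary):

    import pathlib
    import subprocess

    key = pathlib.Path.home() / ".ssh" / "id_ed25519_github"  # arbitrary filename

    # Generate a key that will be used for nothing but GitHub
    subprocess.run(["ssh-keygen", "-t", "ed25519", "-f", str(key), "-C", "github-only"],
                   check=True)

    # Offer only this key when connecting to github.com
    with (pathlib.Path.home() / ".ssh" / "config").open("a") as cfg:
        cfg.write(f"\nHost github.com\n    IdentityFile {key}\n    IdentitiesOnly yes\n")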
I would have a very different view on this if GitHub had been explicit about the use of registered keys for other services. That’s a GREAT concept, but I’m not going to trust a company with that business when they’ve just backdoored themselves into it without asking for permission. And the problem for them is, in this particular situation you need the weird paranoid privacy crowd on your side for it to work.
Your SSH public key is really the least of the identifiable information you'd need to worry about, because it's the easiest to make unique: just create a key that's only used for GitHub.
Nonsense. Would you say the same thing about a password? Would you make the same comment about a conversation over a messaging service? This is a configuration detail of an account setting -- not a blog post.
Privacy is not binary; there are many shades of grey. It is surprising that this is made public and while it is not necessarily wrong to provide this service (I'm fine with it; I see the utility) it is also reasonable to ask for a way to opt out.
Passwords are secrets. Public keys are not. So the comparison doesn’t work.
> Would you make the same comment about a conversation over a messaging service?
If it was a public messaging service like HN, or public comments on Twitter or Facebook, then yes.
> This is a configuration detail of an account setting -- not a blog post.
We could be here all day and night saying what this is or isn’t but it doesn’t address the point I was making. The moment you create a GitHub account you start leaking far more sensitive data than your public keys. Data that is far harder to create anonymously (unlike your SSH keys). Thus if privacy is a concern then you shouldn’t be using GitHub in the first place. Even git version control itself leaks information about you.
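On the "git itself leaks information" point: every commit records the configured user.name and user.email, so anyone who can clone a repo can enumerate who worked on it. A quick sketch (assumes git is on PATH and the path points at a local clone):

    import collections
    import subprocess

    def commit_authors(repo_path: str = ".") -> collections.Counter:
        # Each commit carries the author's configured name and email,
        # so this works on any repo you can clone
        log = subprocess.run(
            ["git", "-C", repo_path, "log", "--format=%an <%ae>"],
            capture_output=True, text=True, check=True,
        ).stdout
        return collections.Counter(log.splitlines())

    for author, count in commit_authors().most_common():
        print(count, author)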
> Privacy is not binary; there are many shades of grey.
Ironic that you state that when you're the one applying privacy in a binary way. I'm saying the SSH public keys are a lower risk than other details you share on GitHub. Not that it's a zero or 100% bad thing, which are the only pigeonholes you're allowing for in this discussion.
> It is surprising that this is made public and while it is not necessarily wrong to provide this service (I'm fine with it; I see the utility) it is also reasonable to ask for a way to opt out.
That’s literally the point I’ve been making.
The parent's quote, I think, is illustrative:
> But for the “average person,” security does equal privacy, or should, so they find systems that could potentially expose their identity to be “insecure.”
And I’m not trying to pile on here, because I sympathize with that sentiment of “should”. But I think the two issues that make that never a real possibility are 1) privacy is actually a harder problem to solve than security and 2) companies aren’t incentivized to provide privacy.
Anything you provide online should by default be assumed to end up public, and as much as we might not want that, we all really need to assume that.