https://docs.github.com/en/authentication?query=public+key+u...
From your quotes around "public", I presume you think there is some sense in which they're not really public? They are, and they should ALWAYS be considered PUBLIC. If you ever find yourself crafting a security solution where public keys somehow need to be private or secret, go back to the drawing board or reach out to someone with serious expertise.
There are cases where information on a certificate (which is associated with a public key) may indeed need to be protected; in that case you need to implement an information mask (via hashing) that protects the private fields. We had to do something similar with Certisfy.com certificates. But public keys themselves should be considered public, without exception.
I know you’re taking the “strict teacher” approach with your comment, but you’re totally wrong. And the reason you’re wrong is, security doesn’t equal privacy. But for the “average person,” security does equal privacy, or should, so they find systems that could potentially expose their identity to be “insecure.”
In this particular case, there have been past examples of using keys to fingerprint users without their consent. Yes, it’s been super edge-case and proof-of-concept, but for a lot of people — and perhaps more importantly, in a lot of jurisdictions — leaving a personal identifier sitting around like this (without ever informing the user!) is the very opposite of a best practice.
The end result is, you should only have a key on GitHub that isn't used anywhere else. That's what I do, and I'm sure lots of us on this comment thread do, but there are definitely lots of My First Coding Bootcamp people who were guided through their GitHub account setup and might not have been aware that these are keys that shouldn't be reused elsewhere.
I would have a very different view on this if GitHub had been explicit about the use of registered keys for other services. That’s a GREAT concept, but I’m not going to trust a company with that business when they’ve just backdoored themselves into it without asking for permission. And the problem for them is, in this particular situation you need the weird paranoid privacy crowd on your side for it to work.
Your SSH public key is really the least of the identifiable information you'd need to worry about, because it's the easiest to address: just create a unique key for GitHub.
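For anyone who hasn't done it, it's a one-liner with `ssh-keygen -t ed25519 -f ~/.ssh/github_only`. Here's a rough Python equivalent, just a sketch assuming the `cryptography` package (the file name and comment are only examples), to show what a dedicated, never-reused key involves:

```python
from pathlib import Path

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

key = Ed25519PrivateKey.generate()

priv = key.private_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PrivateFormat.OpenSSH,
    encryption_algorithm=serialization.NoEncryption(),  # use a passphrase in real life
)
pub = key.public_key().public_bytes(
    encoding=serialization.Encoding.OpenSSH,
    format=serialization.PublicFormat.OpenSSH,
)

key_path = Path.home() / ".ssh" / "github_only"       # example name only
key_path.parent.mkdir(mode=0o700, exist_ok=True)
key_path.write_bytes(priv)
key_path.chmod(0o600)                                  # ssh refuses world-readable keys
key_path.with_suffix(".pub").write_bytes(pub + b" github-only\n")

# Then pin it to GitHub in ~/.ssh/config so it's never offered anywhere else:
#   Host github.com
#       IdentityFile ~/.ssh/github_only
#       IdentitiesOnly yes
```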
I disagree. The article points to simplifications that SSH did compared to GPG, like no web of trust.
But there's now something very close to one, albeit a bit hidden: SSHFP + DNSSEC.
I think a lot of people here may not be familiar with SSHFP records, so check https://fanf.livejournal.com/130577.html for a more detailed explanation.
It's more or less a workaround for the problem of initially trusting a key: SSHFP records let you verify that the server's keys are vouched for by the domain, with DNSSEC validating the whole chain (there's a rough sketch of the client-side check at the end of this comment).
When you can match server keys to domains and user keys to github users, you are just one link away from having a global picture of the web of trust.
EDIT: and I wonder how feasible it would be to do just that by sniffing the initial handshake.
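To make the mechanics concrete: the server operator publishes the fingerprints generated by `ssh-keygen -r <hostname>`, and clients can turn on the check with `VerifyHostKeyDNS yes` in ssh_config. Below is a rough Python sketch of the same lookup, assuming the `dnspython` package and a resolver that actually validates DNSSEC (without validation the record proves nothing); it also ignores the SSHFP algorithm field for brevity.

```python
import base64
import hashlib

import dns.resolver  # dnspython


def sshfp_matches(host: str, openssh_pubkey_line: str) -> bool:
    """Check a host key line ('ssh-ed25519 AAAA...', e.g. from `ssh-keyscan host`)
    against the host's published SSHFP records."""
    key_blob = base64.b64decode(openssh_pubkey_line.split()[1])
    digests = {
        1: hashlib.sha1(key_blob).digest(),    # SSHFP fingerprint type 1 = SHA-1
        2: hashlib.sha256(key_blob).digest(),  # SSHFP fingerprint type 2 = SHA-256
    }
    answers = dns.resolver.resolve(host, "SSHFP")
    return any(rr.fingerprint == digests.get(rr.fp_type) for rr in answers)

# hypothetical usage: sshfp_matches("myserver.example.org", pubkey_line)
```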
> because it’s the easiest to address: just create a unique key for GitHub
And how many people do you think have done that, instead of uploading the keys of the servers/laptops/etc. they use?
I hate to say this, but Bitcoin did it right: by deriving discardable private/public key pairs from a seed key, it prevents information leaks through correlation on the public key.
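The same idea can be borrowed for SSH keys; there's nothing Bitcoin-specific about it. A sketch of the principle (not actual BIP32, and assuming the `cryptography` package): derive an independent, discardable Ed25519 key per service from one backed-up seed, so the public keys you upload can't be correlated with each other.

```python
import os

from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF


def derive_service_key(seed: bytes, service: str) -> Ed25519PrivateKey:
    """Derive an independent, discardable keypair per service from one seed."""
    material = HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=service.encode(),   # different service -> unrelated key
    ).derive(seed)
    return Ed25519PrivateKey.from_private_bytes(material)


seed = os.urandom(32)            # back this up once; every key is re-derivable from it
github_key = derive_service_key(seed, "github.com")
gitlab_key = derive_service_key(seed, "gitlab.com")

# The two public keys are not correlatable without knowing the seed.
print(github_key.public_key().public_bytes(
    serialization.Encoding.OpenSSH, serialization.PublicFormat.OpenSSH).decode())
```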
Equally, they might use the same username as on other services, making themselves just as traceable.
Ultimately, if you care about privacy, then it’s up to you, the individual, not to share that information, or to provide anonymized information instead.
Sharing identifiable information and then blaming the company you shared it with is like blaming the horse for bolting after you’ve intentionally left the barn door open. Sure, it should be that way, but ultimately it’s your own responsibility to keep your stuff safe.
I’d agree with your point more if you were advocating that the documentation should be more explicit, i.e. helping educate users to make smarter choices.