61 points by vandalism | 24 comments
1. sneak ◴[] No.45154582[source]
The entitlement of application authors to do whatever the fuck they want on your machine is astounding to me.

Root CAs, background processes 24/7, uploading of the full process list, clipboard spying, local network scanning, surveillance (aka telemetry) - when did developers decide that our machines aren’t ours anymore?

replies(5): >>45154600 #>>45154605 #>>45154643 #>>45154652 #>>45154741 #
2. Bluecobra ◴[] No.45154600[source]
This appears to be a server emulator for the defunct MMO Need for Speed World. My guess is that they need to spoof the TLS certs and install local hosts entries to get the original game client to work.
replies(2): >>45154630 #>>45154657 #
3. vandalism ◴[] No.45154630[source]
The certificate is used for nothing more than checking whether the launcher is "signed". The whole scheme is full of security holes; the certificate check mostly seems like it was a programming exercise for the author.

There is no need to install the certificate for any of the emulation to function. It's also worth noting that this is an ongoing issue: this reboot of the game still has a decent daily player count, the CA installation concern has not been addressed, and the launcher still does this.

(It's also not a server emulator, it's just a launcher for the game client, used by players of the game.)

replies(1): >>45154902 #
4. guessmyname ◴[] No.45154643[source]
Add to the list the exfiltration of $ENV (environment variables), which often includes secret keys and app tokens. I have seen many young developers expose their $ENV on GitHub when other developers ask them to share their “go env” output, or that of similar commands, while debugging a problem.
5. askvictor ◴[] No.45154652[source]
The alternative is a walled garden like Apple's or (increasingly) Android's, where apps don't have access to anything (at least not without a prompt asking whether you grant said permission). If you run a system that lets you do what you want to it, you need to accept that others might try to do what they want to it, too.
replies(2): >>45154838 #>>45163779 #
6. ◴[] No.45154657[source]
7. diath ◴[] No.45154741[source]
It would be nice if desktop software had to explicitly request access to different system APIs (network, filesystem, etc.) and to specific filesystem paths, and then gave us prompts listing the permissions the app wants. Something like pledge (https://man.openbsd.org/pledge.2) from OpenBSD/Serenity, but integrated into the desktop system's GUI.
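
For illustration, a process can already opt into that kind of restriction on OpenBSD; here is a minimal sketch in Go (assuming the golang.org/x/sys/unix wrapper), where anything outside the promised categories becomes fatal:

    //go:build openbsd

    package main

    import (
        "fmt"
        "os"

        "golang.org/x/sys/unix"
    )

    func main() {
        // Restrict this process to stdio plus read-only filesystem access.
        // After this call, the kernel kills the process if it steps outside
        // the promised categories (sockets, writes, exec, ...).
        if err := unix.Pledge("stdio rpath", ""); err != nil {
            fmt.Fprintln(os.Stderr, "pledge:", err)
            os.Exit(1)
        }

        data, err := os.ReadFile("/etc/resolv.conf") // allowed by "rpath"
        fmt.Println(len(data), err)

        // A net.Dial("tcp", ...) here would abort the process, because the
        // "inet" promise was never made.
    }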
replies(2): >>45154762 #>>45154817 #
8. VoidWhisperer ◴[] No.45154744[source]
The work being OSS and done free of charge doesn't excuse them from putting their users at unnecessary risk, especially when it is done with only a one-line mention in their GitHub README and no mention on their website, which doesn't point to the README at all.
replies(1): >>45155045 #
9. drodgers ◴[] No.45154762[source]
MacOS has been moving more and more in this direction, and it’s good.
10. to11mtm ◴[] No.45154817[source]
That would indeed be very nice, compared to the current standards out there for desktops...

Ironically, I -think- UWP tried to 'solve' this in some ways, but OTOH it added new problems instead...

I also know Microsoft had a different idea when it came to .NET before Core, where libraries could run in 'Partial trust' but with 'Link Demands'... and I've never seen a shop actually do that right, versus just YOLOing with 'full trust' and/or abusing AllowPartiallyTrustedCallersAttribute...

Which I guess is a roundabout way of saying I feel like Microsoft has tried twice, but it completely lost the plot early on and failed to deliver a usable product. (What even is the state of UWP? And .NET Code Access Security was given up in Core...)

11. 01HNNWZ0MV43FF ◴[] No.45154838[source]
Prompts are completely fine. I am happy with the prompts GrapheneOS offers me.
12. reactordev ◴[] No.45154902{3}[source]
Codesigning is expensive. You have to purchase a ~$500 cert and renew it every year. Or you can issue your own CA capable of code signing and sign your own stuff, but the OS won't consider anything really signed unless the CA is also in its trust store.

This is just a case of them wanting to save money on code-signing certificate renewal fees.
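
For illustration, minting such a self-issued CA takes only a few lines. This is a minimal sketch in Go (names and validity period are made up), and note that it scopes the certificate to the code-signing EKU only, rather than the broad list quoted later in the thread:

    package main

    import (
        "crypto/ecdsa"
        "crypto/elliptic"
        "crypto/rand"
        "crypto/x509"
        "crypto/x509/pkix"
        "encoding/pem"
        "math/big"
        "os"
        "time"
    )

    func main() {
        // The CA key pair. This private key is exactly the thing that must
        // be guarded: anyone holding it can sign in the CA's name.
        key, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
        if err != nil {
            panic(err)
        }

        tmpl := &x509.Certificate{
            SerialNumber:          big.NewInt(1),
            Subject:               pkix.Name{CommonName: "Example Self-Issued Code Signing CA"},
            NotBefore:             time.Now(),
            NotAfter:              time.Now().AddDate(10, 0, 0),
            IsCA:                  true,
            BasicConstraintsValid: true,
            KeyUsage:              x509.KeyUsageCertSign | x509.KeyUsageDigitalSignature,
            // Scope the certificate to code signing and nothing else.
            ExtKeyUsage: []x509.ExtKeyUsage{x509.ExtKeyUsageCodeSigning},
        }

        // Self-signed: the template doubles as its own parent.
        der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &key.PublicKey, key)
        if err != nil {
            panic(err)
        }
        pem.Encode(os.Stdout, &pem.Block{Type: "CERTIFICATE", Bytes: der})
    }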

replies(2): >>45155371 #>>45157300 #
13. ◴[] No.45155026[source]
14. chmod775 ◴[] No.45155045{3}[source]
It should not, but they still don't owe it to you or anyone to change anything.

You're not paying them. There's no transaction. They're not even giving the software specifically to you, rather they're saying "this is free for anyone to pick up" - with no warranty of any kind.

When you pick up some free furniture from the roadside, it's on you to determine whether it meets your safety standards. If the free table you picked up has some defect, you most certainly don't ring someone's doorbell and demand rectification.

replies(2): >>45155077 #>>45155181 #
15. xvector ◴[] No.45155055[source]
No. Ethics in engineering exists. They have a moral responsibility to not install a root cert on unsuspecting users' machines.

I can build a bridge free of charge, optional to use; that doesn't mean it's not my responsibility to ensure its safety.

16. benreesman ◴[] No.45155077{4}[source]
Nah, distributing rootkits under false pretenses is a dick move.

That's not even a little controversial. You put a thing on the web that says "Just a harmless XYZ" and it roots TLS forever?

Malware. Black and white.

17. vandalism ◴[] No.45155181{4}[source]
This assumes that all users are informed enough to make such decisions.

You cannot expect the average player of an online game to have the technical knowledge necessary to discern whether a piece of software is safe to use or not. Even if you could, you'd also be expecting them to take the time to do a proper analysis of such software, which I do not think is a reasonable premise.

What's more, this is open-source software we're talking about, where you can actually perform meaningful security checks relatively easily; imagine if this were not the case.

18. hamandcheese ◴[] No.45155392[source]
If I gave away free brownies that happened to be poisoned, but I really didn't mean to, I should probably still be held liable in some way.

If I was giving away free brownies, and someone kindly informed me that they were poison, and I continued to give them away, I belong in prison.

Edit: it seems like there's been no activity in the repo since before the issue was filed, so it's hard to say if the author can be considered to have been informed.

replies(1): >>45155477 #
19. vandalism ◴[] No.45155477{3}[source]
There has been a new GitHub release in April of this year; however, it seems to have been made by a member of the community (along with the commit it includes) rather than by the original creator.

Edit: There is activity on the author's account which suggests that they are aware of the issue and are making (still at least somewhat questionable) changes for a new (unreleased?) version of the launcher to address the problem.

https://github.com/Zacam/SBRW.Launcher.Net/commit/f09d911fca...

As far as I am aware, the launcher repo I linked in the original post is still the main launcher players use for the game, meaning people are still getting the certificate permanently installed.

20. calcifer ◴[] No.45155911{5}[source]
> criminal negligence

Can we stop with this kind of hyperbole, please? It's an open-source project for a dead game. It does not come pre-installed on any hardware, nor is it required by any employer or government to be installed on your device. It's something you actively have to seek out and install, and not even the person reporting the bug saw anything malicious happening.

Criminal negligence is a legal term with a specific meaning, and it is far removed from... whatever you think is happening here.

replies(1): >>45157288 #
21. jdjdhdbdndbsb ◴[] No.45157288{6}[source]
Can you think a little bigger about the implications here? Please understand that the root key for this cert has an absolute motherfuckton of power... Someone who has this key can sign certs and pretend to be your bank, your crypto provider, anything you visit!

You need to understand that a root CA key is generally stored offline, split into Shamir secret sharing pieces, likely in some vaults... If this dude is just keeping it on his computer with a shitty router in front of it, that is criminally negligent.
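
For a concrete picture of what that splitting looks like, here is a minimal sketch using HashiCorp's shamir package (one real implementation of the scheme; the secret bytes are a stand-in, not an actual key):

    package main

    import (
        "fmt"

        "github.com/hashicorp/vault/shamir"
    )

    func main() {
        // Stand-in for root CA key material -- not a real key.
        secret := []byte("root-ca-private-key-material")

        // Split into 5 shares such that any 3 reconstruct the secret; any
        // 2 or fewer reveal nothing about it. Shares go to separate vaults.
        shares, err := shamir.Split(secret, 5, 3)
        if err != nil {
            panic(err)
        }

        recovered, err := shamir.Combine(shares[:3])
        if err != nil {
            panic(err)
        }
        fmt.Println(string(recovered)) // matches the original secret
    }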

This isn't hyperbole.

Edit: missed a word

replies(1): >>45157723 #
22. dextercd ◴[] No.45157300{4}[source]
A code signing certificate does not cost $500 a year. The OP links to an offering by Certum that is just $25 a year plus the cost of a reusable smart card.

Personally, I recently acquired a certificate from HARICA which costs $55 a year if you only buy one year at a time.

23. reactordev ◴[] No.45157723{7}[source]
Except this is just a root CA for a single validation purpose, not a wildcard CA for the whole internet. I agree that this is complete hyperbole and everyone is making a fuss about nothing.

To remind the viewers: in order for a certificate to be considered “valid”, at least an intermediate CA (certificate authority) certificate needs to be trusted by the OS. At work, we do this. When I release games, I do this. I give you my CA so you can verify and guarantee that my software was written by me or my org and hasn't been altered.
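
As a sketch of that chain check (Go, with hypothetical file names), the leaf certificate only verifies once its issuing CA is present in the trust pool:

    package main

    import (
        "crypto/x509"
        "fmt"
        "os"
    )

    // verifyLeaf builds a chain from the leaf up to whatever roots are in the
    // pool; with an empty pool the same leaf fails with UnknownAuthorityError.
    func verifyLeaf(leafDER, rootPEM []byte) error {
        leaf, err := x509.ParseCertificate(leafDER)
        if err != nil {
            return err
        }
        pool := x509.NewCertPool()
        if !pool.AppendCertsFromPEM(rootPEM) {
            return fmt.Errorf("no usable certificates in root PEM")
        }
        _, err = leaf.Verify(x509.VerifyOptions{Roots: pool})
        return err
    }

    func main() {
        leafDER, _ := os.ReadFile("leaf.der") // hypothetical input files
        rootPEM, _ := os.ReadFile("root.pem")
        fmt.Println(verifyLeaf(leafDER, rootPEM))
    }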

I get the perspective of letting end users know, but I don’t agree with giving them a choice.

We use the same intermediate CA for encrypting communications as well. So, we want to remove that? Make everything plain-text binary? No. Get over yourself.

replies(1): >>45164777 #
24. jdjdhdbdndbsb ◴[] No.45164777{8}[source]
So I take it you didn't read the GitHub link, where the poster says that the CA has far too many permissions, including server and client authentication? No?

So it's not hyperbole.

Evidence verbatim from GH post:

However, even if this is in fact a well-intentioned bad execution of the code signature verification idea and not malicious in any way, it is still a pretty egregious security issue for the users of SBRW. For what it's worth, also consider the case wherein the private keys for the CA are stolen in some way from whomever currently has them.

I also want to note that the certificate has a highly inappropriate and unnecessarily broad list of key usage IDs included, of which I would assume that no more than two or three are necessary for the advertised function of this certificate. The complete list follows:

Server Authentication (1.3.6.1.5.5.7.3.1)
Client Authentication (1.3.6.1.5.5.7.3.2)
Code Signing (1.3.6.1.5.5.7.3.3)
Secure Email (1.3.6.1.5.5.7.3.4)
Time Stamping (1.3.6.1.5.5.7.3.8)
Unknown Key Usage (1.3.6.1.4.1.311.2.1.21)
Unknown Key Usage (1.3.6.1.4.1.311.2.1.22)
Microsoft Trust List Signing (1.3.6.1.4.1.311.10.3.1)
Unknown Key Usage (1.3.6.1.4.1.311.10.3.3)
Encrypting File System (1.3.6.1.4.1.311.10.3.4)
Unknown Key Usage (2.16.840.1.113730.4.1)
File Recovery (1.3.6.1.4.1.311.10.3.4.1)
IP security end system (1.3.6.1.5.5.7.3.5)
IP security tunnel termination (1.3.6.1.5.5.7.3.6)
IP security user (1.3.6.1.5.5.7.3.7)
IP security IKE intermediate (1.3.6.1.5.5.8.2.2)
Smart Card Logon (1.3.6.1.4.1.311.20.2.2)
OCSP Signing (1.3.6.1.5.5.7.3.9)
Unknown Key Usage (1.3.6.1.5.5.7.3.13)
Unknown Key Usage (1.3.6.1.5.5.7.3.14)
KDC Authentication (1.3.6.1.5.2.3.5)
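
(For reference, an EKU audit like the one quoted above is easy to reproduce; this is a minimal sketch in Go, where "ca.der" is a hypothetical path to the exported certificate.)

    package main

    import (
        "crypto/x509"
        "fmt"
        "os"
    )

    func main() {
        der, err := os.ReadFile("ca.der") // hypothetical exported certificate
        if err != nil {
            panic(err)
        }
        cert, err := x509.ParseCertificate(der)
        if err != nil {
            panic(err)
        }
        // EKUs Go recognizes come back as enum values (3 == CodeSigning).
        for _, eku := range cert.ExtKeyUsage {
            fmt.Println("known EKU:", eku)
        }
        // Everything else surfaces as raw dotted OIDs, like the list above.
        for _, oid := range cert.UnknownExtKeyUsage {
            fmt.Println("unrecognized EKU OID:", oid)
        }
    }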