242 points raybb | 23 comments
ndiscussion ◴[] No.26715675[source]
It's been like this for a while, and the project owner's attitude is pretty negative overall. I do use Signal daily, but I believe it's likely compromised, à la Lavabit.
replies(4): >>26715714 #>>26715934 #>>26716233 #>>26718058 #
1. morelisp ◴[] No.26715714[source]
What's in the Signal server to be compromised?
replies(2): >>26715770 #>>26716093 #
2. corty ◴[] No.26715770[source]
Lists of phone numbers? Pairs of communication partners? Timing and size of messages? Metadata about transferred media? There is still a lot, sufficient for targeting a drone strike, as the usual wisdom goes.
replies(3): >>26715815 #>>26716325 #>>26716566 #
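To see why a metadata list like that is dangerous on its own, here is a minimal Kotlin sketch (entirely hypothetical, not anything from Signal's codebase) of the kind of traffic analysis it enables: matching send and receive events by time and size, with no access to message content.

    import kotlin.math.abs

    // Hypothetical traffic analysis over server-visible metadata only:
    // flag sender/receiver pairs whose events line up in time and size.
    // Real attacks are statistical; this just shows content isn't needed.
    data class Event(val user: String, val epochMs: Long, val bytes: Int)

    fun likelyPairs(sends: List<Event>, receives: List<Event>): Map<Pair<String, String>, Int> =
        sends.flatMap { s ->
            receives.filter { r ->
                abs(r.epochMs - s.epochMs) < 2_000 && abs(r.bytes - s.bytes) < 64
            }.map { r -> s.user to r.user }
        }.groupingBy { it }.eachCount()

    fun main() {
        val sends = listOf(Event("alice", 1_000, 512))
        val receives = listOf(Event("bob", 1_400, 512))
        println(likelyPairs(sends, receives)) // {(alice, bob)=1}
    }
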
3. tptacek ◴[] No.26715815[source]
Some of that information you don't even need a backdoor to collect; the rest is stored in plaintext by Signal's competitors.
replies(1): >>26716518 #
4. ndiscussion ◴[] No.26716093[source]
If you use the Signal app from the app stores, and communicate with the server, you are effectively using 100% closed-source software.

They could easily add a backdoor in the client despite the fact that it's "open source", because no one builds it from source.

replies(3): >>26716277 #>>26716307 #>>26716329 #
5. morelisp ◴[] No.26716277[source]
Are Signal's Android builds no longer reproducible?
replies(1): >>26716710 #
6. mdaniel ◴[] No.26716307[source]
"No one" is a bit harsh; I even helped a poster in r/Signal set up a CircleCI build for the repo in order to show that it's not oppressively hard, just tedious (as with all things CI/CD)

The Signal android build now uses some PKCS11 machinery that requires patching out to build without using a smartcard, but otherwise it works as expected.

I dove into this darkness while trying to fix the borked MMS handling on Visible (a Verizon MVNO), and is the reason I'm generally with you: if someone can't build the project, then it's not effectively open source, IMHO, because I lose my "right to repair"

7. ViViDboarder ◴[] No.26716325[source]
Signal doesn’t store lists of phone numbers, and governments already have lists of phone numbers anyway. Communication partners are hidden from the server using Sealed Sender for many conversations.

The rest of this could possibly be obtained, but it wouldn’t require a patch to the server, as message sizes and timestamps likely appear on disk somewhere. Though the data is encrypted, you could tell that “x received a message from some party (sealed sender prevents knowing who) at y time, of roughly z size”.

replies(1): >>26716606 #
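To make the parent's "x, y, z" point concrete, a minimal sketch (Kotlin, field names hypothetical) of what such an at-rest observation could look like: the sender is absent by design, but recipient, timing, and approximate size remain.

    import java.time.Instant

    // Hypothetical server-side view of one sealed-sender envelope.
    data class EnvelopeObservation(
        val recipient: String,     // "x" in the parent comment
        val receivedAt: Instant,   // "y time"
        val ciphertextBytes: Int   // "roughly z size"
    )

    fun main() {
        val o = EnvelopeObservation("x", Instant.now(), 4096)
        println("${o.recipient} received a message from an unknown sender " +
                "at ${o.receivedAt}, roughly ${o.ciphertextBytes} bytes")
    }
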
8. Caligatio ◴[] No.26716329[source]
By this standard, there is practically nothing that qualifies as open source. Compile something yourself? Well, can you really trust your compiler unless you compiled it? How do you compile your compiler without a compiler? Obviously this is possible, but no one does it; therefore no software is truly open source.
replies(1): >>26716692 #
9. corty ◴[] No.26716518{3}[source]
Signal claims to specially protect some of that data; such claims need verification. Storing or not storing that data needs verification too: without confidence that they do what they say, they are no better than their competition. Trust is earned, e.g., by openness about the source code. And that a server backdoor isn't strictly necessary is beside the point, because the server is the easiest and most obvious way to get at all that data.

Also, there is competition like Briar, which has fewer of those pesky metadata problems (but some other problems instead).

replies(1): >>26718158 #
10. pvarangot ◴[] No.26716566[source]
Being able to hide from a government that wants to drone you, while still being on the cellphone network, requires much, much more OPSEC than just using Signal. For an average user, Signal is about protecting the content of your messages, not your network, and it's good at that.
replies(1): >>26716657 #
11. corty ◴[] No.26716606{3}[source]
Signal still uses and verifies phone numbers, so at some point those numbers pass through their infrastructure. They could still save them; knowing the source code they run gives at least a hint that they don't.

Sealed sender is also based on the pinky-swear that the infrastructure distributing the sender auth certificates doesn't correlate identities and connections with the messaging infrastructure, and that the server receiving the enveloped messages doesn't log. So it all rests on trusting that the right source code is running somewhere.

When access to that source code is suddenly restricted, of course people are worried.

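The trust split described above can be sketched as follows (Kotlin, all names hypothetical; the real protocol lives in libsignal and is more involved): one service authenticates senders and issues certificates, another accepts sealed envelopes, and the privacy claim holds only if the two never correlate their logs.

    // Service A authenticates you, so it knows who you are.
    data class SenderCertificate(val senderId: String, val expiresAt: Long)

    class CertificateService {
        fun issue(senderId: String): SenderCertificate =
            SenderCertificate(senderId, System.currentTimeMillis() + 86_400_000)
    }

    // Service B accepts sealed envelopes without authenticating the sender.
    class MessageService {
        fun accept(recipient: String, sealedEnvelope: ByteArray) {
            // The sender identity travels inside the encrypted envelope;
            // this server can log only (recipient, time, size), unless
            // its logs are joined with CertificateService's.
            println("envelope for $recipient, ${sealedEnvelope.size} bytes")
        }
    }

    fun main() {
        val cert = CertificateService().issue("alice") // A saw alice connect
        MessageService().accept("bob", ByteArray(4096)) // B sees only bob
        check(cert.expiresAt > System.currentTimeMillis())
    }
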
12. corty ◴[] No.26716657{3}[source]
Yes, that "drone strike" thing is actually a stupid saying. I'm sorry to have used it because it is somewhat distracting from the actual points.
13. ndiscussion ◴[] No.26716692{3}[source]
I disagree that these are on the same level: compiling something yourself, or having something compiled by, e.g., the Arch Linux maintainers, requires a number of people to comply.

The app store is a single point of failure with huge reach.

14. ndiscussion ◴[] No.26716710{3}[source]
It looks like they are, but there might be a minor issue in verifying the content: https://github.com/signalapp/Signal-Android/issues/10476

But despite the community's best efforts to verify builds, Google and Apple can be forced to push a malicious app to a particular user, meaning that user isn't running the same app at all.

replies(2): >>26717259 #>>26717290 #
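For anyone who wants to try, the comparison step in a reproducible-build verification is mechanical. Here is a minimal Kotlin sketch in the spirit of the apkdiff script from Signal's reproducible-build instructions (an illustration, not the project's actual tool; a real check compares entry bytes, not just CRCs).

    import java.util.zip.ZipFile

    // Compare entry names and CRCs of two APKs, skipping META-INF/,
    // where signature files legitimately differ between the Play Store
    // APK and a locally built one.
    fun apksMatch(pathA: String, pathB: String): Boolean {
        fun digest(path: String): Map<String, Long> =
            ZipFile(path).use { zip ->
                zip.entries().toList()
                    .filterNot { it.name.startsWith("META-INF/") }
                    .associate { it.name to it.crc }
            }
        return digest(pathA) == digest(pathB)
    }

    fun main(args: Array<String>) {
        println(if (apksMatch(args[0], args[1])) "APKs match" else "MISMATCH")
    }
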
15. morelisp ◴[] No.26717259{4}[source]
If your threat model includes the ability to force Apple to do X, then Signal is irrelevant.
replies(1): >>26718003 #
16. greysonp ◴[] No.26717290{4}[source]
> But despite the community's best efforts to verify builds, Google and Apple can be forced to push a malicious app to a particular user, meaning that user isn't running the same app at all.

Hi there! Signal-Android developer here. App signing verification is done at the OS level, and Google does not have our signing key, so they wouldn't be able to give an existing user a different APK and have it successfully install.

replies(1): >>26717997 #
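The OS-level check greysonp describes is also inspectable from the device side. A minimal Android sketch (assuming API 28+; comparing against a known-good fingerprint obtained out of band is this example's idea, not Signal-provided tooling):

    import android.content.pm.PackageManager
    import java.security.MessageDigest

    // Returns the SHA-256 fingerprint of an installed package's signing
    // certificate (API 28+), for comparison against a known-good value.
    fun signingCertSha256(pm: PackageManager, pkg: String): String {
        val info = pm.getPackageInfo(pkg, PackageManager.GET_SIGNING_CERTIFICATES)
        val cert = info.signingInfo.apkContentsSigners.first().toByteArray()
        return MessageDigest.getInstance("SHA-256").digest(cert)
            .joinToString("") { "%02x".format(it) }
    }

    // Usage, e.g. from an Activity ("org.thoughtcrime.securesms" is
    // Signal's package name):
    //   val fp = signingCertSha256(packageManager, "org.thoughtcrime.securesms")

Android refuses to install an update whose signing certificate differs from the already-installed one, which is the mechanism greysonp is pointing at.
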
17. ndiscussion ◴[] No.26717997{5}[source]
Is that really true? Couldn't Google forcibly turn off the code-signing requirement on an individual's phone?

They've been known to reset passwords remotely in the past: https://www.theverge.com/2016/3/30/11330892/fbi-google-andro...

replies(1): >>26718205 #
18. ndiscussion ◴[] No.26718003{5}[source]
That's probably a good point. I'm using GrapheneOS, which is not identifiable to Google/Apple and can't be singled out for updates.
19. tptacek ◴[] No.26718158{4}[source]
I don't recall Signal ever having made implausible claims about traffic analytic attacks. I also don't buy into the idea that platforms are as trustworthy as their source release policies are orthodox.
replies(1): >>26718465 #
20. codethief ◴[] No.26718205{6}[source]
No, they could not. And if you don't want to trust $random_manufacturer's Android ROM, you could switch to GrapheneOS[0], whose developer Daniel Micay attaches a lot of importance to reliable app signatures (which is why GrapheneOS doesn't come with MicroG, as the latter would need signature spoofing).

[0]: https://grapheneos.org/

21. corty ◴[] No.26718465{5}[source]
It isn't advanced, difficult traffic analysis if they are all your servers, or if all your logs land in one Logstash.
replies(2): >>26718484 #>>26721124 #
22. tptacek ◴[] No.26718484{6}[source]
What difference does this make? In your threat model the only serious countermeasure between you and state-level adversaries is a Logstash implementation?
23. morelisp ◴[] No.26721124{6}[source]
The goalposts now seem to be at "someone might subpoena Signal's logs for some metadata", having moved pretty far from the original claim of "Signal's server code hasn't been updated because it has been secretly backdoored or intentionally weakened." It's difficult to see this as good faith security analysis rather than fearmongering.