They could easily add a backdoor in the client despite the fact that it's "open source", because no one builds it from source.
The Signal Android build now uses some PKCS#11 machinery that has to be patched out to build without a smartcard, but otherwise it works as expected.
I dove into this darkness while trying to fix the borked MMS handling on Visible (a Verizon MVNO), and it's the reason I'm generally with you: if someone can't build the project, then it's not effectively open source, IMHO, because I lose my "right to repair".
The rest of this could possibly be obtained, and it wouldn't require a patch to the server, as message sizes and timestamps likely appear on disk somewhere. Though the data is encrypted, you could still tell "x received a message from some party (sealed sender prevents knowing who) at y time of roughly z size".
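To make that concrete, here's a toy sketch (my own invented record layout, not Signal's actual server schema) of why encryption at rest doesn't hide this: the recipient, arrival time, and ciphertext length are all visible without any key.

```python
import time
from dataclasses import dataclass

@dataclass
class StoredEnvelope:
    recipient: str      # sealed sender hides the *sender*, not the recipient
    received_at: float  # wall-clock arrival time (Unix timestamp)
    ciphertext: bytes   # opaque encrypted payload; only its length leaks

def metadata_view(env: StoredEnvelope) -> str:
    # Everything used here is readable without decrypting anything.
    when = time.strftime("%H:%M:%S", time.gmtime(env.received_at))
    return (f"{env.recipient} received a message from some party "
            f"at {when} of roughly {len(env.ciphertext)} bytes")

env = StoredEnvelope("x", 0.0, b"\x00" * 1400)
print(metadata_view(env))
# → x received a message from some party at 00:00:00 of roughly 1400 bytes
```

Padding messages to fixed bucket sizes and batching delivery would blunt the size and timing signals, but that's a design choice the server operator has to make (and that you have to trust they made).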
Also, there is competition like Briar, which has fewer of those pesky metadata problems (but some other problems instead).
Sealed sender is also based on the pinky-swear that the infrastructure distributing the sender auth certificates doesn't correlate identities and connections with the messaging infrastructure, and that the server receiving the enveloped messages doesn't log. So it all comes down to trusting that the right source code is running somewhere.
When access to that source code is restricted suddenly, of course people are worried.
The app store is a single point of failure with huge reach.
But despite the community's best efforts to verify builds, Google and Apple can be forced to ship a malicious app to a particular user, meaning that user isn't running the same app at all.
Hi there! Signal-Android developer here. App signing verification is done at the OS-level, and Google does not have our signing key, so they wouldn't be able to give an existing user a different APK and have it successfully install.
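The key-pinning behavior described here can be modeled with a small sketch (a deliberately simplified toy, not Android's actual PackageManager logic): the OS records the signing certificate's digest at first install, and an update only installs if it's signed by the same certificate.

```python
import hashlib

# package name -> pinned digest of the signing certificate
installed: dict[str, str] = {}

def cert_digest(cert: bytes) -> str:
    return hashlib.sha256(cert).hexdigest()

def install(package: str, signing_cert: bytes) -> bool:
    """Return True if the OS would accept this install/update."""
    digest = cert_digest(signing_cert)
    pinned = installed.get(package)
    if pinned is not None and pinned != digest:
        # Update signed with a different key: rejected at the OS level.
        return False
    installed[package] = digest
    return True

developer_key = b"signal-developer-cert"   # hypothetical cert bytes
attacker_key = b"some-other-cert"

assert install("org.thoughtcrime.securesms", developer_key)      # first install
assert install("org.thoughtcrime.securesms", developer_key)      # normal update
assert not install("org.thoughtcrime.securesms", attacker_key)   # swapped APK fails
```

Which is why the realistic worry isn't a silent swap of an existing install, but a malicious APK served to someone installing fresh (who has no pinned key yet), or platform-level coercion below the package manager.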
They've been known to reset passwords remotely in the past: https://www.theverge.com/2016/3/30/11330892/fbi-google-andro...