Signal isn’t federated. Signal has centralized servers. Signal requires phone number identification to use it. Signal stores your encryption key on their servers, relying on SGX enclaves to ‘protect’ it.
Signal can go down. Signal knows who you talk to, just from message timing. Signal knows how frequently you talk to someone. Signal could decrypt your traffic by attacking its own SGX enclaves and extracting your encryption key.
These are all possible threats and capabilities. You have to decide what tradeoff makes sense to you. FWIW, I still use Signal.
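To make the metadata point concrete, here is a toy sketch (not Signal’s code; the names, timestamps, and window are entirely made up) of the kind of timing correlation any central relay could run over its own delivery logs:

```python
# Hypothetical example: pair each inbound submission with the outbound
# deliveries that follow it within a short window.
from datetime import datetime, timedelta

inbound = [("alice", datetime(2024, 1, 1, 12, 0, 0))]                 # who uploaded, when
outbound = [("bob", datetime(2024, 1, 1, 12, 0, 0, 150_000))]         # delivered 150 ms later

WINDOW = timedelta(milliseconds=500)

links = [
    (sender, recipient)
    for sender, t_in in inbound
    for recipient, t_out in outbound
    if timedelta(0) <= t_out - t_in <= WINDOW
]
print(links)  # [('alice', 'bob')], a plausible social-graph edge
```

Repeated over enough traffic, this kind of correlation recovers who talks to whom and how often, without ever touching message content.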
That would surprise me. What’s your source for this?
https://signal.org/blog/secure-value-recovery/
master_key is never stored on or sent to the SGX enclave; only c2, the random entropy bits, is. The user’s password is still required to generate the key.
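For reference, a minimal sketch of the derivation scheme that blog post describes. The labels, the salt, and the KDF are approximations (PBKDF2 stands in for Argon2 to keep this standard-library only); this is not Signal’s actual code:

```python
import hashlib, hmac, os
from hashlib import pbkdf2_hmac

def hmac_sha256(key: bytes, msg: bytes) -> bytes:
    return hmac.new(key, msg, hashlib.sha256).digest()

def derive(pin: str) -> tuple[bytes, bytes]:
    # Stretch the (possibly low-entropy) PIN; Signal uses Argon2 here.
    stretched_key = pbkdf2_hmac("sha256", pin.encode(), b"salt", 100_000, 32)

    c1 = hmac_sha256(stretched_key, b"Master Key Encryption")
    c2 = os.urandom(32)               # random entropy, stored in SVR/SGX
    master_key = hmac_sha256(c1, c2)  # derived locally, never uploaded

    return master_key, c2             # only c2 leaves the device

# Someone who obtains c2 still needs the PIN-derived c1 to recompute
# master_key, so the PIN's entropy is what the scheme ultimately rests on.
```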
Brute-forcing 4–6 digit PINs is trivial.
And even if the user set an actual password, it’s still very trivial.
https://blog.cryptographyengineering.com/2020/07/10/a-few-thoughts-about-signals-secure-value-recovery/
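For a sense of scale, a rough sketch of the offline guessing loop that post describes, assuming an attacker has dumped c2 and a hash of the PIN-derived authentication key from the enclave. The function names, labels, salt, and the PBKDF2 stand-in for Argon2 are all assumptions, not Signal’s code:

```python
import hashlib, hmac
from hashlib import pbkdf2_hmac
from itertools import product

def hmac_sha256(key: bytes, msg: bytes) -> bytes:
    return hmac.new(key, msg, hashlib.sha256).digest()

def crack(c2: bytes, auth_key_hash: bytes, digits: int = 4) -> bytes | None:
    # Try every PIN of the given length: 10**digits candidates in total.
    for candidate in product("0123456789", repeat=digits):
        pin = "".join(candidate)
        stretched = pbkdf2_hmac("sha256", pin.encode(), b"salt", 100_000, 32)
        auth_key = hmac_sha256(stretched, b"Auth Key")
        if hashlib.sha256(auth_key).digest() == auth_key_hash:
            # PIN found: recompute the master key from the stolen c2.
            c1 = hmac_sha256(stretched, b"Master Key Encryption")
            return hmac_sha256(c1, c2)
    return None
```

At 10**4 candidates, even a deliberately slow KDF finishes quickly; 10**6 is still very feasible offline. A randomly generated BIP39-style mnemonic (~128 bits) is out of reach of this kind of search.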
“Very trivial” if they set a proper password? Yet the source you provide says it’s robustly secure.
I can’t find the phrase “robustly secure” in the last link:
https://blog.cryptographyengineering.com/2020/07/10/a-few-thoughts-about-signals-secure-value-recovery/
Signal asks users to set a PIN/password which needs to be periodically re-entered. This discourages people from using high-entropy passphrases like BIP39 mnemonics.
The password is literally a PIN.
If you set a short PIN, perhaps. Most people set a password.
A PIN is the suggested option, so I really doubt “most” people choose a password.
Most people who care*, I guess, would be more apt.
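A quick back-of-the-envelope comparison of the two options (the 128-bit figure assumes a standard 12-word BIP39 mnemonic; the rest is just arithmetic):

```python
import math

pin_candidates = 10 ** 4          # every possible 4-digit PIN
mnemonic_bits = 128               # entropy of a standard 12-word BIP39 mnemonic

print(math.log2(pin_candidates))  # ~13.3 bits: exhaustible in a single pass
print(2 ** mnemonic_bits)         # ~3.4e38 candidates: not exhaustible offline
```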
People who really care can disable the PIN. I believe the client will then generate a random BIP39-style password and use that for the data encrypted in SVR. But all the data is still uploaded to the cloud, so if there’s a problem with the SVR encryption, the password generation, etc., the data is still exploitable.
Not only do you have to care, everyone you talk to has to do the same thing, because if your counterparty has their key in the cloud, the conversation is at risk.
So my takeaways from this link and other critiques have been:
1. Signal doesn’t upload your messages anywhere, but things like your contacts (e.g. people whose username/identifier, but not phone number, you know) can get backed up online. As Signal put it:
One challenge has been that if we added support for something like usernames in Signal, those usernames wouldn’t get saved in your phone’s address book. Thus if you reinstalled Signal or got a new device, you would lose your entire social graph, because it’s not saved anywhere else.
2. You can disable this backup and fully avert this issue. (You’ll lose registration lock if you do this.)
3. Short PINs should be considered breakable, and if you’re on this subreddit you should probably use a relatively long passphrase like a BIP39 mnemonic or some similar randomly generated one. As the linked critique puts it:
If an attacker is able to dump the memory space of a running Signal SGX enclave, they’ll be able to expose secret seed values as well as user password hashes. With those values in hand, attackers can run a basic offline dictionary attack to recover the user’s backup keys and passphrase. The difficulty of completing this attack depends entirely on the strength of a user’s password. If it’s a BIP39 phrase, you’ll be fine. If it’s a 4-digit PIN, as strongly encouraged by the UI of the Signal app, you will not be.
4. SGX should probably also be considered breakable, although it does appear to be a genuine effort to prevent data from leaking. From the same critique:
The various attacks against SGX are many and varied, but largely have a common cause: SGX is designed to provide virtualized execution of programs on a complex general-purpose processor, and said processors have a lot of weird and unexplored behavior. If an attacker can get the processor to misbehave, this will in turn undermine the security of SGX.
One nit to pick: messages have to transit the Signal network, and they could be recorded in transit, Carnivore-style.
True, but that’s more or less out of the scope of this thread. I could go on for way longer about centralized versus federated services…
Many assertions without any proof. Could you at least point to sources for these statements?
https://github.com/dessalines/essays/blob/main/why_not_signal.md
Also, most of the points in the message you replied to are abstract and don’t need any citation. Do you want a source for Signal being centralized, or for Signal having the ability to track you?
Everything in that post makes perfect sense; the proof is in knowing how these systems work, Signal’s source code, and details from Signal themselves. I can go into more detail on each point when I’m at a computer; my phone kills processes in a few seconds when I try to multitask, which makes it nearly impossible to write long posts on mobile if I have to go back and forth to copy and paste. Is there any claim in particular you’d like me to explain, or shall I just do the lot? Edit: Ah, OP got it, never mind!
Also, I should point out that I use Signal pretty much exclusively for messaging. This isn’t hate, I’m just aware of its weaknesses.
I just read the post you linked by Signal. Note the use of the word “plaintext”:
we don’t have a plaintext record of your contacts, social graph, profile name, location, group memberships, groups titles, group avatars, group attributes, or who is messaging whom.
Whenever someone qualifies a statement like this without clarifying it, it’s clear they’re trying to obfuscate something.
I don’t need to dig into the technical details to know it’s not as secure as they like to present it.
Thanks. I didn’t realize they were so disingenuous. This also explains why they stopped supporting SMS - it didn’t transit their servers (they’d have to add code to capture SMS, which people would notice).
They now seem like a honeypot.
They are very much not. Anyone who tells you this is a state influencer or someone who believed a state influencer.
Saying something has the capabilities of a honeypot is the correct thing to do when we’re assessing our threat model.
Is it a honeypot? I don’t know. It’s unknowable. We have to acknowledge the actual capabilities of the software as written, the data flows, and the organizational realities.
My concern is that people stay away from Signal in favor of unencrypted privacy nightmares. It happened with DDG a while back: I knew people who used Google because DDG had privacy issues. It sounds dumb, but it’s a true story.
Sure. I still encourage people to use Signal. Most people don’t have a threat model that makes the honeypot scenario a viable threat. In this thread we are talking about its downsides, which is healthy to do from time to time. Acknowledging capabilities is a good exercise.
Excuse me, what? Signal can extract your encryption key how, exactly?
They have your key in an SGX enclave. You only need to look at the rich history of side-channel attacks, the known critical SGX vulnerabilities, or just the fact that Intel can sign arbitrary code that can run in the enclave, which means they could be compelled to do so with the cooperation of a government.
https://dl.acm.org/doi/fullHtml/10.1145/3456631
https://nvd.nist.gov/vuln/search/results?form_type=Basic&results_type=overview&query=SGX&search_type=all&isCpeNameSearch=false
I’m not saying they do, but they have the capability, which needs to be accounted for in your threat model.
At the end of the day, people are entrusting their encryption keys to the Signal Foundation for storage in the cloud. That needs to be part of the threat model.
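To illustrate the trust-anchor point, here is a deliberately simplified sketch; it is not Intel’s or Signal’s actual attestation protocol, and every name and value in it is hypothetical. What the client verifies reduces to a measurement plus a signature chain rooted in the vendor’s key, and that check says nothing about side channels on the host:

```python
from dataclasses import dataclass

@dataclass
class AttestationReport:
    enclave_measurement: str   # hash of the code the enclave claims to run
    signed_by: str             # which key vouches for the report

# Hypothetical constants for illustration only.
TRUSTED_ROOT = "vendor-attestation-root-key"
EXPECTED_MEASUREMENT = "hash-of-expected-svr-enclave-build"

def client_accepts(report: AttestationReport) -> bool:
    # The client can only check that the report is vouched for by the
    # vendor's root and that the measurement matches what it expects.
    # It cannot see whether the host leaks enclave memory via a side channel.
    return (report.signed_by == TRUSTED_ROOT
            and report.enclave_measurement == EXPECTED_MEASUREMENT)
```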
I read some of your other comments too. This is insane. I’ve always hated Signal, but this is another reason on top. No wonder the CIA funded them for 10 years.
Signal is still secure. If it weren’t, it wouldn’t be used in military applications.
Secure within the context of a certain threat model.
As an example, the French government does not endorse Signal for government communication.
And I strongly suspect the Russian government would not use Signal either.
I cite both of these as examples of threat models that can’t ignore some of Signal’s potential capabilities.
In the US, government organizations are trying to protect themselves from each other and from themselves. (It’s messy.)
Not to say that Signal is perfect (it’s not), but if the DoD recommends it and has guidance on how to harden it, then it can’t be too bad.