Signal is finally tightening its desktop client's security by changing how it stores the plaintext encryption key for its data store, after downplaying the issue since 2018.
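Roughly, the change amounts to wrapping the database key with the OS keychain instead of leaving it sitting in a readable config file. Here's a minimal sketch of the before/after, assuming Electron's safeStorage API and a hypothetical config.json layout (this is illustrative, not Signal's actual code):

```ts
import { app, safeStorage } from "electron";
import { readFileSync, writeFileSync } from "node:fs";
import { join } from "node:path";

// Hypothetical location of the desktop client's config file.
const configPath = join(app.getPath("userData"), "config.json");

function storeKeyPlaintext(key: string): void {
  // Old pattern: any process running as the same user can read this file
  // and decrypt the local message database with the key inside it.
  writeFileSync(configPath, JSON.stringify({ key }));
}

function storeKeyWithOsKeychain(key: string): void {
  // Hardened pattern: the key is encrypted with a secret held by the OS
  // (Keychain on macOS, DPAPI on Windows, libsecret/kwallet on Linux).
  if (!safeStorage.isEncryptionAvailable()) {
    throw new Error("OS-level encryption is not available on this system");
  }
  const wrapped = safeStorage.encryptString(key).toString("base64");
  writeFileSync(configPath, JSON.stringify({ encryptedKey: wrapped }));
}

function loadKey(): string {
  const cfg = JSON.parse(readFileSync(configPath, "utf8"));
  return cfg.encryptedKey
    ? safeStorage.decryptString(Buffer.from(cfg.encryptedKey, "base64"))
    : cfg.key; // fall back to the legacy plaintext field
}
```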
A company that requires using a phone number prides itself in security?
privacy != anonymity != security
But in some way, privacy ≈ security. Very intertwined.
Privacy is not anonymity though. Privacy simply means that private data is not disclosed to, or used by, parties and for purposes that the data owner doesn’t explicitly allow. Often, not collecting data is a way to ensure no misuse (and no compromise, hence security), but that’s not necessarily always the case.
Right, and often for that to be the case, the transferring and storing of data should be secure.
I’m mostly just pointing out the fact that when you do x ≠ y ≠ z, it can still be the case that x = z, e.g. 4 ≠ 3 ≠ 4.
Just nitpicking, perhaps.
What’s the vulnerability with Signal and phone numbers?
It’s better now, but for years and years all they used for contact discovery was simple hashing of phone numbers… the problem is that the keyspace of phone numbers is tiny, so it was easy to generate a rainbow table of all the phone number hashes in a matter of hours. Then anyone with access to the hosts (either hackers, or the US state via AWS collaboration) had access to the entire social graph.
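To give a sense of how small “tiny” is: ten-digit phone numbers give roughly 10^10 possible inputs, nothing compared to a real password space. A minimal sketch of the attack, assuming unsalted SHA-256 over E.164-formatted numbers (not Signal’s exact historical scheme):

```ts
import { createHash } from "node:crypto";

const sha256 = (s: string) => createHash("sha256").update(s).digest("hex");

// Build a lookup table for one hypothetical prefix block (100k numbers here;
// covering all ~10 billion NANP numbers is the same loop and only takes hours
// of CPU time plus commodity storage).
const table = new Map<string, string>();
for (let n = 0; n < 100_000; n++) {
  const phone = "+141555" + n.toString().padStart(5, "0");
  table.set(sha256(phone), phone);
}

// A hash seen server-side (or pulled from a compromised host) maps straight
// back to a phone number, and therefore to an edge in the social graph.
const observedHash = sha256("+14155512345"); // stand-in for a leaked hash
console.log(table.get(observedHash));        // -> "+14155512345"
```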
Yeah the way I remember it, they put a lot of effort into masking that social graph. That was a while back too, not recent.
What I’m saying though is that for the longest time they didn’t, and when they changed the technique they hardly acknowledged that it was a problem in the past and that essentially every user’s social graph had been compromised for years.
Signal, originally known as TextSecure, worked entirely over text messages when it first came out. It was born of a different era, and securing communication data was the only immediate goal because at the time everything was basically viewable by anyone with enough admin rights on basically every platform. Signal helped popularize end-to-end encryption (E2EE) and dragged everyone else along with them. Very few services at the time even advertised E2EE, private metadata, or social graph privacy.
As they’ve improved the platform, they continue to make incremental changes to enhance security. This is not a flaw; this is how progress is made.
Bam!
Did it get leaked or something?