|
In order to ensure that this thread meets TL standards and follows the proper guidelines, we ask that everyone please adhere to this mod note. Posts containing only Tweets or articles add nothing to the discussion. Therefore, when providing a source, explain why you feel it is relevant and what purpose it serves in the discussion. Also take note that unsubstantiated tweets/posts meant only to rekindle old arguments will be actioned upon. All in all, please continue to enjoy posting in TL General and partake in discussions as much as you want! But please be respectful when posting or replying to someone. There is a clear difference between constructive criticism/discussion and just plain being rude and insulting. https://www.registertovote.service.gov.uk |
On March 28 2017 00:35 LightSpectra wrote:On March 28 2017 00:30 Acrofales wrote:On March 28 2017 00:23 LightSpectra wrote:On March 28 2017 00:18 Plansix wrote: But private, portable, anonymous communication that is completely immune to government requests/orders for access is a pretty terrible idea too. I don't see why. Programs like Signal and WhatsApp are just time savers, for convenience really. Anybody can invent an encryption scheme that's impossible for any person or government to crack. Agreed. I think WhatsApp/Facebook/etc. should work with authorities to allow things like wiretaps in their services (this could be done by targeted removal of the encryption scheme, or by switching the encryption key with a dud, which the user *should* not know was happening). That's functionally a backdoor and is terrible for all of the same reasons. Any communication app that does not warn you when the contact's public key has been altered is just as worthless as an unencrypted one. On March 28 2017 00:34 Plansix wrote: I don’t think private communication is bad or something people shouldn’t have. I think it is a product people should be able to get. I’m not convinced it should be a free app that can be downloaded onto every phone, with global reach.
So what's your argument, privacy shouldn't be democratized but a privilege of the few? Like I said, any person, even a stupid person, can invent an encrypted communication protocol that is unbreakable by any person or government. Programs like WhatsApp are just saving time. If they get backdoored, bad people will revert to word salad, but regular people will have to suffer unwarranted surveillance.
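LightSpectra's point about key-change warnings boils down to trust-on-first-use: remember each contact's key fingerprint and complain loudly if it ever changes. A minimal sketch (the names and in-memory storage are illustrative, not any real messenger's API):

```python
import hashlib

# Hypothetical trust-on-first-use store: contact -> fingerprint seen before.
known_fingerprints = {}

def fingerprint(public_key_bytes):
    """Short hex digest of a contact's public key."""
    return hashlib.sha256(public_key_bytes).hexdigest()[:16]

def check_contact_key(contact, public_key_bytes):
    """Return True if this is the key we saw before (or first contact).

    A secure messenger must surface a loud warning when this returns
    False: a silently swapped key is exactly the "dud key" wiretap
    being discussed above.
    """
    fp = fingerprint(public_key_bytes)
    previous = known_fingerprints.get(contact)
    if previous is None:
        known_fingerprints[contact] = fp  # first use: trust and remember
        return True
    return previous == fp
```

With a check like this in place, a server-side key swap is detectable by the endpoints, which is why the "user should not know" variant amounts to removing the warning.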
It's not the same thing at all. In fact, it makes me wonder if you even know what a backdoor is.
What the government is saying now (and in the San Bernardino Apple case in the US too) is: if a posteriori we get a phone, we want to be able to get at all the encrypted data on it. To be able to do this, you'd have to compromise ALL security for everybody, because you cannot a priori say which phones you'll want to take data from a posteriori.
What the government SHOULD want is to be able to say a priori (as with current wiretaps) that they are interested in communication to/from that phone. This doesn't compromise security for all phones, but only for those that they can get a court order for. There is no need for a "master" key or a backdoor: whatsapp can "switch out" their entire app for a compromised app in a regular update. But they don't have to do that to everybody.
And hackers? If hackers have that kind of control over WhatsApp's systems (and the Google and Apple systems, which would probably need to be used for this type of thing), they can already do that.
|
I am not of the opinion that every citizen in a country needs access to encrypted, anonymous communication with unlimited reach from their phone. Encrypted communication for known individuals in a database kept by the person offering that service, maybe.
Communication no one can access presents problems for a number of fields, but especially law. Anonymous communication is also problematic for law. And with the rise of smartphones, it has only gotten worse. It was hard enough proving that someone sat at the family computer and sent an email/message. Now the computer is portable.
I understand the concerns of folks in the tech industry about encryption and how important it is. And their frustration with people outside the tech industry not understanding why a backdoor is a terrible idea. I just wish people in the tech industry understood all the problems their products create for everyone else trying to make systems that work around them. Like the legal field and law enforcement.
|
On March 28 2017 00:52 Plansix wrote: Communication no one can access presents problems for a number of fields, but especially law. Anonymous communication is also problematic for law. And with the rise of smartphones, it has only gotten worse. It was hard enough proving that someone sat at the family computer and sent an email/message. Now the computer is portable.
I'm sorry to break the news to you but inaccessible, anonymous communication has existed before smartphones, and will continue to exist even if Apple, Microsoft, Google, and Facebook are compromised.
So we have a choice, either we can make everybody secure (including innocent people); or we can make everybody insecure by default, with the option for any person--innocent or criminal-- to become secure with just a little bit of effort.
Since I believe in the principles of "innocent until proven guilty" and "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated", I know which side I'm on.
On March 28 2017 00:50 Acrofales wrote: What the government is saying now (and in the San Bernardino Apple case in the US too) is: if a posteriori we get a phone, we want to be able to get at all the encrypted data on it. To be able to do this, you'd have to compromise ALL security for everybody, because you cannot a priori say which phones you'll want to take data from a posteriori.
Correct so far...
What the government SHOULD want is to be able to say a priori (as with current wiretaps) that they are interested in communication to/from that phone. This doesn't compromise security for all phones, but only for those that they can get a court order for. There is no need for a "master" key or a backdoor: whatsapp can "switch out" their entire app for a compromised app in a regular update. But they don't have to do that to everybody.
But if Facebook CAN do that for anybody using WhatsApp, that's a backdoor, and it de facto means they can use it against everybody, with or without a warrant, and also that bad people can use that same exploit as well.
Your proposed backdoor is not an original idea, the NSA and similar organizations have been using malicious updates as an exploit for about two decades. There are safeguards against it. Some of those have been overcome, for some it appears not yet. In the case of the San Bernardino shooter's iPhone 5c, the FBI wanted to use Apple's private key in order to do a malicious patch which would bypass the lock screen. A second ago you said that's bad because it makes everybody's phones vulnerable to the same thing (which is true). But what you're proposing for WhatsApp is literally the same thing.
Repeat after me: there are no exploits that only work when there's a court order. None. There is no such thing and it's doubtful there ever will be.
|
On March 28 2017 01:19 LightSpectra wrote: So we have a choice, either we can make everybody secure (including innocent people); or we can make everybody insecure by default, with the option for any person--innocent or criminal--to become secure with just a little bit of effort. Since I believe in the principles of "innocent until proven guilty" and "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated", I know which side I'm on. This is the sort of "you don't understand our magical ways silly pleb, so let us do what we want. We are protecting everyone from your stupid ideas" I'm talking about. This is why people who are not in the tech field are so gung-ho about regulating encryption. Because every time someone tries to talk about the very real problems it creates when released on the mass market, both civil and criminal, they get told they are stupid and don't get it. So then laws get passed forcing these companies to care about the problems they are creating.
|
On March 28 2017 01:30 Plansix wrote: This is the sort of "you don't understand our magical ways silly pleb, so let us do what we want. We are protecting everyone from your stupid ideas" I'm talking about. This is why people who are not in the tech field are so gung-ho about regulating encryption. Because every time someone tries to talk about the very real problems it creates when released on the mass market, both civil and criminal, they get told they are stupid and don't get it. So then laws get passed forcing these companies to care about the problems they are creating.
I'm sorry if I come off condescending, but what do you want us to do? We've been trying to drill it in peoples' heads forever now, that:
1. It's impossible to make an exploit only the good guys can use.
2. It's impossible to take away all of the un-backdoored encryption software and encryption methods from the public domain so bad people can't use them. If you backdoor some software, bad people will just go to those.
3. It's impossible to make an exploit that only works when there's a valid legal warrant.
So yeah, here's what it really comes down to: either we can make everybody secure by default; or we can make everybody insecure by default, with the option for any person--innocent or criminal-- to become secure with just a little bit of effort.
It's really that simple and I don't know what you want me to do about it. People like Donald Trump and Dianne Feinstein throw temper tantrums and accuse tech companies of treason, but no matter what you do about it, the above facts aren't going to change.
|
I have news for you. Facebook CAN already do that. They have total control over the WhatsApp software. If Facebook were to push an update to you tomorrow that instead of encrypting your texts, sent a plain text copy to your worst enemy, the first you would know about it is when your worst enemy uses them against you.
The reason they don't is because it is not in their best interest.
But let's take a step back, because I am not sure you actually know what you're talking about. You speak about exploits and backdoors, but I am talking about a modern equivalent to wiretapping.
Do you agree with me that it is a good thing that government can wiretap phones? And that, if we can sort out internet messaging services to be equally secure to the general public as phone conversations, and get "wiretapping" in there as well, that would be a perfectly decent solution?
Note, that I am explicitly not talking about cracking open a confiscated phone, which is a completely different problem.
|
On March 28 2017 01:37 LightSpectra wrote: 1. It's impossible to make an exploit only the good guys can use. 2. It's impossible to take away all of the un-backdoored encryption software and encryption methods from the public domain so bad people can't use them. If you backdoor some software, bad people will just go to those. 3. It's impossible to make an exploit that only works when there's a valid legal warrant. So yeah, here's what it really comes down to: either we can make everybody secure by default; or we can make everybody insecure by default, with the option for any person--innocent or criminal--to become secure with just a little bit of effort.
If we are still talking about anything client-server, you are wrong on all three counts. Especially since, if you do not control the server, you have already outsourced your security to a third party. And that is exactly the case with messaging services (and any cloud service).
If you are talking about a sealed off system (like a lone iPhone) you are right. Although even in this case, it is possible to create a compromised iPhone without compromising all iPhones. It's just not possible to create a button that compromises a previously uncompromised iPhone (at a distance), without compromising all iPhones.
|
On March 28 2017 01:42 Acrofales wrote: I have news for you. Facebook CAN already do that. They have total control over the WhatsApp software. If Facebook were to push an update to you tomorrow that instead of encrypting your texts, sent a plain text copy to your worst enemy, the first you would know about it is when your worst enemy uses them against you.
If what you're saying is true, then WhatsApp is already backdoored.
Although it's an easily mitigated backdoor, I suppose, since one could simply turn off automatic updates.
But let's take a step back, because I am not sure you actually know what you're talking about. You speak about exploits and backdoors, but I am talking about a modern equivalent to wiretapping.
Do you agree with me that it is a good thing that government can wiretap phones? And that, if we can sort out internet messaging services to be equally secure to the general public as phone conversations, and get "wiretapping" in there as well, that would be a perfectly decent solution?
Honestly it doesn't really sound like you know what you're talking about. Wiretaps work because the data (i.e. the conversation over the phone) is not encrypted in transit. You can do the same thing to any website that goes over HTTP.
Yes I think it's a good idea for the government to wiretap phones, since average folk should be made very aware that they're highly vulnerable. That's why terrorists don't use normal phones for conversations: they either use encrypted digital communications (which, I remind you, if whatever app they're using proves to be backdoored, they will switch to another channel), or burner phones.
|
On March 28 2017 01:37 LightSpectra wrote: 1. It's impossible to make an exploit only the good guys can use. 2. It's impossible to take away all of the un-backdoored encryption software and encryption methods from the public domain so bad people can't use them. If you backdoor some software, bad people will just go to those. 3. It's impossible to make an exploit that only works when there's a valid legal warrant. So yeah, here's what it really comes down to: either we can make everybody secure by default; or we can make everybody insecure by default, with the option for any person--innocent or criminal--to become secure with just a little bit of effort.
Not to be condescending, but you fail to get the basic point. Security so strong that not even the creators can break it may not be compatible with the criminal justice system we have in place in the EU and US. Or the efforts to address national security.
And we are not talking about just terrorists. This type of encryption is a problem for even low-level crimes. If someone is smart enough to keep everything on their iPhone and refuse to unlock it, even a simple harassment case can be so cumbersome for law enforcement that they can't do anything. Civil litigation has similar problems, though to a lesser extent.
|
On March 28 2017 01:51 Plansix wrote: Not to be condescending, but you fail to get the basic point. Security so strong that not even the creators can break it may not be compatible with the criminal justice system we have in place in the EU and US. Or the efforts to address national security.
Maybe it's my fault for poorly communicating, but what I'm trying to get through to you is that anybody in the world with nothing more than paper and a pencil can make an encrypted protocol with security so strong that not even the creators can break it.
Programs like WhatsApp are encrypted in transit (i.e. the software handles the encryption for you). Even if they were all backdoored, that does nothing to stop the communicants from making the content encrypted. If you dump my phone and all of my text messages are word salad like "Parrot knucklehead triathlon Wichita", nobody in the world is going to compromise it. Period.
Even if you backdoored every communications app in the world, a terrorist/smuggler/whomever can still encrypt the content. The only difference is that you're making it less convenient. For bad people like terrorists/smugglers/whomever, that's not going to stop them. It's only going to stop ordinary people from encrypting the texts they send to their spouse and whatnot, opening them up to mass surveillance.
|
On March 28 2017 01:19 LightSpectra wrote: Repeat after me: there are no exploits that only work when there's a court order. None. There is no such thing and it's doubtful there ever will be.
This is an excellent principle for designing government spyware, even if it isn't foolproof.
Part 1: Anyone who can issue a warrant has cryptographic credentials. There's a government auth server for these. Part 2: Use DRM tech so that the spyware won't operate without a cryptographically signed e-warrant that specifies the device to be infected.
It won't stop the spyware from being modified, but it prevents people with access to the spyware from misusing it.
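Buckyman's two parts can be sketched as a gate that refuses to act unless presented with a warrant signed for the specific target device. This is only an illustration under stated assumptions: it uses an HMAC with a shared key as a stdlib stand-in for the issuer's real asymmetric signature, and every name here is hypothetical.

```python
import hashlib
import hmac
import json

# Hypothetical credential issued by the government auth server (Part 1).
# A real deployment would use the issuer's private signing key instead
# of a shared secret, so the spyware only needs the public half.
ISSUER_KEY = b"demo-issuer-key"

def sign_warrant(device_id, case_no):
    """Issuer side: produce a signed e-warrant for one specific device."""
    body = json.dumps({"device": device_id, "case": case_no}, sort_keys=True)
    sig = hmac.new(ISSUER_KEY, body.encode(), hashlib.sha256).hexdigest()
    return {"body": body, "sig": sig}

def may_infect(device_id, warrant):
    """Spyware side (Part 2): act only if the signature verifies AND the
    warrant names this exact device."""
    expected = hmac.new(ISSUER_KEY, warrant["body"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, warrant["sig"]):
        return False  # forged or tampered warrant
    return json.loads(warrant["body"])["device"] == device_id
```

Note the sketch also illustrates the rebuttal made elsewhere in the thread: anyone who can modify the spyware binary can simply delete the `may_infect` check, so the gate constrains honest operators, not attackers.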
|
There is a huge difference between someone creating their own personal cipher and a billion dollar company making encryption for a mass market product. Critically, the person creating the cipher has to teach it to the person they are communicating with, and it is unique to them.
|
On March 28 2017 01:56 Buckyman wrote: Part 1: Anyone who can issue a warrant has cryptographic credentials. There's a government auth server for these. Part 2: Use DRM tech so that the spyware won't operate without a cryptographically signed e-warrant that specifies the device to be infected. It won't stop the spyware from being modified, but it prevents people with access to the spyware from misusing it.
This is not an original idea: https://en.wikipedia.org/wiki/Key_escrow
It has all of the exact same flaws that all other backdoors do. The only difference is that the method you mention is slightly more sophisticated, so it will take a tad longer for the hacking community to bypass it.
And then, like in the scenario of every other backdoor, ordinary folk will be compromised, and bad people will not be.
|
On March 28 2017 01:58 Plansix wrote: There is a huge difference between someone creating their own personal cipher and a billion dollar company making encryption for a mass market product. Critically, the person creating the cipher has to teach it to the person they are communicating with, and it is unique to them.
That's no more burdensome than what terrorists/bad people are currently doing.
In the Parisian attacks of November 2015, the terrorists used burner phones. That means they were under the exact same limitations as a pair using a one-time pad.
Do you know how hard it is to make a one-time pad? Takes about a minute to google the command you have to type into your command prompt. Do you know how hard it is to teach someone to use one? Less than an hour if they're bright, possibly between 1-2 hours for somebody not-so-bright.
Is it worth compromising the communications of billions of innocent people in order to make terrorists have to spend ~2 hours to bypass it?
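For reference, the one-time pad being described really is that short. A sketch, with the usual caveats: the pad must be truly random, at least as long as the message, shared in advance over a secure channel, and never reused.

```python
import os

def make_pad(length):
    """Generate a pad of cryptographically random bytes."""
    return os.urandom(length)

def xor(data, pad):
    """XOR is its own inverse, so one function both encrypts and decrypts."""
    return bytes(a ^ b for a, b in zip(data, pad))

message = b"meet at the usual place"
pad = make_pad(len(message))
ciphertext = xor(message, pad)
recovered = xor(ciphertext, pad)  # same pad decrypts
```

With those conditions met the ciphertext is information-theoretically unbreakable; the hard part in practice is distributing the pad, which is why the burner-phone comparison above is apt.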
|
On March 28 2017 02:02 LightSpectra wrote: Is it worth compromising the communications of billions of innocent people in order to make terrorists have to spend ~2 hours to bypass it? Ok, you seem to be hyper-focused on this terrorism thing. That isn’t the problem. If it was just limited to terrorism, the government’s request wouldn’t be that reasonable.
The problem is that it’s all criminal law, from drugs, racketeering, extortion, child pornography and everything else. If someone committed arson and a substantial amount of evidence is on their phone, the government may never be able to gain access. They may not be able to convict without access. When the encryption is simply the act of locking your phone, it presents a real problem for law enforcement at all levels. And this goes beyond just communication to keeping records and planning.
Now the argument is that it is easy to get this level of encryption. That may be true, but most criminals make stupid mistakes and fuck up. And the act of seeking out that level of encryption is evidence unto itself. But it is not proof of much if it is the default.
|
On March 28 2017 01:48 LightSpectra wrote:Show nested quote +On March 28 2017 01:42 Acrofales wrote: I have news for you. Facebook CAN already do that. They have total control over the WhatsApp software. If Facebook were to push an update to you tomorrow that instead of encrypting your texts, sent a plain text copy to your worst enemy, the first you would know about it is when your worst enemy uses them against you. If what you're saying is true, then WhatsApp is already backdoored. Although it's an easily mitigated backdoor, I suppose, since one could simply turn off automatic updates. Show nested quote +But lets take a step back, because I am not sure you actually know what you're talking about. You speak about exploits and backdoors, but I am talking about a modern equivalent to wiretapping.
Do you agree with me that it is a good thing that government can wiretap phones? And that, if we can sort out internet messaging services to be equally secure to the general public as phone conversations, and get "wiretapping" in there as well, that would be a perfectly decent solution? Honestly it doesn't really sound like you know what you're talking about. Wiretaps work because the data (i.e. the conversation over the phone) is not encrypted in transit. You can do the same thing to any website that goes over HTTP. Let me slap my dick on the table then. I have a PhD in Computer Science. Not in security, but I did take some MSc. level courses in security. So yes, I know telephone conversations are not encrypted. And I know that what we're talking about are encrypted conversations. However, they are a specific form of encrypted conversations. More below.
Yes, I think it's a good idea for the government to wiretap phones, since average folk should be made very aware that they're highly vulnerable. That's why terrorists don't use normal phones for conversations: they either use encrypted digital communications (and, I remind you, if whatever app they're using proves to be backdoored, they will move to another channel) or burner phones.
I think you severely overestimate the average intelligence of a criminal. But yes, of course a smart criminal will take extra precautions.
Now let's say we want to listen in on an encrypted conversation. If we have an encrypted conversation between Alice and Bob, we are fucked. They have generated an encryption key and shared it through a secure channel (let's say a USB stick sent by post and not intercepted). Unless there are specific vulnerabilities in the protocol they use, or we can launch some other attack (for instance, hack Alice's computer and steal the key), we are never going to read what Alice and Bob are saying to each other.
However, that's not how WhatsApp works. WhatsApp (and all other similar services) come with some specific built-in vulnerabilities:
1. Neither Alice nor Bob are in charge of their encryption keys. They are provided by WhatsApp's software.
2. The messages are not routed directly from Alice to Bob, but are sent through WhatsApp's servers.
Now if we want to listen in on this conversation, we can modify this service fairly easily.
We push an update of the WhatsApp app to Alice (our wiretapping target). This app has encryption switched off, but the first thing it does is send Alice's private key to the WhatsApp server.
Now Alice communicates with the WhatsApp server in plain text. However, the WhatsApp server encrypts all of Alice's outgoing messages with the recipient's public key, and decrypts all incoming messages with Alice's private key.
All of Alice's communication is completely compromised. Nobody else's is.
The only problem is how to get this wiretapped WhatsApp app onto the wiretapping target's phone. Now WhatsApp could come with a built-in button that simply does that. When it receives a specific code from the WhatsApp server, it will activate the "wiretap" protocol.
Why is this still (far) more secure than phone communication? Because to attack this, you'd either need access to WhatsApp's servers, or a very sophisticated man-in-the-middle attack. In both cases you would also need access to the code to switch on the "wiretap" protocol.
Would it be less secure than WhatsApp is right now? Yes. But given that people were happy sharing intimate details of their lives over the phone, via email and plenty of other unencrypted channels, I am not sure that is a bad thing, because it is still FAR more secure than all those unencrypted systems.
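The flow described above can be sketched as a toy model in Python. XOR with an app-generated key stands in for real public-key encryption, and all the names here (Server, Client, apply_wiretap_update) are illustrative, not anything WhatsApp actually ships:

```python
def xor(data: bytes, key: bytes) -> bytes:
    """Toy stand-in for real encryption: XOR against a repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

class Server:
    def __init__(self):
        self.escrowed_keys = {}  # filled only after a "wiretap" update is pushed
        self.transcript = []     # plaintext the server can read in transit

    def relay(self, sender_name, recipient, ciphertext):
        key = self.escrowed_keys.get(sender_name)
        if key is not None:
            # sender is wiretapped: the server holds their key and can
            # decrypt the message as it passes through
            self.transcript.append(xor(ciphertext, key))
        recipient.receive(ciphertext)

class Client:
    def __init__(self, name, server):
        self.name, self.server = name, server
        self.key = name.encode() * 4  # key generated by the app, not the user
        self.inbox = []

    def send(self, recipient, plaintext: bytes):
        self.server.relay(self.name, recipient, xor(plaintext, self.key))

    def receive(self, ciphertext):
        self.inbox.append(ciphertext)

    def apply_wiretap_update(self):
        # the malicious update: quietly hand the private key to the server
        self.server.escrowed_keys[self.name] = self.key

server = Server()
alice, bob = Client("alice", server), Client("bob", server)
alice.apply_wiretap_update()      # pushed to the wiretapping target only
alice.send(bob, b"meet at noon")
assert server.transcript == [b"meet at noon"]  # Alice alone is compromised
```

The point of the toy is the key-escrow step: nothing about the wire format changes, so neither Alice nor Bob sees any difference, which is exactly why a warning on key changes (or a public audit of the client) matters.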
|
I'm using "terrorists" as shorthand for all bad people that we wish we could unconditionally monitor.
I'm aware that law enforcement has faced some difficulties catching bad people because of their ease of access to an encrypted communications app and phone. What I'm trying, and failing apparently, to do is to make it clear that even if you backdoor one or many communications apps, there's always going to be more alternatives for bad people to use.
Backdoor WhatsApp. Backdoor Skype. Backdoor iMessage. It makes no difference. You will never be able to backdoor open source programs like Signal, GPG, Off-The-Record, etc. And there's absolutely no way to backdoor the universal concept of one-time pads and other encryption methods that anybody in the world can learn about within minutes of a Google search.
So, again, the choice remains: either everybody is compromised by default, but good AND bad people can become secure with a bit of effort; or everybody is secure by default. Those are the options.
|
On March 28 2017 02:17 LightSpectra wrote: I'm using "terrorists" as shorthand for all bad people that we wish we could unconditionally monitor.
I'm aware that law enforcement has faced some difficulties catching bad people because of their ease of access to an encrypted communications app and phone. What I'm trying, and failing apparently, to do is to make it clear that even if you backdoor one or many communications apps, there's always going to be more alternatives for bad people to use.
Backdoor WhatsApp. Backdoor Skype. Backdoor iMessage. It makes no difference. You will never be able to backdoor open source programs like Signal, GPG, Off-The-Record, etc. And there's absolutely no way to backdoor the universal concept of one-time pads and other encryption methods that anybody in the world can learn about within minutes of a Google search.
So, again, the choice remains: either everybody is compromised by default, but good AND bad people can become secure with a bit of effort; or everybody is secure by default. Those are the options.
Absolutely. Encryption is here to stay. And smart criminals will be ahead of the curve and use whatever system is most convenient to them that ensures safe communication. However, smart criminals were already avoiding wiretaps the second wiretaps were invented (slightly after telephones were). Wiretaps still had enormous use in stopping average and dumb criminals.
|
The justice system functions because there is a reasonable expectation that any order to release information will be complied with. If bank records are needed to prove the state's case or mount a defense against it, the bank will provide them. The same with phone records or any other form of documentation. They may redact information, but they will still provide what they can and justify redacting the information they did.
Companies like Facebook, Google, WhatsApp and others subvert this expectation. They offer the ability to communicate through their services, but wash their hands of any responsibility to provide the government with access. They build locks and keys to ensure they don't have access, even to the data on their own servers. And then they make the argument that providing access to the data on the servers they control would be too risky for all their customers.
The argument the government and others have been making is about whether that is sustainable. Whether it can co-exist with companies that have actively created systems that make it impossible for them to comply with court orders, and tied those systems together with the argument that they are protecting their clients' privacy. Whether WhatsApp is any different from a phone company, which is required to keep records and provide them to the court if ordered to do so.
And all of these systems and services were designed to make a profit and place as little burden on the company as possible. To protect them from liability. The companies can argue that it is for the good of everyone, but that is also marketing: our services are so private even the government can't get in.
|
On March 28 2017 02:14 Acrofales wrote:Show nested quote +On March 28 2017 01:48 LightSpectra wrote:On March 28 2017 01:42 Acrofales wrote: I have news for you. Facebook CAN already do that. They have total control over the WhatsApp software. If Facebook were to push an update to you tomorrow that instead of encrypting your texts, sent a plain text copy to your worst enemy, the first you would know about it is when your worst enemy uses them against you. If what you're saying is true, then WhatsApp is already backdoored. Although it's an easily mitigated backdoor, I suppose, since one could simply turn off automatic updates. But lets take a step back, because I am not sure you actually know what you're talking about. You speak about exploits and backdoors, but I am talking about a modern equivalent to wiretapping.
Do you agree with me that it is a good thing that government can wiretap phones? And that, if we can sort out internet messaging services to be equally secure to the general public as phone conversations, and get "wiretapping" in there as well, that would be a perfectly decent solution? Honestly it doesn't really sound like you know what you're talking about. Wiretaps work because the data (i.e. the conversation over the phone) is not encrypted in transit. You can do the same thing to any website that goes over HTTP. Let me slap my dick on the table then. I have a PhD in Computer Science. Not in security, but I did take some MSc. level courses in security. So yes, I know telephone conversations are not encrypted. And I know that what we're talking about are encrypted conversations. However, they are a specific form of encrypted conversations. More below.
Great, then since you know so much about such an obvious solution, go make a couple million dollars as a government consultant to politicians and stop wasting my time.
I think you severely overestimate the average intelligence of a criminal.
Every terrorist attack that has succeeded in the past 15 years in the first world is proof that a couple of average guys with a negligible amount of resources can overcome an intercontinental multi-trillion-dollar surveillance apparatus.
The problem isn't that terrorists are geniuses, the problem is that encryption actually exists in nature and no force in the world can stop bad people from learning about it.
However, that's not how WhatsApp works. WhatsApp (and all other similar services) come with some specific built-in vulnerabilities:
1. Neither Alice nor Bob are in charge of their encryption keys. They are provided by WhatsApp's software.
2. The messages are not routed directly from Alice to Bob, but are sent through WhatsApp's servers.
My focus is not on WhatsApp per se but on really any encrypted communications app in widespread use (Signal, iMessage, et al.). But I'll play along.
Now if we want to listen in on this conversation, we can modify this service fairly easily.
We push an update of the WhatsApp app to Alice (our wiretapping target). This app has encryption switched off, but the first thing it does is send Alice's private key to the WhatsApp server.
Now Alice communicates with the WhatsApp server in plain text. However, the WhatsApp server encrypts all of Alice's outgoing communication with the corresponding public key. And decrypts all incoming messages with Alice's private key.
All of Alice's communication is completely compromised. Nobody else's is.
What you're not understanding is that this is in fact a backdoor, because Facebook can use this exact method to exploit ANYBODY using WhatsApp, not just people for whom they have a court order.
If what you're saying is right, then WhatsApp is compromised already and no sane person should use it. (I'm of that opinion since I don't trust any proprietary software at all.)
However, I don't think this is the case (I could be wrong), since no ability in the source code to silently send the private key to Facebook was uncovered during its security audit. So, as said before, there'd need to be a malicious update pushed to the phone in order to insert this backdoor. That can be mitigated easily in many ways.
Why is this still (far) more secure than phone communication? Because to attack this, you'd either need access to WhatsApp's servers, or a very sophisticated man-in-the-middle attack. In both cases you would also need access to the code to switch on the "wiretap" protocol.
So it's a backdoor, just one that takes a little bit more concentrated effort to overcome.
|