Why are politicians sometimes such idiots? The French recently voted to criminalise bosses of tech firms who refuse to decrypt user data when requested. Don’t worry, it’s not law yet, and probably never will be, but you have to wonder how hard it can really be to understand how encryption works.
The whole point of encryption as it is presently used is to make it impossible (technically, computationally infeasible) for anyone to decrypt your data unless they have the necessary private key. There is a clue in the word “private” in that sentence. In a properly constructed encryption mechanism, only you – the user – have the private key. To make things easy for you, it is usually hidden behind a friendly interface, like your fingerprint or a password, but nonetheless it’s there.
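That “friendly interface” idea can be sketched in a few lines of Python using the standard library’s PBKDF2 key-derivation function. The password, salt and iteration count here are hypothetical values chosen for illustration; real systems pick them per current security guidance.

```python
import hashlib
import os

# Hypothetical user password -- the "friendly interface" the user remembers.
password = b"correct horse battery staple"
salt = os.urandom(16)       # random salt, stored alongside the ciphertext
iterations = 200_000        # illustrative work factor

# Derive a 32-byte private key from the password. The key itself never
# leaves the device; the password merely unlocks this derivation.
key = hashlib.pbkdf2_hmac("sha256", password, salt, iterations, dklen=32)

# The same password and salt always reproduce the same key...
assert key == hashlib.pbkdf2_hmac("sha256", password, salt, iterations, dklen=32)
# ...while a different password yields a completely unrelated key.
assert key != hashlib.pbkdf2_hmac("sha256", b"wrong password", salt, iterations, dklen=32)
```

The point is that the private key is derived on the user’s device and never needs to be shared with anyone, the tech company included.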
If the tech company has done its job properly, they can’t decrypt your data. That’s the whole point. Otherwise anyone who penetrated their security, or any dodgy employee of the company, would have access to your private information. This is the point that Apple are making with regard to the security of the iPhone 6, but it’s true of any proper security implementation. It’s why key management is such an important topic when you use encryption in business – because if the private key leaks, all the data is compromised, but if the private key is lost, so is all the data – unrecoverably.
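The key-management point can be seen in a deliberately toy sketch: a one-time-pad-style XOR cipher (illustrative only – real systems use vetted ciphers such as AES, never hand-rolled XOR). With the exact key, decryption is trivial; with any other key, the ciphertext is just noise.

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy one-time pad: XOR each byte of the data with the key.
    # Encryption and decryption are the same operation.
    return bytes(d ^ k for d, k in zip(data, key))

plaintext = b"bank account: 12345678"
key = os.urandom(len(plaintext))          # the private key
ciphertext = xor_cipher(plaintext, key)

# With the key, the data comes straight back.
assert xor_cipher(ciphertext, key) == plaintext

# Without it, a guessed key yields garbage: if the key leaks, everything
# is compromised; if the key is lost, the data is gone with it.
wrong_key = os.urandom(len(plaintext))
assert xor_cipher(ciphertext, wrong_key) != plaintext
```

This is why key management matters so much: the key, not the algorithm, is the single point of both trust and failure.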
Why does this matter? Surely if you have nothing to hide you have nothing to fear? [How I hate that phrase!] Because you have lots to hide: your credit card and bank account numbers; your social security/national insurance number; your driving licence and passport details; your address and the dates when your house will be empty because you’re away. These are all pieces of information that could be used to impersonate you, defraud you or steal from you. They’re also all pieces of information that you routinely have to share with others – when buying things on-line, booking tickets, flying abroad, opening bank accounts, buying insurance, and so on.
All of these transactions are protected by encryption. If we weaken that encryption – by building in “backdoors” – then the transactions won’t be secure any more. Anything that allows someone to decrypt a data exchange without knowing the private key prevents the encryption from being properly effective. Over the last few years we’ve retired SSL v2, SSL v3, TLS 1.0 and SHA-1 (a bunch of security standards) precisely because we (the security biz) found weaknesses in them that allowed decryption. We’ve also increased key lengths from 64 bits to 2,048 – a much bigger difference than it sounds, since each extra bit doubles the number of possible keys – to keep up with improvements in computing power, so that we can remain confident that nobody can simply crack the encryption by brute force.
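The key-length point is worth doing the arithmetic on, because the growth is exponential, not linear. A quick sketch (the “one trillion guesses per second” attacker is a hypothetical figure for illustration):

```python
# Each extra key bit doubles the number of keys a brute-force attacker
# must try, so 64 -> 2048 bits is not 32x harder but 2^1984 times harder.
keys_64 = 2 ** 64
keys_2048 = 2 ** 2048

# At a (generous, hypothetical) trillion guesses per second, a 64-bit
# keyspace falls in under a year -- which is exactly why 64 bits is no
# longer enough. A 2048-bit keyspace is beyond any conceivable computation.
guesses_per_second = 10 ** 12
years_64 = keys_64 / guesses_per_second / (60 * 60 * 24 * 365)

print(f"64-bit keyspace exhausted in about {years_64:.2f} years")
print(f"2048-bit keyspace is 2^{2048 - 64} times larger")
```

(Strictly, 2,048-bit keys are used for public-key algorithms such as RSA, which are attacked differently from symmetric ciphers, but the brute-force arithmetic makes the scale of the difference clear.)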
Even with all this effort, we’re only just staying ahead of the cyber-criminals. Cyber-crime is a multi-billion-dollar industry dedicated to stealing your information – and your money. Are we seriously suggesting that all of that effort to protect the consumer should be thrown away just to provide more data to security agencies – when we know they’re already drowning in data they can’t use?