(originally posted 08.03.2016 on Google+)
Let’s assume Congress had passed a law requiring smartphone manufacturers to make their encrypted phones unlockable if requested by a legal order. How could smartphone manufacturers implement this?
Most internet activists argue this would require a “backdoor” which could always be exploited by others (criminals, for example), thus putting all ordinary users at risk. But this is simply not true, or at least a gross exaggeration. I will explain what a secure implementation could look like.
(The following is a bit technical; I assume you are somewhat familiar with encryption.)
The manufacturer would introduce three key pairs. The first is a per-device key: the manufacturer keeps a database with one private key per device. The second is a “master key” whose private half is hardware-protected (the key is not exportable; decryption is only possible with the hardware). The third key pair is stored only on the device itself.
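As a sketch, the three key pairs could look like this. This assumes Python with the pyca/cryptography package; the variable names and key sizes are my own illustration, not a real product design, and in a real product the master key would of course live inside a hardware module rather than in memory:

```python
from cryptography.hazmat.primitives.asymmetric import rsa

# 1. Per-device key: the private half lives only in the manufacturer's database.
device_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# 2. Master key: in production the private half would sit in non-exportable
#    crypto hardware; here it is an ordinary in-memory key for illustration.
master_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)

# 3. Phone key: generated and stored only on the device itself.
phone_key = rsa.generate_private_key(public_exponent=65537, key_size=4096)
```

The sizes increase on purpose: when the layers are nested later, each RSA layer must be large enough to hold the ciphertext produced by the layer below it.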
The goal is to store the user’s PIN or passphrase on the phone in encrypted form, so that only the manufacturer can decrypt it, while avoiding the usual attacks.
First, a salt of fixed length is appended to the user’s passphrase. This salt is a random number, created by the phone at the time the PIN is set or changed. The result is encrypted with all three public keys and stored on the phone (and perhaps backed up to the manufacturer -- the per-device key prevents the manufacturer from decrypting the PIN without physical access to the phone).
Then the random salt is deleted. This step ensures that the passphrase cannot be brute-forced: without the salt, an attacker cannot verify a guessed PIN, because he does not know which random bytes were appended before encryption. The salt itself is never needed again.
To recover the PIN, the keyfile is decrypted with all three private keys, and then the fixed-length salt is cut from the end. Voilà.
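The whole scheme, set and recover, can be sketched in a few lines. Again this assumes the pyca/cryptography package, and it is only a toy: all three keys are ordinary in-memory keys here, whereas in the real scheme the master key would sit in tamper-proof hardware and the device key only in the manufacturer’s database. The nesting order of the three keys is my own arbitrary choice:

```python
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
SALT_LEN = 32  # fixed salt length, so recovery knows how much to cut off

# Key sizes grow outward so each RSA-OAEP layer can hold the ciphertext of
# the layer below (a 2048-bit OAEP layer fits at most ~190 plaintext bytes).
device_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
master_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
phone_key = rsa.generate_private_key(public_exponent=65537, key_size=4096)

def escrow(pin: bytes) -> bytes:
    """Append a random salt, then encrypt with all three public keys."""
    blob = pin + os.urandom(SALT_LEN)  # the salt is discarded after this call
    for pub in (device_key.public_key(),
                master_key.public_key(),
                phone_key.public_key()):
        blob = pub.encrypt(blob, OAEP)
    return blob

def recover(blob: bytes) -> bytes:
    """Decrypt with all three private keys, then cut the fixed-length salt."""
    for priv in (phone_key, master_key, device_key):
        blob = priv.decrypt(blob, OAEP)
    return blob[:-SALT_LEN]

keyfile = escrow(b"1234")
assert recover(keyfile) == b"1234"
```

Note that recovery requires all three private keys at once, which is exactly the point: no single party, not even the manufacturer alone, can decrypt the keyfile.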
An attacker who wants to decrypt the PIN would need:
- physical access to the phone,
- a break-in into the manufacturer’s database, and
- physical access to the “master key” hardware.
An attacker that capable would rule the world anyway, wouldn’t he?
This implementation would ensure that the phone can only be unlocked with physical access to it, and only on the premises of the manufacturer. Possible good uses are not only law enforcement or foreign intelligence, but also cases where a user has forgotten his PIN or passphrase, or requests by his heirs after he has died.
And I deny that this would conflict with civil liberties. Every user who wants “extra security” could additionally encrypt his data with other tools, use a “custom ROM” on his phone, or jailbreak or root it and then remove the “key escrow” (I’m pretty sure that HOWTOs would emerge very quickly). And none of that would be illegal, since the law would only bind the manufacturers.
Civil liberties don’t entitle you to use a smartphone out of the box with a 4-digit PIN and still be “safe” from law enforcement acting on a legal warrant.
Some critics may say that bugs in algorithms or implementations could result in exploits. That’s correct, but we live with bugs anyway. Bugs are found and patched; that’s how security research works, and after each patched bug the system is more secure. Arguing from hypothetical bugs is absurd, because every crypto implementation so far has faced the same risk. It is nothing short of hypocritical to argue on the one hand that crypto is needed to make us all safer, but to deny on the other hand that crypto can also be used to implement a secure “key escrow”.
What I wanted to show with this post is that a secure smartphone “key escrow” is possible, and I sketched a quick idea. There are likely much better designs, and the smartphone manufacturers surely employ much better crypto specialists than me (to be honest, I’m not really one). It is not that the manufacturers cannot implement a secure “key escrow”; they simply don’t want to. That is why I support lawmakers forcing them. I’m absolutely sure they would suddenly have some ideas then ...