Why "Good Enough" Encryption Is Not Good Enough

In privacy-community discussions, arguments often come up about key lengths, which ciphers are safe, and which are considered unsafe, and inevitably someone steps in to say the following:

"(Some cipher) is perfectly fine to use. There are no known ways to break (cipher) at (strength)."

This is dangerous thinking in the current surveillance environment. Agencies like the American NSA, the Russian FSB, the British GCHQ, and the Chinese MSS all run programs in which encrypted data is intercepted and stored away for the day it can be broken, a strategy often called "harvest now, decrypt later." The gamble is the same as the sci-fi trope of freezing a corpse until technology exists to revive it. The only difference is that computing and cryptanalysis move so fast that we are talking about years, not centuries, before the strategy bears fruit for the agency.

You only need to look at older ciphers to see how once "unbreakable" industry standards have become trivial to defeat. DES was the go-to encryption of the 1980s, much like AES is today. Today, cracking DES keys is considered trivial, thanks to advances in computer hardware (codebreaking machines simply get faster; the EFF's purpose-built Deep Crack recovered a DES key in about two days as far back as 1998) and new cryptanalytic techniques against the way DES works.

This means that critical data intercepted in the 80s and 90s is now trivial to decrypt, and any secret information contained in those data sets can be read in plaintext.
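
To make the erosion concrete, here is a back-of-the-envelope sketch in Python. The key-search rates are illustrative assumptions rather than benchmarks (the EFF's 1998 Deep Crack tested on the order of 9e10 DES keys per second); the point is how quickly a fixed keyspace shrinks against faster hardware.

```python
# Back-of-the-envelope: average time to brute-force a cipher's keyspace.
# The keys-per-second rates below are illustrative assumptions, not benchmarks.

def brute_force_years(key_bits: int, keys_per_second: float) -> float:
    """Expected years to find a key by exhaustive search, assuming
    half the keyspace is tried on average."""
    seconds = (2 ** key_bits / 2) / keys_per_second
    return seconds / (60 * 60 * 24 * 365)

# EFF's 1998 Deep Crack machine tested roughly 9e10 DES keys per second.
print(f"DES-56  on 1998 hardware: {brute_force_years(56, 9e10):.2e} years")

# A hypothetical modern cluster a thousand times faster:
print(f"DES-56  at 1e14 keys/sec: {brute_force_years(56, 1e14):.2e} years")

# The same search against a 128-bit keyspace remains astronomically large,
# which is why practical attacks target the algorithm, not the keyspace.
print(f"AES-128 at 1e14 keys/sec: {brute_force_years(128, 1e14):.2e} years")
```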

This discussion also often involves key lengths for RSA or Diffie-Hellman handshakes, or for PGP/GnuPG-based email encryption. It invariably boils down to someone saying that "1024-bit is good enough" and others balking at the comment.

The balking is justified. RSA and Diffie-Hellman have been broken in the real world at key lengths as large as 768 bits: RSA-768 was factored in 2009, and a 768-bit Diffie-Hellman discrete logarithm was computed in 2016 (elliptic-curve keys are shorter by design, and curves over fields of roughly 112 bits have also been broken). The underlying math differs by algorithm: RSA rests on the difficulty of factoring large integers, while Diffie-Hellman and elliptic-curve schemes rest on the discrete logarithm problem. Discrete logarithms have even been computed in fields as large as 6120 bits, although only in fields of small characteristic, which are not used in real-world systems and make the problem dramatically easier. The time needed to factor large numbers or compute discrete logarithms is only going to shrink as research and hardware progress.
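
As a toy illustration of why length is everything for RSA, the sketch below recovers the prime factors of a small modulus by plain trial division. The numbers are hypothetical; real attacks on 512- and 768-bit keys use the far more sophisticated Number Field Sieve, but the principle is identical: the modulus only protects you while factoring it stays infeasible.

```python
def factor(n: int) -> tuple[int, int]:
    """Recover the primes p, q of a toy RSA modulus n = p * q
    (n assumed odd) by trial division."""
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 2
    raise ValueError("no odd factor found")

# A hypothetical ~40-bit modulus built from two ~20-bit primes. It falls
# in well under a second on a laptop, yet it is structurally identical to
# a 2048-bit RSA modulus; only the size protects the real thing.
p, q = 999983, 1000003
print(factor(p * q))  # -> (999983, 1000003)
```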

Remember, you are not encrypting your data for right now. You are encrypting your data for 2035.

This means that you should be encrypting with the longest practical key lengths and the best-known techniques; the only real limitation is speed. When a service selects its ciphers and handshakes, it should be choosing options like RSA-4096 and AES-256-CBC. For data at rest, consider tools like TrueCrypt (now discontinued; its successor VeraCrypt carries the design forward) that can cascade ciphers such as AES, Twofish, and Serpent (VeraCrypt adds Camellia), so that even if one cipher is broken completely, the data is still protected by the others. Layering ciphers gives your data multiple independent lines of defense, far more likely to withstand the constant erosion of new research and new hardware.
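
As a sketch of the layering idea, here is a minimal two-cipher cascade in Python using the widely available cryptography package. The AES + Camellia pairing in CBC mode is an illustrative choice of mine, not TrueCrypt's actual construction (which uses XTS mode and its own key handling); what it demonstrates is that an attacker has to break both ciphers to get at the plaintext.

```python
# A minimal two-layer cipher cascade (pip install cryptography).
# Illustrative only: TrueCrypt/VeraCrypt use XTS mode, not CBC.
import os
from cryptography.hazmat.primitives import padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def cascade_encrypt(plaintext: bytes) -> tuple[bytes, dict]:
    # Independent 256-bit keys and 128-bit IVs for each layer.
    keys = {name: os.urandom(32) for name in ("aes", "camellia")}
    ivs = {name: os.urandom(16) for name in ("aes", "camellia")}

    # Pad once; AES and Camellia share a 128-bit block size.
    padder = padding.PKCS7(128).padder()
    data = padder.update(plaintext) + padder.finalize()

    # Layer 1: AES-256-CBC.
    enc = Cipher(algorithms.AES(keys["aes"]), modes.CBC(ivs["aes"])).encryptor()
    data = enc.update(data) + enc.finalize()

    # Layer 2: Camellia-256-CBC over the AES ciphertext.
    enc = Cipher(algorithms.Camellia(keys["camellia"]),
                 modes.CBC(ivs["camellia"])).encryptor()
    data = enc.update(data) + enc.finalize()
    return data, {"keys": keys, "ivs": ivs}

def cascade_decrypt(ciphertext: bytes, secrets: dict) -> bytes:
    # Undo the layers in reverse order: Camellia first, then AES.
    keys, ivs = secrets["keys"], secrets["ivs"]
    dec = Cipher(algorithms.Camellia(keys["camellia"]),
                 modes.CBC(ivs["camellia"])).decryptor()
    data = dec.update(ciphertext) + dec.finalize()
    dec = Cipher(algorithms.AES(keys["aes"]), modes.CBC(ivs["aes"])).decryptor()
    data = dec.update(data) + dec.finalize()
    unpadder = padding.PKCS7(128).unpadder()
    return unpadder.update(data) + unpadder.finalize()

ct, secrets = cascade_encrypt(b"encrypting for 2035, not for today")
assert cascade_decrypt(ct, secrets) == b"encrypting for 2035, not for today"
```

Note the independent keys per layer: a cascade only adds security if breaking one layer reveals nothing about the other layer's key.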

Always be skeptical of the "good enough" argument.
