
Quantum is unimportant to post-quantum

1 July 2024 at 13:00

By Opal Wright

You might be hearing a lot about post-quantum (PQ) cryptography lately, and it’s easy to wonder why it’s such a big deal when nobody has actually seen a quantum computer. But even if a quantum computer is never built, new PQ standards are safer, more resilient, and more flexible than their classical counterparts.

Quantum computers are a big deal; just ask around, and you’ll get plenty of opinions. Maybe quantum computers are on the verge of destroying public-key cryptography as we know it. Or maybe cryptographically significant quantum computers are an impossible pipe dream. Maybe the end of public-key cryptography isn’t now, but it’s only two decades away. Or maybe we have another 50 or 60 years because useful quantum computers have been two decades away for three decades, and we don’t expect that to change soon.

These opinions and predictions on quantum computers lead to many different viewpoints on post-quantum cryptography as well. Maybe we need to transition to post-quantum crypto right now, as quickly as we can. Maybe post-quantum crypto is a pipe dream because somebody will find a way to use quantum computers to break new algorithms, too. Maybe a major world government already has a quantum computer but is keeping it classified.

The fact of the matter is, it’s hard to know when a cryptographically significant quantum computer will exist until we see one. We can guess, we can try to extrapolate based on the limited data we have so far, and we can hope for one outcome or the other. But we can’t know with certainty.

That’s okay, though, because quantum resistance isn’t the main benefit of post-quantum crypto.

Current research and standards work will result in safer, more resilient cryptographic algorithms based on a diverse set of cryptographic problems. These algorithms benefit from the practical lessons of the last 40 years and provide use-case flexibility. Doomsayers and quantum skeptics alike should celebrate.

All in one basket

People who are worried about quantum computers often focus on one point, and they’re absolutely right about it: almost all public-key cryptography in wide use right now could be broken with just a few uncertain-but-possible advances in quantum computing.

Loosely speaking, the most commonly-used public-key algorithms are based on three problems: factoring (RSA), finite field discrete logarithms (Diffie-Hellman), and elliptic curve discrete logarithms (ECDH and ECDSA). These are all special instances of a more general computational problem called the hidden subgroup problem. And quantum computers are good at solving the hidden subgroup problem. They’re really good at it. So good that, if somebody builds a quantum computer of what seems like a reasonable size to many researchers, they can do all manner of nasty things. They can read encrypted messages. They can impersonate trusted organizations online. They can even use it to build tools for breaking some forms of encryption without quantum computers.

But even if quantum computing never becomes powerful enough to break current public keys, the fear of the quantum doomsayers is based on a completely valid observation: the internet has put nearly all of its cryptographic eggs into the single basket of the hidden subgroup problem. If somebody can efficiently solve the hidden subgroup problem, whether it’s with quantum computers or classical computers, they will be able to break the vast majority of public-key cryptography used on the internet.

What often gets overlooked is that, for the last 40 years, the hidden subgroup basket has consistently proven less safe than we expected.

Advances in factoring and discrete logs

In the 1987 talk “From Crossbows to Cryptography: Techno-Thwarting the State,” Chuck Hammill discussed RSA keys with 200 digits, or about 664 bits, saying that the most powerful supercomputers on earth wouldn’t be able to factor such a number in 100 years. The Unix edition of PGP 1.0 supported 992-bit RSA keys as its highest security level, saying the key size was “military grade.”

Nowadays, formulas provided by the National Institute of Standards and Technology (NIST) suggest that a 664-bit key offers only about 65 bits of security and is firmly within the reach of motivated academic researchers. A 992-bit key offers only about 78 bits of security and is speculated to be within reach of intelligence agencies.

(The smallest key size supported in PGP 1.0, 288 bits, can be broken in about 10 minutes on a modern desktop computer using readily available software like msieve. “Commercial grade” keys were 512 bits, which can be factored using AWS in less than a day for under $100.)
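Those security estimates come from the asymptotic running time of the GNFS. As a rough sketch (the constants below follow the widely cited GNFS-based estimate used in NIST guidance; treat the outputs as ballpark figures rather than the official table values), you can reproduce the numbers quoted in this post:

    import math

    def rsa_strength_bits(modulus_bits: int) -> float:
        """Approximate classical security of an RSA modulus, in bits.

        Rough GNFS heuristic:
        strength ~ (1.923 * (n*ln2)^(1/3) * (ln(n*ln2))^(2/3) - 4.69) / ln2,
        where n is the modulus length in bits.
        """
        ln_n = modulus_bits * math.log(2)  # natural log of a modulus-sized number
        work = 1.923 * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3) - 4.69
        return work / math.log(2)  # convert from natural-log scale to bits

    if __name__ == "__main__":
        for bits in (664, 992, 2048, 3072, 15360):
            print(f"{bits:>6}-bit RSA ~ {rsa_strength_bits(bits):.0f} bits of security")

Plugging in 664 and 992 bits gives roughly 65 and 78 bits of security, and a 15,360-bit modulus lands in the neighborhood of 256-bit security, which is where the AES-256 comparison below comes from.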

Ever-increasing key sizes

Faced with advances in factoring and discrete logarithm algorithms over the years, we’ve responded by doing the only thing we really knew how to do: increasing key sizes. Typical RSA key sizes these days are 2048 to 4096 bits, roughly three to six times longer than Chuck Hammill suggested, and two to four times the length of what an early version of PGP called a “military grade” RSA key. The National Security Agency requires RSA keys no shorter than 3072 bits for classified data. The NIST formulas suggest that keys would need to be 15,360 bits long in order to match the security of a 256-bit AES key.

Finite field discrete logarithm key sizes have largely tracked RSA key sizes over the years. This is because the best algorithm for solving both problems is the same: index calculus using the general number field sieve (GNFS). There are some differences at the edges, but most of the hard work is the same. It’s worth pointing out that finite field discrete log cryptosystems have an additional downside: most of the GNFS work is a precomputation that depends only on the field parameters, so computing one discrete log in a finite field costs about the same as computing a lot of them.

Elliptic curves, which have become more popular over the last 15 years or so, have not seen the sort of changes in key size that happened with factoring and discrete log systems. Index calculus doesn’t translate well to elliptic curves, thank goodness, but elliptic curve discrete logarithms are an open area of research.

Implementation dangers

On top of the lack of problem diversity, another concern is that current algorithms are finicky and subject to subtle implementation failures.

Look, we’re Trail of Bits. We’re kinda famous for saying “fuck RSA,” and we say it mainly because RSA is full of landmines. Finite field Diffie-Hellman has subtle problems with parameter selection and weak subgroup attacks. Elliptic curve cryptosystems are subject to off-curve attacks, weak subgroup attacks, and attacks related to bad parameter selection.

Worse yet, every one of these algorithms requires careful attention to avoid timing side channel attacks!

Taken together, these pitfalls and subtle failure modes turn current public-key primitives into an absolute minefield for developers. It’s not uncommon for cryptography libraries to refer to their low-level functionality as “hazmat.” This is all before you move into higher-level protocols!

Many implementation concerns are at least partially mitigated through the use of good standards. Curve25519, for instance, was specifically designed for fast, constant-time implementations, as well as security against off-curve and weak subgroup attacks. Most finite field Diffie-Hellman key exchanges used for web traffic are done using a small number of standardized parameter sets that are designed to mitigate weak subgroup attacks. The ever-growing menagerie of known RSA attacks related to encryption and signing can (usually) be mitigated by using well-tested and audited RSA libraries that implement the latest standards.
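To make the weak subgroup point concrete, here is a minimal sketch (not production code, and the helper name is ours) of the kind of public-key validation that standards such as NIST SP 800-56A call for in finite-field Diffie-Hellman. It assumes you know the order q of the prime-order subgroup the generator is supposed to produce:

    def validate_ffdh_public_key(y: int, p: int, q: int) -> bool:
        """Check a received finite-field Diffie-Hellman public value.

        p is the prime modulus and q is the order of the intended
        prime-order subgroup. Rejecting values outside [2, p-2] or
        outside that subgroup blocks the classic small-subgroup tricks
        that leak bits of the private key.
        """
        if not (2 <= y <= p - 2):
            return False
        # y lies in the order-q subgroup exactly when y^q == 1 (mod p)
        return pow(y, q, p) == 1

For the standardized safe-prime groups (the RFC 7919 groups, for example), q = (p − 1)/2, and the check costs roughly one extra exponentiation on top of the exchange itself.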

Good standards have helped immensely, but they really just paper over some deeply embedded properties of these cryptosystems that make them difficult to use and dangerous to get wrong. Still, despite the consequences of errors and the availability of high-quality open-source libraries, Trail of Bits regularly finds dangerously flawed implementations of these algorithms in our code reviews.

What post-quantum crypto provides

So why is post-quantum crypto so much better? It’s instructive to look at the ongoing NIST post-quantum crypto standardization effort.

Diversity of problems

First of all, upcoming NIST standards are based on multiple mathematical problems:

  • CRYSTALS-KYBER, CRYSTALS-DILITHIUM, and Falcon are based on lattice problems: short integer solutions (SIS) and learning with errors (LWE) over various rings.
  • SPHINCS+ is based on the difficulty of second-preimage attacks for the SHA-256 and SHA-3 hash functions.

Additionally, NIST is attempting to standardize one or more additional signature algorithms, possibly based on different problems. Submissions include signature algorithms based on problems related to elliptic curve isogenies, error correcting codes, and multivariate quadratics.

By the time the next phase of standardization is over, we can expect to have algorithms based on at least three or four different mathematical problems. If one of the selected problems were to fall to advances in quantum or classical algorithms, there are readily-available replacements that are highly unlikely to be affected by attacks on the fallen cryptosystems.
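One practical consequence of this diversity: at the API level, most post-quantum KEMs look identical, so swapping problem families is close to a configuration change. Here is a rough sketch using the open-source liboqs-python bindings (the oqs module); the algorithm and method names follow that project’s documented examples, so treat them as assumptions about whatever version you have installed:

    import oqs  # liboqs-python bindings from the Open Quantum Safe project

    KEM_ALG = "Kyber768"  # any enabled KEM name works the same way

    # One party generates a keypair; the other encapsulates a secret to it.
    with oqs.KeyEncapsulation(KEM_ALG) as receiver, oqs.KeyEncapsulation(KEM_ALG) as sender:
        public_key = receiver.generate_keypair()
        ciphertext, shared_secret_sender = sender.encap_secret(public_key)
        shared_secret_receiver = receiver.decap_secret(ciphertext)
        assert shared_secret_receiver == shared_secret_sender

If a lattice-based scheme were ever to fall, moving to, say, a code-based KEM would mostly be a matter of changing KEM_ALG and re-issuing keys.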

Modern design

The post-quantum proposals we see today have been developed with the advantage of hindsight. Modern cryptosystem designers have seen the myriad ways in which current public-key cryptography fails in practice, and those lessons are being integrated into the fabric of the resulting designs.

Here are some examples:

  • Many post-quantum algorithms are designed to make constant-time implementations easy, reducing the risk of timing attacks.
  • Many algorithms reduce reliance on random number generators (RNGs) by extending nonce values with deterministic functions like SHAKE, limiting the damage an insecure RNG can do (see the sketch after this list).
  • Random sampling techniques for non-uniform distributions in the NIST finalists are fully specified and have been analyzed as part of the standardization effort, reducing the risk of attacks that rely on biased sampling.
  • Many post-quantum algorithms are fully deterministic functions of their inputs (meaning that encrypting or signing the same values with the same nonces will always produce the same results), reducing nonce reuse issues and the risk of information leakage if values are reused.
  • Many algorithms are designed to allow quick and easy generation of new keys, making it easier to provide forward secrecy.
  • Rather than inviting developers to dream up their own parameters, every serious proposal for a post-quantum cryptosystem lists a small set of secure parameterizations.
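As a minimal illustration of the nonce-extension idea mentioned in the list above (our own toy example, not drawn from any particular standard), a single secret seed plus the message can be expanded with SHAKE256 into all the per-operation randomness a signer needs, so a bad system RNG at signing time can’t bias the result:

    import hashlib

    def derive_signing_randomness(seed: bytes, message: bytes, n_bytes: int = 64) -> bytes:
        """Deterministically expand a secret seed and a message into
        per-signature randomness using SHAKE256, an extendable-output
        function. The same seed and message always yield the same bytes,
        so the signature never depends on the quality of the system RNG.
        """
        shake = hashlib.shake_256()
        shake.update(b"sig-randomness-v1")  # hypothetical domain-separation label
        shake.update(seed)
        shake.update(message)
        return shake.digest(n_bytes)  # SHAKE lets us request any output length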

These are intentional, carefully made decisions. Each is based on real-world failures that have shown up over the last 40 years or so. In cryptography, we often refer to these failure scenarios as “footguns” because they make it easy to shoot yourself in the foot; the newer designs go out of their way to make it difficult.

Use-case flexibility

With new algorithms come new trade-offs, and there are plenty to be found in the post-quantum standards. Hash-based signatures can run to 50 kilobytes, but the public keys are tiny. Code-based systems like McEliece have small ciphertexts and decrypt quickly, but the public keys can be hundreds of kilobytes.

This variety of trade-offs gives developers a lot of flexibility. For an embedded device where speed and bandwidth are important but ROM space is cheap, McEliece might be a great option for key establishment. For server farms where processor time is cheap but saving a few bytes of network activity on each connection can add up to real savings, Falcon might be a good option for signatures. Some algorithms even provide multiple parameter sets to address different needs: SPHINCS+ includes parameter sets for “fast” signatures and “small” signatures at the same security level.

The downside of post-quantum: Uncertainty

Of course, one big concern is that everybody is trying to standardize cryptosystems that are relatively young. What if the industry (or NIST) picks something that’s not secure? What if they pick something that will break tomorrow?

The idea can even feel frighteningly plausible. RAINBOW made it to the third round of the NIST PQC standardization effort before it was broken. SIKE made it to the (unplanned) fourth round before it was broken.

Some folks worry that a new standard could suffer the same fate as RAINBOW and SIKE, but not until after it has been widely adopted in industry.

But here’s a scary fact: we already run that risk. From a mathematical standpoint, there’s no proof that RSA moduli can’t be factored easily. There’s no proof that breaking RSA, as it’s used today, is equivalent to factoring (the opposite is true, in fact). It’s completely possible that somebody could publish an algorithm tomorrow that totally destroys Diffie-Hellman key exchanges. Somebody could publish a clever paper next month that shows how to recover private ECDSA keys.

An even scarier fact? If you squint a little, you’ll see that big breaks have already happened with factoring and finite field discrete logs. As mentioned above, advances with the GNFS have been pushing up RSA and Diffie-Hellman key sizes for over two decades now. Keys that would have been considered fine in 1994 are considered laughable in 2024. RSA and Diffie-Hellman from the old cypherpunk days are already broken. You just didn’t notice they’re broken because it took 30 years to happen, with keys getting bigger all the while.

I don’t mean to sound glib. Serious researchers have put in a lot of effort over the last few years to study new post-quantum systems. And, sure, it’s possible they missed something. But if you’re really worried about the possibility that somebody will find a way to break SPHINCS+ or McEliece or CRYSTALS-KYBER or Falcon, you can keep using current algorithms for a while. Or you could switch to a hybrid cryptography system, which marries post-quantum and pre-quantum methods together in a way that stays secure unless both are broken.
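Here is a minimal sketch of the hybrid idea, assuming you already hold two shared secrets: one from a classical exchange (say X25519) and one from a post-quantum KEM. The combiner below is our own toy construction for illustration; real protocols (such as the hybrid key exchange designs for TLS) specify their own combiners. The point is simply that an attacker has to recover both inputs to learn the session key:

    import hashlib

    def combine_shared_secrets(classical_ss: bytes, pq_ss: bytes, context: bytes) -> bytes:
        """Toy hybrid combiner: derive one session key from a classical
        shared secret and a post-quantum shared secret. Recovering the
        output requires knowing both inputs, so the session key stays
        safe unless both underlying schemes are broken.
        """
        h = hashlib.sha3_256()
        h.update(b"hybrid-kdf-v1")  # hypothetical domain-separation label
        h.update(context)           # e.g., a transcript hash binding the handshake
        h.update(classical_ss)
        h.update(pq_ss)
        return h.digest()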

Summing up

Fear of quantum computers may or may not be overblown. We just don’t know yet. But the effect of post-quantum crypto research and standardization efforts is that we’ve taken a ton of eggs out of one basket and we’re building a much more diverse and modern set of baskets instead.

Post-quantum standards will eventually replace older, more finicky algorithms with algorithms that don’t fall apart over the tiniest of subtleties. Several common sources of implementation error will be eliminated. Developers will be able to select algorithms to fit a broad range of use cases. The variety of new mathematical bases provides a “backup plan” if a mathematical breakthrough renders one of the algorithms insecure. Post-quantum algorithms aren’t a panacea, but they certainly treat a lot of the headaches we see at Trail of Bits.

Forget quantum computers, and look at post-quantum crypto research and standardization for what it is: a diversification and modernization effort.
