
Post-Quantum Cryptography: Preparing for the Future of Security

Quantum computers may not be here yet, but we still need to prepare for them.

March 27, 2024

There’s a joke about Quantum Computing where a Computer Science Professor predicts to a rapt classroom that it’s just 10 years away, and the punchline is that they’ve been saying it for the last 50 years or so. The actual number of years in the future changes from telling to telling, but the point is always that it remains within sight but out of reach. Quantum Computing as a usable tool is definitely still out of reach, but you wouldn’t know it by the amount of discussion there is around it. We’re not going to go into the basics of Quantum Computing as there are plenty of articles already about that.

Instead, we’re going to focus on how we actively need to prepare for it, whether it becomes a reality soon or not. Post-Quantum Cryptography (PQC) refers to cryptographic algorithms that can be run on classical computers but are believed to be secure enough to withstand a cryptanalytic attack by a Quantum Computer. Don't get me wrong, I’m not saying that you need to go and ditch your brand-new laptop straight away. When they do reach maturity, Quantum computers probably aren't going to be relevant for every use case out there. The ones we have just now can’t even run that ultimate test of computing power, Doom. Seriously.

How Quantum Computing Impacts Security

However, Quantum Computing is of particular concern when it comes to cryptography, specifically end-to-end encryption. This is where the information is encrypted on the sender's device before transmission and is only decrypted when it reaches the intended receiver's device. This is how we use cryptography every day, in more ways than you can imagine. It’s not just when you log in to your computer to check your bank account; it’s when you message a friend to arrange to meet for lunch, it’s when you tap your card to pay for that lunch, and it’s when you’re reading this article. Take a look at the browser bar, and chances are you will see ‘https’ at the start of the web address rather than just ‘http’. That ‘s’ at the end means the page is being sent securely to your device via a protocol called TLS, which encrypts and authenticates all communication between your computer and this site, so no one in between can read or tamper with what you are reading.

So why is it such a concern for end-to-end encryption? Well, Public Key Cryptography, or Asymmetric Cryptography, is built around the idea of certain problems being hard to solve, and one of these is what’s called the Integer Factorization Problem. It’s a little counterintuitive, but the premise is that while two prime numbers are very easy to multiply together, it is surprisingly difficult to split the result back into those prime factors without already knowing one of them. In computing terms, the difficulty of recovering the two factors is what we call Computational Hardness. And Public Key Cryptography is a big part of end-to-end encryption.
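To make the asymmetry concrete, here’s a small illustrative Python sketch of my own (not taken from any library): multiplying two known primes is instant, while even a modest brute-force search for the factors gets nowhere. The primes below are Mersenne primes chosen purely for convenience; real RSA moduli use primes hundreds of digits long.

```python
import time

# Two known Mersenne primes - tiny by cryptographic standards.
p = 2**61 - 1
q = 2**89 - 1

# Multiplying them is effectively instant.
start = time.perf_counter()
n = p * q
print(f"n = {n} (multiplied in {time.perf_counter() - start:.6f}s)")

# Recovering p and q from n by trial division is hopeless: the smallest factor
# is around 2.3 * 10**18, so even this 10-million-step search finds nothing.
def trial_division(n, limit=10_000_000):
    for candidate in range(3, limit, 2):
        if n % candidate == 0:
            return candidate
    return None

start = time.perf_counter()
print("factor found:", trial_division(n),
      f"(searched in {time.perf_counter() - start:.2f}s)")
```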

Public Key Cryptography is where a publicly available key is used to encrypt a message and a privately stored key is used to decrypt it. The maths behind it relies on what is called a one-way function, i.e., something very easy to compute in one direction but substantially more difficult to reverse, provided you pick two sufficiently large prime numbers. Examples usually revolve around RSA as it is relatively easy to explain and understand, but the more commonly used Public Key algorithm is Elliptic Curve cryptography, which is built around something called the Discrete Logarithm Problem. At a deeper level the two problems are closely related (both are instances of the hidden subgroup problem, which is why the same Quantum algorithm breaks both), but for the moment it’s best to think of RSA as being based on the Integer Factorization Problem and Elliptic Curves as being based on the Discrete Logarithm Problem.
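If you’ve not seen it before, the classic textbook-sized RSA walkthrough shows the one-way idea in a few lines of Python (toy numbers only, no padding, nothing you would ever deploy):

```python
# A deliberately tiny RSA example: real RSA uses primes hundreds of digits long.
p, q = 61, 53
n = p * q                      # 3233 - part of the public key
phi = (p - 1) * (q - 1)        # 3120 - easy to compute only if you know p and q
e = 17                         # public exponent, chosen coprime to phi
d = pow(e, -1, phi)            # private exponent: modular inverse of e mod phi

message = 65
ciphertext = pow(message, e, n)    # anyone can encrypt with the public key (n, e)
recovered = pow(ciphertext, d, n)  # only the holder of d can decrypt
assert recovered == message
print(f"public key: ({n}, {e}), private exponent: {d}, ciphertext: {ciphertext}")
```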

So, how does Quantum Computing fit into this? Well, one of the earliest Quantum algorithms is called Shor's Algorithm and is designed to, well, efficiently split a number back into its prime factors. In real-world terms, Shor’s Algorithm is currently a while away from meeting this goal in any meaningful manner. The largest number reliably factored with it is 21 (that’s 7 x 3 for anyone needing a hand), back in 2012, but when the infrastructure exists to support fully developed implementations, it’s not too hyperbolic to say that it will be a bit of a game changer.
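It’s worth noting that only one step of Shor’s Algorithm is actually quantum: finding the period (order) of a number modulo N. Everything before and after is ordinary arithmetic. The sketch below is a simplification of my own that fakes the quantum step with a brute-force search, using that famous N = 21 case to show how the factors drop out.

```python
from math import gcd

N, a = 21, 2

# Stand-in for the quantum period-finding step:
# the smallest r with a**r = 1 (mod N).
r = next(r for r in range(1, N) if pow(a, r, N) == 1)   # r = 6

# Classical post-processing: if r is even and a**(r/2) is not -1 mod N,
# then gcd(a**(r//2) +/- 1, N) reveals the factors.
assert r % 2 == 0 and pow(a, r // 2, N) != N - 1
factors = gcd(pow(a, r // 2) - 1, N), gcd(pow(a, r // 2) + 1, N)
print(f"order of {a} mod {N} is {r}; factors: {factors}")   # (7, 3)
```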

Grover's Algorithm is a Quantum search algorithm that poses a similar threat to symmetric encryption by speeding up brute-force key search, but there the fix is much simpler, which we’ll go into a little later on.


One of the driving forces behind developing Post-Quantum Cryptography, or PQC, is the concern that, with storage currently being so cheap, it is easy for criminals to stockpile data harvested from security breaches in preparation for when Quantum Computers are available, an approach known as Harvest Now, Decrypt Later, or Retrospective Decryption. Whilst some would argue against preparing for a technology that is a while away from maturity, these attacks are certainly technically feasible. All they involve is intercepting and (more importantly) storing a large amount of data that will still have value in 5, 10, or 20 years’ time. It might not sound plausible that criminals would be that forward-thinking, but given the longevity of identifiable information, it’s easy to see that captured data may still have value, and with storage being so affordable, it doesn’t require much outlay on the chance that it will. Some predict that Quantum computers will be able to break classical encryption by 2030. Think back to where you were in 2018. Six years isn’t that long, is it? So, ultimately, the concern has changed: it’s not just what needs to be kept secret, it’s also how long it needs to stay secret.

Post-Quantum Cryptography Solutions

No great change is required to the core idea of asymmetric cryptography in order to make it Quantum secure: it is still based on the difficulty of solving certain mathematical problems. So, if we want to make an encryption algorithm Quantum secure, we must ensure that the problem it is built around remains hard even for a Quantum computer. This has a knock-on effect on verifying the security of the algorithms: cryptographic algorithms are validated by cryptographers rather than by mechanical means, and the novelty and mathematical depth of these problems mean there are fewer cryptographers qualified to do so.

There are six different approaches currently being worked on in Post-Quantum Cryptography:

  • Lattice-based cryptography
  • Multivariate cryptography
  • Hash-based cryptography
  • Code-based cryptography
  • Isogeny-based cryptography
  • Symmetric key Quantum resistance

Out of all of them, symmetric keys are the easiest to harden against Quantum Computing attacks, relatively speaking. In fact, to achieve the same security strength (based on computational hardness) against a Quantum attacker as against a classical one, the key size can simply be doubled. This is because the computational hardness of symmetric key algorithms comes from the size of the keyspace rather than the difficulty of an underlying mathematical problem, and Grover's Algorithm only offers a quadratic speed-up, effectively halving a key’s strength. In practice, that means moving from AES-128 to AES-256 if you want to retain 128 bits of security; fully matching today’s AES-256 would, in theory, require a 512-bit key.
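The arithmetic behind that rule of thumb is simple enough to show directly (a back-of-the-envelope sketch, not a formal security analysis):

```python
# Grover's Algorithm searches an unstructured space of size 2**k in roughly
# 2**(k/2) steps, so a k-bit symmetric key keeps only about k/2 bits of
# effective security against a Quantum attacker.
for key_bits in (128, 192, 256):
    print(f"AES-{key_bits}: ~2**{key_bits} classical guesses, "
          f"~2**{key_bits // 2} Grover iterations "
          f"-> ~{key_bits // 2} effective bits post-Quantum")
```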

Lattices are mathematical constructs built around matrices, which are arrangements of numbers in rows and columns. One of the more popular problems is called Learning With Errors (LWE), where a small error vector is added to the product of a public matrix and a secret vector; it is that added noise which makes recovering the secret difficult. This can be written as b = A·s + e (mod q), where A and b make up the public key, and s and e make up the private key.
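Here’s a deliberately tiny LWE instance in Python (illustrative parameters of my own choosing, nowhere near secure; real schemes such as Kyber use structured variants with carefully tuned dimensions and moduli), just to show the shape of the problem:

```python
import numpy as np

rng = np.random.default_rng(0)
q, n, m = 97, 4, 8                      # modulus, secret length, number of samples

A = rng.integers(0, q, size=(m, n))     # public matrix
s = rng.integers(0, q, size=n)          # secret vector (private)
e = rng.integers(-2, 3, size=m)         # small error/noise vector (private)

b = (A @ s + e) % q                     # public: looks close to uniformly random

# Without e, recovering s from (A, b) is basic linear algebra.
# With e, an attacker faces the LWE problem, believed hard even for Quantum computers.
print("A =\n", A)
print("b =", b)
```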

Hash-based cryptography involves using cryptographically secure hash functions to generate a private and public key for signing and is considered one of the better-understood approaches. Lattice-based and Hash-based approaches have proved to be the most fertile ground for asymmetric solutions, which is why they make up the majority of the submissions for standardization by NIST.
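To give a flavour of the hash-based idea, below is a minimal Lamport one-time signature in Python, the simplest ancestor of schemes like SPHINCS+ (my own illustrative sketch; each key pair can only ever sign a single message, which is exactly the limitation the standardized schemes work around):

```python
import hashlib, secrets

def keygen():
    # 256 pairs of random secrets; the public key is their hashes.
    sk = [[secrets.token_bytes(32) for _ in range(2)] for _ in range(256)]
    pk = [[hashlib.sha256(x).digest() for x in pair] for pair in sk]
    return sk, pk

def message_bits(message: bytes):
    digest = hashlib.sha256(message).digest()
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(message: bytes, sk):
    # Reveal one secret per message bit.
    return [sk[i][bit] for i, bit in enumerate(message_bits(message))]

def verify(message: bytes, signature, pk) -> bool:
    return all(hashlib.sha256(sig).digest() == pk[i][bit]
               for i, (sig, bit) in enumerate(zip(signature, message_bits(message))))

sk, pk = keygen()
sig = sign(b"hello pqc", sk)
print(verify(b"hello pqc", sig, pk))    # True
print(verify(b"tampered", sig, pk))     # False
```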

NIST Recommendations

The body that decides which encryption algorithms become standards isn't the NSA or one of the other three-letter agencies you might think of. Instead, it’s the National Institute of Standards and Technology (NIST), an agency of the U.S. Department of Commerce, which publishes both the recommended algorithms and the key strengths needed for strong security. Established in 1901 as the National Bureau of Standards, NIST is responsible for fostering technological innovation and economic competitiveness in the United States. It first became part of the cryptography landscape through the first Government project to develop a publicly available cryptographic standard that could satisfy a broad range of requirements and use cases, rather than leaving cryptography to the military as had previously been the case.

An open call was made in the early 1970s for people and companies to submit candidate algorithms, and after receiving a submission from IBM called Lucifer, NIST (then still the National Bureau of Standards) chose that algorithm as the basis for its Data Encryption Standard (DES). Offering only a 56-bit key, it wasn’t long before DES proved wanting in overall security terms, as the hardware needed to crack it became less cost-prohibitive over time, and several public competitions were held in the late 90s with the express goal of breaking it. It was reconfigured as Triple DES in an attempt to extend its shelf life (literally just running the DES algorithm three times, giving an effective 112 bits of security), but it soon became clear that a new standard was required. The replacement, the Advanced Encryption Standard (AES), an implementation of the Rijndael algorithm, was selected from a list of viable candidates in another public competition that began in 1997 and concluded in 2001.


Encryption doesn’t stand still though, so NIST needs to keep updating its standards to meet evolving security requirements. Since 2016, it has been running rounds of evaluation as part of its PQC Standardization Process, asking cryptographers to devise and vet encryption algorithms that could resist attacks from a Quantum computer. This has been done in the open to allay any concerns about the algorithms being tampered with, as happened in the case of Dual_EC_DRBG, a random number generator believed to have had a backdoor inserted by the NSA.

NIST started with 82 candidate algorithms nominated, 69 of which met the requirements to progress, and through successive rounds of judging this was whittled down to the four algorithms selected for standardization in 2022.

CRYSTALS-Kyber

Renamed by NIST as the Module Lattice-Based Key Encapsulation Mechanism (ML-KEM), this is based on the Learning With Errors (LWE) Lattice problem and is intended to be used for general encryption. It replaces Quantum-vulnerable algorithms such as Elliptic Curve Diffie-Hellman (ECDH), i.e., any which use a public/private key pair to share a key that can then be used for symmetric encryption. A usage sketch follows the parameter list below.

There are three parameter sets defined by the standard:

  • Kyber-512, offering security equivalent to AES-128
  • Kyber-768, offering security equivalent to AES-192
  • Kyber-1024, offering security equivalent to AES-256
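As a rough idea of what using it looks like in practice, here’s a sketch based on the Open Quantum Safe project’s liboqs-python bindings. This assumes the oqs package is installed and exposes the KeyEncapsulation interface shown; the algorithm identifier may be "Kyber512" or "ML-KEM-512" depending on the library version, so treat the details as illustrative rather than authoritative.

```python
import oqs

# Key encapsulation with Kyber / ML-KEM: the client publishes a public key,
# the server encapsulates a shared secret against it, and the client
# decapsulates the same secret from the ciphertext.
alg = "Kyber512"  # may be "ML-KEM-512" in newer liboqs releases

with oqs.KeyEncapsulation(alg) as client, oqs.KeyEncapsulation(alg) as server:
    public_key = client.generate_keypair()               # secret key stays inside the client object
    ciphertext, shared_secret_server = server.encap_secret(public_key)
    shared_secret_client = client.decap_secret(ciphertext)
    assert shared_secret_client == shared_secret_server  # both sides now share a symmetric key
```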

CRYSTALS-Dilithium

Renamed by NIST as the Module Lattice-Based Digital Signature Algorithm (ML-DSA), this is a digital signature scheme intended to offer a Quantum-secure replacement for ECDSA and RSA, i.e., any use case which involves using a public/private key pair to authenticate data. A signing sketch follows the parameter list below.

There are three parameter sets defined by the standard:

  • ML-DSA-44, offering security equivalent to collision search on SHA3-256
  • ML-DSA-65, offering security equivalent to AES-192
  • ML-DSA-87, offering security equivalent to AES-256
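Again as a rough sketch using the liboqs-python bindings (assuming the oqs package and the Signature interface shown; the algorithm identifier may be "Dilithium2" or "ML-DSA-44" depending on the library version):

```python
import oqs

# Signing and verification with Dilithium / ML-DSA.
alg = "Dilithium2"  # may be "ML-DSA-44" in newer liboqs releases
message = b"Post-Quantum Cryptography: Preparing for the Future of Security"

with oqs.Signature(alg) as signer, oqs.Signature(alg) as verifier:
    public_key = signer.generate_keypair()   # secret key stays inside the signer object
    signature = signer.sign(message)
    assert verifier.verify(message, signature, public_key)
    assert not verifier.verify(b"tampered message", signature, public_key)
```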

SPHINCS+

Renamed by NIST as the Stateless Hash-Based Digital Signature Algorithm (SLH-DSA), this is the only one of the selected algorithms that is not Lattice-based; it is a Hash-based approach instead.

There are twelve parameter sets defined by the standard, based on the choice of SHA2 or SHAKE as the hash function, 128, 192, or 256 bits of security, and a trade-off between smaller signatures (‘s’) and faster signing (‘f’). For example, SLH-DSA-SHA2-128s uses the SHA2 hash function, offers security equivalent to AES-128, and favours small signatures.

FALCON

This is also a Lattice-based solution but has yet to receive a draft standard. That is taking longer because the algorithm is harder to implement correctly, and the draft is due later this year (2024).

Three additional algorithms are still under consideration: BIKE, Classic McEliece, and HQC, all of them code-based cryptography. The popular SIKE (Supersingular Isogeny Key Encapsulation) was originally on this list but was found to be classically broken by a group of Belgian researchers, who were able to recover a key within 62 minutes running on a single core of an Intel Xeon computer. NIST unsurprisingly decided not to standardize SIKE. It was built around the Supersingular Isogeny Problem, and the break seems to have been enough to dissuade further development around this particular problem.

Conclusion

Hopefully, you’re all a little clearer by now on why this is important and also why it has taken a long time to get here. Taking the joke from the start of this post as a warning, the U.S. is aiming to transition its cryptographic systems to Quantum-resistant algorithms by 2035, and recently, both Apple iMessage and Signal have upgraded their protocols to incorporate the CRYSTALS-Kyber algorithm and become Quantum-resistant. So, while this is a complex area and there are too few people around who understand it, the progress made is promising, and the joke is slowly becoming a reality.
