**Understanding the Quantum Threat: Chapter 3**

In our previous blog post, we discussed the importance of updating to quantum-safe cryptography. One question that people often ask is: “How does the post-quantum cryptography being standardized by the National Institute of Standards and Technology (NIST) differ from the cryptography we use now?”

**Cryptography uses Underlying Hard Problems**

When we use a cryptographic algorithm, there is an underlying hard problem. Factoring integers is an example of a hard problem: generally speaking, it takes far less computational power to multiply numbers than to split a number into its prime factors. It is possible to pick two prime numbers right at the threshold where a computer can easily multiply numbers that large, but no existing computer is powerful enough to split the resulting product back into the two original primes. This asymmetry can be used to build a cryptographic algorithm in which the person encrypting can easily multiply two large primes, but the adversary cannot recover those primes from the product. This is the case for RSA, an algorithm that can be used to encrypt communications. RSA is secure because, to the best of our knowledge, factoring large integers is an extremely difficult problem.
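This asymmetry can be seen even at toy scale. The sketch below multiplies two small primes (real RSA moduli use primes hundreds of digits long) and then recovers them by trial division, the naive “hard direction”; the point is that the factoring loop does millions of times more work than the single multiplication:

```python
# Illustrative only: real RSA primes are far too large for trial division.
p, q = 10_007, 10_009
n = p * q  # the "easy" direction: one multiplication

def trial_division(n):
    """Recover the factors of n by testing every candidate divisor."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n was prime

assert trial_division(n) == (p, q)
```

Trial division already needs about 10,000 steps for this five-digit pair; for a 2048-bit RSA modulus, even the best known classical factoring algorithms are utterly infeasible.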

However, although it is extremely unlikely, someone may one day find a new mathematical algorithm that can quickly and efficiently factor large numbers, effectively breaking RSA. In that case, any communication that relied on RSA would be compromised.

**Shor’s Algorithm can Factor Numbers, but Not Yet**

This is where the quantum threat comes in. In 1994, the mathematician Peter Shor found a new way to factor numbers; luckily, his method can only be run on an exceptionally large quantum computer. At the time, building a quantum computer of any size was hardly conceivable, let alone a very large one. Since then, researchers have built quantum computers, and the power of these machines is beginning to rival the most powerful supercomputers on specific problems. As discussed in the first article of this series, *When Will Cryptographically Relevant Quantum Computer Exist?*, quantum computers powerful enough to run Shor’s algorithm and break RSA look likely to exist within 8 to 15 years.
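To give a flavor of how Shor’s algorithm factors numbers: the quantum speedup lies entirely in finding the *period* r of the function a^x mod N. Once the period is known, plain classical arithmetic finishes the job. The sketch below uses tiny numbers (N = 15) so the period can be found by brute force; a real target would need a quantum computer for that step:

```python
from math import gcd

N, a = 15, 7  # toy modulus and base

# Quantum step (brute-forced here, feasible only because N is tiny):
# find the smallest r > 0 with a^r ≡ 1 (mod N).
r = 1
while pow(a, r, N) != 1:
    r += 1
# r == 4 for a = 7, N = 15

# Classical post-processing: for an even period r,
# gcd(a^(r/2) ± 1, N) reveals nontrivial factors of N.
x = pow(a, r // 2)  # 7^2 = 49
factors = sorted({gcd(x - 1, N), gcd(x + 1, N)})
print(factors)  # [3, 5]
```

The brute-force period search above takes exponential time classically; Shor’s insight was that a quantum computer finds r efficiently, which is exactly why large RSA moduli fall once such a machine exists.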

It is not just RSA either. Shor’s algorithm can also be used to break other commonly used algorithms, like ECDH and ECDSA. In fact, all the standardized algorithms within an entire branch of cryptography, namely all public-key algorithms, will become vulnerable once these very powerful quantum computers are built. We need replacement algorithms that cannot be broken even when powerful quantum computers exist. This is where we turn back to math.

**Quantum-Safe Cryptography uses Different Underlying Hard Problems**

There are plenty of problems out there that mathematicians believe are hard to compute: problems where it is easy to do a calculation but very difficult to undo it. Some of these mathematical problems can be used to create cryptographic algorithms, which are secure as long as the underlying problem is hard. **Post-quantum cryptography** (PQC), also called *quantum-safe cryptography*, includes algorithms whose security is based on a problem that mathematicians believe is hard regardless of whether a large-scale quantum computer exists.

The hard problems used in PQC come from different areas of math. For those of you familiar with mathematical lattices, there is an area of PQC that uses them, called *lattice-based cryptography*. The underlying hard problems in lattice-based cryptography are different from the classical ones. The other main quantum-safe areas are *hash-based* and *code-based cryptography*.

**A Full Standardization Process with Community Involvement**

While some of these areas of PQC are as old as the classical areas of cryptography currently in use, some of the quantum-safe algorithms are indeed newer. To address security concerns, NIST has been running a public PQC standardization process for over six years. The idea was to have cryptographers from academia and industry around the world scrutinize the candidate algorithms for their security and practicality.

The standardization process is scheduled to release a first batch of cryptographic standards by early 2024. While there is no silver bullet that guarantees cryptographic security, the PQC process is similar to the one other cryptographic algorithms went through during their standardization. Finding a vulnerability years after an algorithm has been standardized is always possible. To guard against that unlikely event, it is reasonable to use a hybrid of classical and PQC algorithms: the attacker then cannot break the encryption without breaking both the classical and the PQC algorithm.
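One common way to realize such a hybrid is to derive the session key from *both* shared secrets, so recovering either one alone tells the attacker nothing. The sketch below is a minimal illustration with placeholder byte strings standing in for the two secrets; real protocols use a vetted key-derivation function such as HKDF with context labels rather than a bare hash:

```python
import hashlib

# Placeholder secrets: in practice one would come from a classical
# exchange (e.g. ECDH) and one from a PQC key-encapsulation mechanism.
classical_secret = b"\x01" * 32
pqc_secret = b"\x02" * 32

# Combine both secrets into one session key. An attacker must break
# BOTH underlying algorithms to reconstruct this value.
session_key = hashlib.sha256(classical_secret + pqc_secret).digest()
```

The design point is simply that the combiner's input includes both secrets; the hybrid is at least as strong as the stronger of its two components.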

**Many Diverse Areas**

One interesting point about PQC is that unlike previously standardized public-key cryptography, the PQC algorithms come from many different areas of math. There are two main benefits of having multiple different areas of PQC:

- If a vulnerability is found in one area, then the other areas are still secure.

- The algorithms from different areas have different pros and cons. Some areas are more suited towards key exchanges, while others are more suited towards digital signatures. Some algorithms are faster, while others have smaller keys or signatures.

Overall, lattice-based cryptography is the most general-purpose area of PQC for the following reasons:

**Diversity:** There are efficient lattice-based key-exchange algorithms and lattice-based digital signature algorithms. There are also high-powered cryptographic applications, like homomorphic encryption and identity-based cryptography, that are lattice-based.

**Efficiency:** In many cases, lattice-based algorithms are even faster than classical algorithms, reducing the barrier to a PQC migration.

**Reasonable Key Sizes:** While some PQC algorithms have smaller key sizes, the key sizes of lattice-based algorithms still make them very usable.

Many of the other areas of PQC have more specialized use cases. To make switching between areas of cryptography easier, we suggest upgrading to a crypto-agile system during your PQC migration. Contact us to discuss which type of PQC best fits your needs.