Post-Quantum Cryptography Standards Officially Announced by NIST – a History and Explanation

NIST has formally published three post-quantum cryptography standards from the competition it ran to develop cryptography able to withstand the anticipated decryption of today’s asymmetric encryption by quantum computers.

There are no surprises – but now it is official. The three standards are ML-KEM (formerly better known as Kyber), ML-DSA (formerly better known as Dilithium), and SLH-DSA (better known as SPHINCS+). A fourth, FN-DSA (known as Falcon), has been chosen for future standardization.
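
As an illustration of what ML-KEM actually does – it is a key encapsulation mechanism, used by two parties to establish a shared secret for symmetric encryption – here is a minimal sketch using the open-source liboqs-python bindings. The package and the exact algorithm identifier are assumptions for illustration and depend on the library version installed (older releases expose the algorithm under its Kyber name).

```python
# Minimal ML-KEM round trip, assuming the open-source liboqs-python
# bindings; the algorithm identifier below may differ by library version
# (older releases use "Kyber768" rather than "ML-KEM-768").
import oqs

ALG = "ML-KEM-768"  # assumed identifier for the standardized Kyber variant

with oqs.KeyEncapsulation(ALG) as receiver, oqs.KeyEncapsulation(ALG) as sender:
    public_key = receiver.generate_keypair()              # receiver publishes this
    ciphertext, shared_sender = sender.encap_secret(public_key)
    shared_receiver = receiver.decap_secret(ciphertext)   # uses receiver's private key
    assert shared_sender == shared_receiver               # both sides now share a key
```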

IBM, along with industry and academic partners, was involved in developing the first two. The third was co-developed by a researcher who has since joined IBM. IBM also worked with NIST in 2015/2016 to help establish the framework for the PQC competition that officially kicked off in December 2016. 

With such deep involvement in both the competition and winning algorithms, SecurityWeek talked to Michael Osborne, CTO of IBM Quantum Safe, for a better understanding of the need for and principles of quantum safe cryptography.

It has been understood since 1994 that a quantum computer would be able to decipher today’s RSA and elliptic curve algorithms using (Peter) Shor’s algorithm. But this was theoretical knowledge, since the development of sufficiently powerful quantum computers was also theoretical. Shor’s algorithm could not be demonstrated in practice because there were no quantum computers capable of running it. Theoretical threats need to be monitored; only real ones need to be handled.

“It was only when quantum machinery started to look more realistic and not just theoretic, around 2015-ish, that people such as the NSA in the US began to get a little concerned,” said Osborne. He explained that cybersecurity is fundamentally about risk. Although risk can be modeled in different ways, it is essentially about the probability and impact of a threat. In 2015, the probability of quantum decryption was still low but rising, while the potential impact had already risen so dramatically that the NSA began to be seriously concerned.

It was the increasing risk level, combined with knowledge of how long it takes to develop and migrate cryptography in the business environment, that created a sense of urgency and led to the new NIST competition. NIST already had experience from a similar open competition: the one that resulted in the Rijndael algorithm – a Belgian design submitted by Joan Daemen and Vincent Rijmen – becoming the AES symmetric cryptographic standard. Quantum-proof asymmetric algorithms would be more complex.

The first question to ask and answer is: why is PQC any more resistant to quantum decryption than pre-QC asymmetric algorithms? The answer lies partly in the nature of quantum computers, and partly in the nature of the new algorithms. While quantum computers are massively more powerful than classical computers at solving some problems, they are not so good at others.

For example, while they will easily be able to solve the factoring and discrete logarithm problems that underpin current asymmetric encryption, they will not so easily – if at all – be able to break symmetric encryption. There is currently no perceived necessity to replace AES.
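
The reason, stated here using standard textbook estimates rather than anything from the NIST announcement itself, is the shape of the best known quantum attacks: Shor’s algorithm breaks factoring-based encryption in polynomial time, while Grover’s algorithm – the best generic quantum attack on a symmetric cipher – offers only a quadratic speedup.

```latex
% Textbook complexity estimates (illustrative, not from the article):
% Shor's algorithm factors an RSA modulus N in polynomial time,
T_{\text{Shor}}(N) \approx O\!\left((\log N)^{3}\right),
% whereas Grover's algorithm searches a k-bit symmetric key space in
T_{\text{Grover}}(k) \approx O\!\left(2^{k/2}\right).
% A 256-bit AES key therefore retains roughly 128 bits of effective
% security against a quantum adversary, which is why AES itself is
% not being replaced.
```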

Both pre- and post-QC algorithms are based on difficult mathematical problems. Current asymmetric algorithms rely on the difficulty of factoring large numbers or solving the discrete logarithm problem – a difficulty that a sufficiently powerful quantum computer running Shor’s algorithm can overcome.
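
To make that concrete, here is a deliberately tiny, purely illustrative RSA example (toy numbers, nowhere near a real key size): once the factors of the public modulus are known – which is exactly what Shor’s algorithm would deliver at scale – the private key follows immediately.

```python
# Toy RSA with deliberately tiny numbers; real moduli are 2048+ bits,
# where classical factoring is infeasible but Shor's algorithm on a
# large enough quantum computer would not be.
p, q = 61, 53                  # secret primes
n = p * q                      # public modulus (3233)
e = 17                         # public exponent
phi = (p - 1) * (q - 1)        # only computable if you know p and q
d = pow(e, -1, phi)            # the private exponent follows immediately

message = 42
ciphertext = pow(message, e, n)
assert pow(ciphertext, d, n) == message   # factoring n breaks the scheme
```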

PQC, however, tends to rely on a different set of problems associated with lattices. Without going into the mathematical detail, consider one such problem, known as the ‘shortest vector problem’. If you think of the lattice as a grid of points generated by combining a handful of basis vectors, the problem is to find the nonzero lattice point closest to the origin. In two dimensions this sounds simple, but when the lattice has hundreds of dimensions, finding that shortest vector becomes an almost intractable problem, even for quantum computers.
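
A toy sketch shows why the problem scales so badly. The brute-force search below finds the shortest nonzero vector of a small, made-up three-dimensional lattice by trying every coefficient combination; the number of candidates grows exponentially with the dimension, which is the informal intuition behind the hardness assumption.

```python
# Brute-force search for the shortest nonzero vector of a tiny lattice.
# The 3x3 basis is made up purely for illustration; real lattice-based
# schemes work in hundreds of dimensions, where this search explodes.
from itertools import product

basis = [(7, 2, 1), (3, 9, 4), (2, 1, 8)]   # rows are basis vectors
BOUND = 5                                   # try coefficients in [-5, 5]

def combine(coeffs):
    """Return the lattice point sum(c_i * basis_i)."""
    return tuple(sum(c * b[i] for c, b in zip(coeffs, basis)) for i in range(3))

shortest = min(
    (combine(c) for c in product(range(-BOUND, BOUND + 1), repeat=3) if any(c)),
    key=lambda v: sum(x * x for x in v),    # squared Euclidean length
)
print("shortest vector found:", shortest)
```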

Within this construction, a public key can be derived from the underlying lattice with additional mathematical ‘noise’. The private key is mathematically related to the public key but contains additional secret information. “We don’t see any good way in which quantum computers can attack algorithms based on lattices,” said Osborne.
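
A minimal sketch of that ‘lattice plus noise’ idea, in the style of the learning-with-errors construction that underlies ML-KEM – with toy parameters chosen purely for illustration, nothing like the real standard:

```python
# Toy learning-with-errors (LWE) style key generation. Parameters are
# far too small to be secure; they only show the shape of the idea:
# public key = (A, A*s + e mod q), private key = s.
import random

q, n = 97, 4                                        # toy modulus and dimension
A = [[random.randrange(q) for _ in range(n)] for _ in range(n)]
s = [random.randrange(q) for _ in range(n)]         # secret vector
e = [random.choice([-1, 0, 1]) for _ in range(n)]   # small secret 'noise'

b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(n)]

public_key = (A, b)    # safe to publish: recovering s from (A, b) is hard at scale
private_key = s        # the additional secret information
```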

That’s for now, and for our current view of quantum computers. But we thought the same about factorization and classical computers – and then along came quantum. We asked Osborne whether other technological advances might blindside us again.

“The thing we worry about right now,” he said, “is AI. If it continues its current trajectory toward General Artificial Intelligence, and it ends up understanding mathematics better than humans do, it may be able to discover new shortcuts to decryption. We are also concerned about very clever attacks, such as side-channel attacks. A slightly more distant threat could potentially come from in-memory computation and maybe neuromorphic computing.”

Neuromorphic chips – also known as cognitive computers – hardwire AI and machine learning algorithms into an integrated circuit. They are designed to operate more like a human brain than the standard sequential von Neumann logic of classical computers. They are also inherently capable of in-memory processing, combining two of Osborne’s decryption ‘concerns’: AI and in-memory processing.

“Optical computation [also known as photonic computing] is also worth watching,” he continued. Rather than using electrical currents, optical computation leverages the properties of light. Since the speed of the latter is far greater than that of the former, optical computation offers the potential for significantly faster processing. Other properties, such as lower power consumption and less heat generation, may also become more important in the future.

So, while we are confident that quantum computers will be able to break current asymmetric encryption in the relatively near future, several other technologies could conceivably do the same. Quantum presents the greater risk: the impact would be similar for any technology able to break asymmetric algorithms, but quantum computing is likely to get there sooner, and with higher probability, than we generally realize.

It is worth noting, of course, that lattice-based algorithms will be harder to break regardless of the technology being used.

IBM’s own Quantum Development Roadmap projects the company’s first error-corrected quantum system by 2029, and a system capable of running more than one billion quantum operations by 2033.

Interestingly, there is no mention of when a cryptanalytically relevant quantum computer (CRQC) might emerge. There are two possible reasons. Firstly, asymmetric decryption is just a distressing by-product – it is not what is driving quantum development. And secondly, nobody really knows: there are too many variables involved for anyone to make such a prediction.

We asked Duncan Jones, head of cybersecurity at Quantinuum, to elaborate. “There are three issues that interweave,” he explained. “The first is that the raw power of the quantum computers being developed keeps changing pace. The second is rapid, but not consistent, improvement in error correction methods.”

Quantum hardware is inherently unstable and requires massive error correction to produce trustworthy results; currently, this requires a huge number of additional qubits. Put simply, neither the power of coming quantum computers nor the efficiency of error correction algorithms can be precisely predicted.

“The third issue,” continued Jones, “is the decryption algorithm. Quantum algorithms are not simple to develop. And while we have Shor’s algorithm, it’s not as if there is just one version of that. People have tried optimizing it in different ways. It could be in a way that requires fewer qubits but a longer running time. Or the opposite can also be true. Or there could be a different algorithm. So, all the goal posts are moving, and it would take a brave person to put a specific prediction out there.”

Nobody expects any encryption to stand forever. Whatever we use will be broken. However, the uncertainty over when, how and how often future encryption will be cracked leads us to an important part of NIST’s recommendations: crypto agility. This is the ability to rapidly switch from one (broken) algorithm to another (believed to be secure) algorithm without requiring major infrastructure changes.
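
Crypto agility is as much a software-engineering property as a cryptographic one. A minimal sketch of the pattern – the interface and registry names here are hypothetical, not any particular product – is code that signs or verifies through an abstract interface selected by name, so that a broken algorithm can be swapped out through configuration rather than a code change.

```python
# Minimal sketch of crypto agility as a software pattern; the Signer
# interface and registry are hypothetical, not a real library.
from typing import Protocol

class Signer(Protocol):
    def sign(self, message: bytes) -> bytes: ...
    def verify(self, message: bytes, signature: bytes) -> bool: ...

SIGNERS: dict[str, Signer] = {}     # implementations registered by name

def register(name: str, signer: Signer) -> None:
    SIGNERS[name] = signer

def get_signer(name: str) -> Signer:
    # Callers read the algorithm name from configuration, so replacing
    # a broken algorithm is a configuration change, not a code change.
    return SIGNERS[name]
```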

The risk equation of probability and impact is worsening. NIST has provided a solution with its PQC algorithms plus agility.

The last question we need to consider is whether PQC and agility solve the problem, or merely shunt it down the road. The probability that current asymmetric encryption can be decrypted at scale and speed is rising; and the possibility that some adversarial nation can already do so cannot be excluded. The impact would be an almost total loss of faith in the internet, plus the eventual exposure of intellectual property that adversaries have already harvested in encrypted form. Migrating to PQC as soon as possible can prevent further loss; but anything already stolen under current encryption must be considered compromised.

Since the new PQC algorithms will also eventually be broken, does migration solve the problem or simply exchange the old problem for a new one?

“I hear this a lot,” said Osborne, “but I look at it like this… If we were worried about things like that 40 years ago, we wouldn’t have the internet we have today. If we were worried that Diffie-Hellman and RSA didn’t provide absolute guaranteed security in perpetuity, we wouldn’t have today’s digital economy. We would have none of this,” he said.

The real question is whether we get enough security. The only guaranteed ‘encryption’ technology is the one-time pad – but that is unworkable in a business setting because it requires a key effectively as long as the message. The primary purpose of modern encryption algorithms is to reduce the size of required keys to a manageable length. So, given that absolute security is impossible in a workable digital economy, the real question is not are we secure, but are we secure enough?
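
For completeness, the one-time pad really is that simple – and a short sketch makes the operational problem obvious: the key must be truly random, as long as the message, never reused, and somehow shared in advance.

```python
# One-time pad sketch: information-theoretically secure, but the key is
# as long as the message, must never be reused, and must be shared in
# advance -- which is exactly why it is unworkable at business scale.
import secrets

message = b"wire transfer approved"
key = secrets.token_bytes(len(message))             # key as long as the message

ciphertext = bytes(m ^ k for m, k in zip(message, key))
recovered = bytes(c ^ k for c, k in zip(ciphertext, key))
assert recovered == message
```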

“Absolute security is not the goal,” continued Osborne. “At the end of the day, security is like an insurance; and like any insurance we need to be certain that the premiums we pay are not more expensive than the cost of a failure. This is why a lot of security that could be used by banks is not used – the cost of fraud is less than the cost of preventing that fraud.”

‘Secure enough’ equates to ‘as secure as possible’, within all the trade-offs required to maintain the digital economy. “You get this by having the best people look at the problem,” he continued. “This is something that NIST did very well with its competition. We had the world’s best people, the best cryptographers and the best mathematicians looking at the problem and developing new algorithms and trying to break them. So, I would say that short of getting the impossible, this is the best solution we’re going to get.”

Anyone who has been in this industry for more than 15 years will remember being told that current asymmetric encryption would be safe forever – or at least for longer than the projected life of the universe, or that breaking it would require more energy than exists in the universe.

How naïve: that was based on old technology. New technology changes the equation. PQC is the development of new cryptosystems to counter the new capabilities of new technology – specifically, quantum computers.

Nobody expects PQC encryption algorithms to stand forever. The hope is only that they will last long enough to be worth the risk. That’s where agility comes in. It will provide the ability to switch in new algorithms as old ones fall, with far less trouble than we have had in the past. So, if we continue to monitor the new decryption threats, and research new math to counter those threats, we will be in a stronger position than we were.

That is the silver lining to quantum decryption – it has forced us to accept that no encryption can guarantee security; but it can be used to make data safe enough, for now, to be worth the risk.

The NIST competition and the new PQC algorithms, combined with crypto-agility, can be viewed as the first step on a ladder toward more rapid, on-demand and continuous algorithm improvement. That is probably secure enough (for the immediate future at least), and it is almost certainly the best we are going to get.

Related: Post-Quantum Cryptography Firm PQShield Raises $37 Million

Related: Cyber Insights 2024: Quantum and the Cryptopocalypse

Related: Tech Giants Form Post-Quantum Cryptography Alliance

Related: US Government Publishes Guidance on Migrating to Post-Quantum Cryptography
