Posted by Marbenz Antonio on June 22, 2022
Many of our present cryptographic techniques can be broken by a new kind of computer now being developed. As a result, we must create new algorithms that are secure against such machines while still running on our present systems. This is known as “post-quantum cryptography.”
In 1981, Richard Feynman proposed a new way of analyzing quantum interactions in complicated systems. The challenge in modeling these interactions is that the state of each entangled particle must be represented as a set of probability amplitudes, and these arrays grow exponentially as particles are added. For any sufficiently large system, existing computers cannot meet the storage and time requirements.
Feynman’s solution is straightforward: construct a computer that uses entangled quantum particles to mimic the physical item of interest. A computer of this type might efficiently perform a variety of tasks, including determining how to take advantage of changing entangled quantum states.
The concept behind a quantum computer is to replace traditional bits with “qubits.” A classical bit can only be 0 or 1, whereas a qubit exists in a superposition of 0 and 1, commonly represented as a unit vector in three-dimensional space (the Bloch sphere). A qubit’s power comes from entangling many qubits with one another. If you can design an algorithm in which these qubits interfere so that paths leading to wrong answers cancel out, a measurement will quickly yield the state corresponding to your answer.
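As a rough sketch of superposition (an illustration, not the full physics): a single qubit can be modeled as a normalized pair of complex amplitudes, with measurement probabilities given by the squared magnitudes. The matrix below is the standard Hadamard gate, which puts a qubit into an equal superposition.

```python
import numpy as np

# A single qubit as a normalized pair of complex amplitudes for |0> and |1>.
zero = np.array([1, 0], dtype=complex)

# Hadamard gate: sends |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ zero

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5]: an equal chance of measuring 0 or 1
```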
When Feynman suggested quantum computers, such a machine was beyond anyone’s abilities to create, but researchers looked into not just how such a computer could be built, but also how it could be used.
In 1994, Peter Shor discovered an algorithm that, run on a quantum computer, could defeat the RSA and Diffie-Hellman encryption systems. Shor’s approach was later extended to defeat Elliptic Curve Cryptography (ECC) as well. These algorithms form the foundation of all of our public key exchange and digital signature schemes. From then on, we understood that our major public-key systems were only secure until someone built a sufficiently large, functional quantum computer.
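To see why this breaks RSA, here is a purely classical sketch of the reduction Shor used, with brute-force order finding standing in for the quantum step (the only part a quantum computer actually accelerates). The function names here are mine, for illustration:

```python
from math import gcd

def order(a, n):
    """Brute-force multiplicative order of a mod n.
    This is the step Shor's quantum circuit performs efficiently."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    """Try to find a nontrivial factor of n using the base a."""
    g = gcd(a, n)
    if g != 1:
        return g                # lucky: a already shares a factor with n
    r = order(a, n)
    if r % 2 == 1:
        return None             # odd period: retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None             # trivial square root: retry with another a
    f = gcd(y - 1, n)
    return f if 1 < f < n else None

print(shor_factor(15, 7))  # → 3, since 15 = 3 * 5
```

The brute-force `order` loop is exponential classically; Shor's insight was that a quantum Fourier transform finds the same period in polynomial time.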
With current classical algorithms possibly compromised, we will require new algorithms based on challenges that are difficult to solve with quantum computers, which is where post-quantum cryptography comes in. These algorithms operate on conventional computers and are based on issues that neither a classical nor a quantum computer can solve.
Cryptography is pervasive in today’s environment. When you enter your credit card number on the web, it is protected by an encrypted channel that depends on both a digital signature (to ensure that you are sending the credit card to the proper vendor) and a public key exchange (to agree on a set of keys used between client and server to encrypt your communication). If a sufficiently large quantum computer is created, none of the security guarantees provided by Transport Layer Security (TLS) can be relied on. Some disk encryption methods also employ public keys to enable recovery if your users forget their passwords.
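The key exchange TLS relies on is typically (Elliptic Curve) Diffie-Hellman. A toy finite-field version with a deliberately tiny prime shows the shape of the exchange that Shor’s algorithm breaks; real deployments use groups of at least 2048 bits:

```python
import secrets

# Toy parameters: a small public prime p and base g (illustration only;
# a 32-bit group is trivially breakable even on a classical computer).
p = 4294967291  # the prime 2**32 - 5
g = 2

a = secrets.randbelow(p - 2) + 1   # Alice's private exponent
b = secrets.randbelow(p - 2) + 1   # Bob's private exponent

A = pow(g, a, p)                   # Alice sends A over the network
B = pow(g, b, p)                   # Bob sends B over the network

# Both sides derive the same shared secret; an eavesdropper sees only p, g, A, B.
# Shor's algorithm recovers a or b from those public values.
assert pow(B, a, p) == pow(A, b, p)
```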
One of the most important implications for ordinary individuals is that they must recognize which of the systems they use may be vulnerable. This is especially important in enterprise IT.
Every post-quantum cryptography presentation and publication addresses this question with Mosca’s Theorem:
If the sum of the time required to migrate to the new methods (y) and the time the data must remain secret (x) exceeds the time remaining before we have a quantum computer capable of breaking our public-key algorithms (z), in other words if x + y > z, your data will be compromised before its usefulness expires. The issue is that many of these figures are uncertain.
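Mosca’s inequality is simple enough to encode directly; the function below is just a hypothetical helper to make the arithmetic concrete.

```python
def mosca_at_risk(x_secret_years, y_migration_years, z_quantum_years):
    """Mosca's inequality: data is at risk when x + y > z."""
    return x_secret_years + y_migration_years > z_quantum_years

# Credit card: secret ~3 years, 5-year migration, quantum computer in ~15 years.
print(mosca_at_risk(3, 5, 15))    # False: safe, for now

# Medical records: secret ~25 years, same migration and quantum estimates.
print(mosca_at_risk(25, 10, 15))  # True: that data is already at risk today
```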
The duration for which the secret must be kept (x) is generally determined by the application. For your credit card on the internet, for example, this may be two or three years, depending on the expiration date of your card. For medical data, though, it may be decades.
This is complicated further by the fact that certain organizations (both public and private) have begun recording TLS sessions, so even if the TLS session is brief, the data in that connection may be retained and decrypted in the future. So, if you’re doing aid work in an authoritarian country and dealing with people who could be imprisoned for cooperating with you, you probably don’t want to trust sending their names or other identifying information over a VPN or TLS connection. You have control over this time value, but you must consider how long specific data needs to stay hidden and from what kind of adversaries.
The deployment process can be time-consuming (y), beginning with standards and progressing to actual deployment. It might be decades in certain cases. For years, the cryptography community has been working on new standards proposals, which are only now expected to be standardized. The only control you have here is deciding how soon to implement algorithms and protocols on your systems once they are ready.
The greatest uncertainty at this moment is when we will have quantum computers capable of breaking our present algorithms (z). In 2015, Michele Mosca, a quantum computing researcher at the University of Waterloo, estimated a 1/7 chance that 2048-bit RSA would be vulnerable by 2026 and a 50% chance by 2031. In 2017, he revised this to a 1/6 probability that 2048-bit RSA will be compromised by 2027. The development of quantum computers has accelerated, with firms such as IBM and Google building small, experimental quantum computers to help address some of the most difficult challenges that they and their customers face.
The bottom line is that there are some things you should be concerned about right now, but others you shouldn’t until we have post-quantum algorithms in place.
The cryptographic community has been aware of these concerns for some time. The good news is that several new algorithms are available to replace our present key exchange and signature methods. The bad news is that all of the well-studied methods have significant deployment hurdles, particularly large key sizes or encrypted data/signature sizes, or both (in some cases megabits in size). The community has spent the last decade investigating some of the more promising new algorithms that do not rely on huge keys and data blobs. In future postings, I’ll go into these families in greater detail, but for now, just a few facts.
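To give a feel for the deployment hurdle, here are approximate object sizes from the published parameter sets of a few post-quantum candidates, next to their classical counterparts. Treat the figures as illustrative; consult each specification for exact numbers.

```python
# Approximate sizes in bytes, taken from the published parameter sets.
sizes_bytes = {
    "RSA-2048 public key (modulus)": 256,
    "Ed25519 public key": 32,
    "Ed25519 signature": 64,
    "Kyber-768 public key": 1184,
    "Kyber-768 ciphertext": 1088,
    "Dilithium2 public key": 1312,
    "Dilithium2 signature": 2420,
    "Classic McEliece 348864 public key": 261120,  # roughly 255 KiB
}
for name, size in sorted(sizes_bytes.items(), key=lambda kv: kv[1]):
    print(f"{size:>8}  {name}")
```

The lattice-based schemes (Kyber, Dilithium) stay within a few kilobytes, while code-based schemes like Classic McEliece illustrate the megabit-scale keys mentioned above.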
The National Institute of Standards and Technology (NIST) launched the Post-Quantum Cryptography Standardization process in 2016 to gather and assess candidate algorithms. It received 82 submissions, of which 69 were deemed complete by the end of 2017. These were reviewed through 2018; in that time some of the initial 69 algorithms were broken, and in early 2019, 26 of the original 69 were chosen for a second round of improvement and assessment. In 2020, NIST selected seven finalists and eight alternates. Three of these 15 algorithms have since been broken. The number of broken algorithms demonstrates the value of moving slowly.
We anticipate that NIST will reach a final decision in 2022 and then begin the standardization process. After that, protocols such as TLS can pick up the new algorithms and vendors can begin to deploy them.
You should first identify any current use of possibly vulnerable algorithms and verify whether they are protecting long-term data. This suggests you should reconsider your usage of:

- RSA key exchange and signatures
- Diffie-Hellman key exchange
- Elliptic Curve Cryptography (ECDH and ECDSA)
You want to:

- Inventory where these algorithms protect data that must remain secret for years (your x)
- Estimate how long migrating your systems to new algorithms will take (your y)
- Plan to adopt the post-quantum algorithms promptly once NIST standardizes them