Post-quantum cryptography
Post-quantum cryptography (PQC), sometimes referred to as quantum-proof, quantum-safe, or quantum-resistant, is the development of cryptographic algorithms (usually public-key algorithms) that are currently thought to be secure against a cryptanalytic attack by a quantum computer.[1] Most widely used public-key algorithms rely on the difficulty of one of three mathematical problems: the integer factorization problem, the discrete logarithm problem or the elliptic-curve discrete logarithm problem. All of these problems could be easily solved on a sufficiently powerful quantum computer running Shor's algorithm[2][3] or possibly alternatives.[4][5]
As of 2025, quantum computers lack the processing power to break widely used cryptographic algorithms;[6] however, because of the length of time required for migration to quantum-safe cryptography, cryptographers are already designing new algorithms to prepare for Y2Q or Q-Day, the day when current algorithms will be vulnerable to quantum computing attacks. Mosca's theorem provides the risk analysis framework that helps organizations identify how quickly they need to start migrating.
Their work has gained attention from academics and industry through the PQCrypto conference series hosted since 2006, several workshops on Quantum Safe Cryptography hosted by the European Telecommunications Standards Institute (ETSI), and the Institute for Quantum Computing.[7][8][9] The rumoured existence of widespread harvest now, decrypt later programs has also been seen as a motivation for the early introduction of post-quantum algorithms, as data recorded now may still remain sensitive many years into the future.[10][11][12]
In contrast to the threat quantum computing poses to current public-key algorithms, most current symmetric cryptographic algorithms and hash functions are considered to be relatively secure against attacks by quantum computers.[3][13] While the quantum Grover's algorithm does speed up attacks against symmetric ciphers, doubling the key size can effectively counteract these attacks.[14] Thus post-quantum symmetric cryptography does not need to differ significantly from current symmetric cryptography.
In 2024, the U.S. National Institute of Standards and Technology (NIST) released final versions of its first three Post-Quantum Cryptography Standards.[15]
Preparation
Digital infrastructures require robust cybersecurity. Cryptographic systems are vital to protect the confidentiality and authenticity of data. Quantum computing will be a threat to many of the cryptographic algorithms used to achieve these protection goals. Data that is currently not quantum-safe, whether it is stored or transmitted, and that must remain confidential for a long time, may be compromised in the future by quantum computers (“store now, decrypt later” attacks). In addition, authenticity will also be jeopardised by quantum computers. The threat that quantum computing poses to cybersecurity can be countered by a timely, comprehensive and coordinated transition to post-quantum cryptography (PQC).[16][17]
Algorithms
Post-quantum cryptography research is mostly focused on six different approaches:[3][8]
Lattice-based cryptography
This approach includes cryptographic systems such as learning with errors, ring learning with errors (ring-LWE),[18][19][20] the ring learning with errors key exchange and the ring learning with errors signature, the older NTRU or GGH encryption schemes, and the newer NTRU signature and BLISS signatures.[21] Some of these schemes, like NTRU encryption, have been studied for many years without anyone finding a feasible attack. Others, like the ring-LWE algorithms, have proofs that their security reduces to a worst-case problem.[22] The Post-Quantum Cryptography Study Group sponsored by the European Commission suggested that the Stehle–Steinfeld variant of NTRU be studied for standardization rather than the NTRU algorithm.[23][24] At that time, NTRU was still patented. Studies have indicated that NTRU may have more secure properties than other lattice-based algorithms.[25] Two lattice-based algorithms, CRYSTALS-Kyber and CRYSTALS-Dilithium, were among the first post-quantum algorithms standardised by NIST.[26]
Multivariate cryptography
This includes cryptographic systems such as the Rainbow (Unbalanced Oil and Vinegar) scheme which is based on the difficulty of solving systems of multivariate equations. Various attempts to build secure multivariate equation encryption schemes have failed. However, multivariate signature schemes like Rainbow could provide the basis for a quantum secure digital signature.[27] The Rainbow Signature Scheme is patented (the patent expires in August 2029).
Hash-based cryptography
This includes cryptographic systems such as Lamport signatures, the Merkle signature scheme, the XMSS,[28] the SPHINCS,[29] and the WOTS schemes. Hash-based digital signatures were invented in the late 1970s by Ralph Merkle and have been studied ever since as an interesting alternative to number-theoretic digital signatures like RSA and DSA. Their primary drawback is that for any hash-based public key, there is a limit on the number of signatures that can be signed using the corresponding set of private keys. This limitation reduced interest in these signatures until the desire for cryptography resistant to attack by quantum computers revived it. There appear to be no patents on the Merkle signature scheme[citation needed] and there exist many non-patented hash functions that could be used with these schemes. The stateful hash-based signature scheme XMSS, developed by a team of researchers under the direction of Johannes Buchmann, is described in RFC 8391.[30]
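The one-signature-per-key limitation of these schemes can be seen in a toy Lamport one-time signature. The sketch below is purely illustrative (not production code; all names are invented for the example): the private key holds two random preimages per message bit, and signing reveals one preimage per bit, which is why each key pair may safely sign only one message.

```python
import hashlib
import secrets

def keygen(bits=256):
    # Private key: two random 32-byte preimages per message bit.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(bits)]
    # Public key: the hash of every preimage.
    pk = [(hashlib.sha256(a).digest(), hashlib.sha256(b).digest()) for a, b in sk]
    return sk, pk

def sign(msg, sk):
    digest = hashlib.sha256(msg).digest()
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    # Reveal one preimage per bit of the message digest; signing a second
    # message would reveal more preimages and leak the private key.
    return [sk[i][b] for i, b in enumerate(bits)]

def verify(msg, sig, pk):
    digest = hashlib.sha256(msg).digest()
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return all(hashlib.sha256(s).digest() == pk[i][b]
               for i, (s, b) in enumerate(zip(sig, bits)))

sk, pk = keygen()
sig = sign(b"hello", sk)
assert verify(b"hello", sig, pk)
assert not verify(b"goodbye", sig, pk)
```

The Merkle signature scheme, XMSS, and SPHINCS all build on this idea by committing to many one-time key pairs at once.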
Note that all of the above schemes are one-time or bounded-time signatures. Moni Naor and Moti Yung invented UOWHF hashing in 1989 and designed a signature based on hashing (the Naor–Yung scheme)[31] which can be used an unlimited number of times (the first such signature that does not require trapdoor properties).
Code-based cryptography
This includes cryptographic systems which rely on error-correcting codes, such as the McEliece and Niederreiter encryption algorithms and the related Courtois, Finiasz and Sendrier signature scheme. The original McEliece cryptosystem, using random Goppa codes, has withstood scrutiny for over 40 years. However, many variants of the McEliece scheme, which seek to introduce more structure into the code used in order to reduce the size of the keys, have been shown to be insecure.[32] The Post-Quantum Cryptography Study Group sponsored by the European Commission has recommended the McEliece public key encryption system as a candidate for long term protection against attacks by quantum computers.[23] In 2025 NIST announced plans to standardize the code-based HQC encryption algorithm.[33]
Isogeny-based cryptography
These cryptographic systems rely on the properties of isogeny graphs of elliptic curves (and higher-dimensional abelian varieties) over finite fields, in particular supersingular isogeny graphs, to create cryptographic systems. Among the more well-known representatives of this field are the Diffie–Hellman-like key exchange CSIDH, which can serve as a straightforward quantum-resistant replacement for the Diffie–Hellman and elliptic curve Diffie–Hellman key-exchange methods that are in widespread use today,[34] and the signature scheme SQIsign which is based on the categorical equivalence between supersingular elliptic curves and maximal orders in particular types of quaternion algebras.[35] Another widely noticed construction, SIDH/SIKE, was spectacularly broken in 2022.[36] The attack is however specific to the SIDH/SIKE family of schemes and does not generalize to other isogeny-based constructions.[37]
Symmetric key quantum resistance
Provided one uses sufficiently large key sizes, symmetric-key cryptographic systems like AES and SNOW 3G are already resistant to attack by a quantum computer.[38] Further, key management systems and protocols that use symmetric key cryptography instead of public key cryptography, like Kerberos and the 3GPP Mobile Network Authentication Structure, are also inherently secure against attack by a quantum computer. Given its widespread deployment in the world already, some researchers recommend expanded use of Kerberos-like symmetric key management as an efficient way to get post-quantum cryptography today.[39]
Security reductions
In cryptography research, it is desirable to prove the equivalence of a cryptographic algorithm and a known hard mathematical problem. These proofs are often called "security reductions", and are used to demonstrate the difficulty of cracking the encryption algorithm. In other words, the security of a given cryptographic algorithm is reduced to the security of a known hard problem. Researchers are actively looking for security reductions in the prospects for post-quantum cryptography. Current results are given here:
Lattice-based cryptography – Ring-LWE Signature
In some versions of Ring-LWE there is a security reduction to the shortest-vector problem (SVP) in a lattice as a lower bound on the security. The SVP is known to be NP-hard.[22] Specific ring-LWE systems that have provable security reductions include a variant of Lyubashevsky's ring-LWE signatures defined in a paper by Güneysu, Lyubashevsky, and Pöppelmann.[19] The GLYPH signature scheme is a variant of the Güneysu, Lyubashevsky, and Pöppelmann (GLP) signature which takes into account research results that have come after the publication of the GLP signature in 2012. Another Ring-LWE signature is Ring-TESLA.[40] There also exists a "derandomized variant" of LWE, called Learning with Rounding (LWR), which yields "improved speedup (by eliminating sampling small errors from a Gaussian-like distribution with deterministic errors) and bandwidth".[41] While LWE utilizes the addition of a small error to conceal the lower bits, LWR utilizes rounding for the same purpose.
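The contrast between LWE's random error and LWR's deterministic rounding can be sketched with a toy sample generator. The parameters below are invented for illustration and far too small to be secure; the point is only how each construction hides the exact inner product.

```python
import random

q, p, n = 257, 16, 8   # toy moduli and dimension (NOT secure parameters)
s = [random.randrange(q) for _ in range(n)]   # secret vector

def inner(a, s):
    # Inner product of a public vector with the secret, mod q.
    return sum(x * y for x, y in zip(a, s)) % q

a = [random.randrange(q) for _ in range(n)]

# LWE: conceal the inner product by adding a small random error mod q.
e = random.randrange(-2, 3)
b_lwe = (inner(a, s) + e) % q

# LWR: conceal it deterministically by rounding down to the smaller modulus p,
# which discards the low-order bits instead of masking them with noise.
b_lwr = round(p * inner(a, s) / q) % p
```

Because LWR needs no error sampling, implementations avoid the Gaussian-like sampling step entirely, which is the source of the quoted speedup and bandwidth savings.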
Lattice-based cryptography – NTRU, BLISS
The security of the NTRU encryption scheme and the BLISS[21] signature is believed to be related to, but not provably reducible to, the closest vector problem (CVP) in a lattice. The CVP is known to be NP-hard. The Post-Quantum Cryptography Study Group sponsored by the European Commission suggested that the Stehle–Steinfeld variant of NTRU, which does have a security reduction, be studied for long term use instead of the original NTRU algorithm.[23]
Multivariate cryptography – Unbalanced oil and vinegar
Unbalanced Oil and Vinegar signature schemes are asymmetric cryptographic primitives based on multivariate polynomials over a finite field. Bulygin, Petzoldt and Buchmann have shown a reduction of generic multivariate quadratic UOV systems to the NP-hard multivariate quadratic equation solving problem.[42]
Hash-based cryptography – Merkle signature scheme
In 2005, Luis Garcia proved that there was a security reduction of Merkle Hash Tree signatures to the security of the underlying hash function. Garcia showed in his paper that if computationally one-way hash functions exist then the Merkle Hash Tree signature is provably secure.[43]
Therefore, if one used a hash function with a provable reduction of security to a known hard problem one would have a provable security reduction of the Merkle tree signature to that known hard problem.[44]
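The structure the reduction rests on is just iterated hashing: a Merkle tree commits to many one-time public keys under a single root, so forging a signature entails breaking the underlying hash. A toy root computation (illustrative only; leaf count assumed to be a power of two) looks like this:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    # Hash each leaf, then repeatedly hash adjacent pairs
    # until a single root digest remains.
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# The root commits to all four one-time public keys at once;
# changing any leaf changes the root.
keys = [b"otk0", b"otk1", b"otk2", b"otk3"]
root = merkle_root(keys)
```

The entire construction uses only the hash function, which is why its security reduces cleanly to the hash's one-wayness.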
The Post-Quantum Cryptography Study Group sponsored by the European Commission has recommended use of the Merkle signature scheme for long term security protection against quantum computers.[23]
Code-based cryptography – McEliece
The McEliece Encryption System has a security reduction to the syndrome decoding problem (SDP). The SDP is known to be NP-hard.[45] The Post-Quantum Cryptography Study Group sponsored by the European Commission has recommended the use of this cryptography for long term protection against attack by a quantum computer.[23]
Code-based cryptography – RLCE
In 2016, Wang proposed a random linear code encryption scheme, RLCE,[46] which is based on McEliece schemes. An RLCE scheme can be constructed using any linear code, such as a Reed–Solomon code, by inserting random columns into the underlying linear code generator matrix.
Supersingular elliptic curve isogeny cryptography
Security is related to the problem of constructing an isogeny between two supersingular curves with the same number of points. The most recent investigation of the difficulty of this problem, by Delfs and Galbraith, indicates that it is as hard as the inventors of the key exchange suggest.[47] There is no security reduction to a known NP-hard problem.
Comparison
One common characteristic of many post-quantum cryptography algorithms is that they require larger key sizes than commonly used "pre-quantum" public key algorithms. There are often tradeoffs to be made in key size, computational efficiency and ciphertext or signature size. The table lists some values for different schemes at a 128-bit post-quantum security level.
| Algorithm | Type | Public key | Private key | Signature |
|---|---|---|---|---|
| ML-DSA[48] | Lattice | 1,312 B | 2,560 B | 2,420 B |
| NTRU Encrypt[49] | Lattice | 766.25 B | 842.875 B | |
| Streamlined NTRU Prime[citation needed] | Lattice | 154 B | ||
| Rainbow[50] | Multivariate | 124 kB | 95 kB | |
| SPHINCS[29] | Hash Signature | 1 kB | 1 kB | 41 kB |
| SPHINCS+[51] | Hash Signature | 32 B | 64 B | 8 kB |
| BLISS-II | Lattice | 7 kB | 2 kB | 5 kB |
| GLP-Variant GLYPH Signature[19][52] | Ring-LWE | 2 kB | 0.4 kB | 1.8 kB |
| NewHope[53] | Ring-LWE | 2 kB | 2 kB | |
| Goppa-based McEliece[23] | Code-based | 1 MB | 11.5 kB | |
| Random Linear Code based encryption[54] | RLCE | 115 kB | 3 kB | |
| Quasi-cyclic MDPC-based McEliece[55] | Code-based | 1,232 B | 2,464 B | |
| SIDH[56] | Isogeny | 564 B | 48 B | |
| SIDH (compressed keys)[57] | Isogeny | 330 B | 48 B | |
| 3072-bit Discrete Log | not PQC | 384 B | 32 B | 96 B |
| 256-bit Elliptic Curve | not PQC | 32 B | 32 B | 65 B |
A practical consideration on a choice among post-quantum cryptographic algorithms is the effort required to send public keys over the internet. From this point of view, the Ring-LWE, NTRU, and SIDH algorithms provide key sizes conveniently under 1 kB, hash-signature public keys come in under 5 kB, and MDPC-based McEliece takes about 1 kB. On the other hand, Rainbow schemes require about 125 kB and Goppa-based McEliece requires a nearly 1 MB key.
Lattice-based cryptography – LWE key exchange and Ring-LWE key exchange
The fundamental idea of using LWE and Ring-LWE for key exchange was proposed and filed at the University of Cincinnati in 2011 by Jintai Ding. The basic idea comes from the associativity of matrix multiplication, and the errors are used to provide the security. The paper[58] appeared in 2012 after a provisional patent application was filed in 2012.
In 2014, Peikert[59] presented a key transport scheme following the same basic idea as Ding's, which also utilizes the idea of sending an additional 1-bit signal for rounding in Ding's construction. For somewhat greater than 128 bits of security, Singh presents a set of parameters which have 6956-bit public keys for Peikert's scheme.[60] The corresponding private key would be roughly 14,000 bits.
In 2015, an authenticated key exchange with provable forward security following the same basic idea of Ding's was presented at Eurocrypt 2015,[61] which is an extension of the HMQV[62] construction in Crypto2005. The parameters for different security levels from 80 bits to 350 bits, along with the corresponding key sizes are provided in the paper.[61]
Lattice-based cryptography – NTRU encryption
For 128 bits of security in NTRU, Hirschhorn, Hoffstein, Howgrave-Graham and Whyte recommend using a public key represented as a degree-613 polynomial with coefficients mod 2^10. This results in a public key of 6130 bits. The corresponding private key would be 6743 bits.[49]
Multivariate cryptography – Rainbow signature
For 128 bits of security and the smallest signature size in a Rainbow multivariate quadratic equation signature scheme, Petzoldt, Bulygin and Buchmann recommend using equations in GF(31) with a public key size of just over 991,000 bits, a private key of just over 740,000 bits and digital signatures which are 424 bits in length.[50]
Hash-based cryptography – Merkle signature scheme
In order to get 128 bits of security for hash-based signatures to sign 1 million messages using the fractal Merkle tree method of Naor, Shenhav and Wool, the public and private key sizes are roughly 36,000 bits in length.[63]
Code-based cryptography – McEliece
For 128 bits of security in a McEliece scheme, the European Commission's Post-Quantum Cryptography Study Group recommends using a binary Goppa code of length at least n = 6960 and dimension at least k = 5413, capable of correcting t = 119 errors. With these parameters the public key for the McEliece system will be a systematic generator matrix whose non-identity part takes k × (n − k) = 8,373,911 bits. The corresponding private key, which consists of the code support with n = 6960 elements from GF(2^13) and a generator polynomial with t = 119 coefficients from GF(2^13), will be 92,027 bits in length.[23]
The group is also investigating the use of quasi-cyclic MDPC codes of length at least n = 2^16 + 6 = 65542 and dimension at least k = 2^15 + 3 = 32771, capable of correcting t = 264 errors. With these parameters the public key for the McEliece system will be the first row of a systematic generator matrix whose non-identity part takes k = 32771 bits. The private key, a quasi-cyclic parity-check matrix with d = 274 nonzero entries on a column (or twice as many on a row), takes no more than d × 16 = 4384 bits when represented as the coordinates of the nonzero entries on the first row.
Barreto et al. recommend using a binary Goppa code of length at least n = 3307 and dimension at least k = 2515, capable of correcting t = 66 errors. With these parameters the public key for the McEliece system will be a systematic generator matrix whose non-identity part takes k × (n − k) = 1,991,880 bits.[64] The corresponding private key, which consists of the code support with n = 3307 elements from GF(2^12) and a generator polynomial with t = 66 coefficients from GF(2^12), will be 40,476 bits in length.
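The Goppa-code key-size arithmetic in the paragraphs above can be checked directly. The helper function is invented for the example: the public key is the non-identity part of a systematic generator matrix (k × (n − k) bits), and the private key is the code support (n field elements) plus the generator polynomial (t coefficients), each element taking m bits in GF(2^m).

```python
def mceliece_sizes(n: int, k: int, t: int, m: int) -> tuple[int, int]:
    # Public key: non-identity part of a systematic generator matrix.
    public_bits = k * (n - k)
    # Private key: n support elements and t polynomial coefficients,
    # each an m-bit element of GF(2^m).
    private_bits = n * m + t * m
    return public_bits, private_bits

# European Commission study group parameters: 8,373,911 and 92,027 bits.
assert mceliece_sizes(6960, 5413, 119, 13) == (8373911, 92027)
# Barreto et al. parameters: 1,991,880 and 40,476 bits.
assert mceliece_sizes(3307, 2515, 66, 12) == (1991880, 40476)
```

Both figures match the values quoted in the text.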
Supersingular elliptic curve isogeny cryptography
For 128 bits of security in the supersingular isogeny Diffie–Hellman (SIDH) method, De Feo, Jao and Plut recommend using a supersingular curve modulo a 768-bit prime. If one uses elliptic curve point compression the public key will need to be no more than 8 × 768 = 6144 bits in length.[65] A March 2016 paper by Azarderakhsh, Jao, Kalach, Koziel, and Leonardi showed how to cut the number of bits transmitted in half, which was further improved by Costello, Jao, Longa, Naehrig, Renes and Urbanik, resulting in a compressed-key version of the SIDH protocol with public keys only 2640 bits in size.[57] This makes the number of bits transmitted roughly equivalent to the non-quantum secure RSA and Diffie–Hellman at the same classical security level.[66]
Symmetric-key–based cryptography
As a general rule, for 128 bits of security in a symmetric-key–based system, one can safely use key sizes of 256 bits. The best quantum attack against arbitrary symmetric-key systems is an application of Grover's algorithm, which requires work proportional to the square root of the size of the key space. To transmit an encrypted key to a device that possesses the symmetric key necessary to decrypt that key requires roughly 256 bits as well. It is clear that symmetric-key systems offer the smallest key sizes for post-quantum cryptography.[citation needed]
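The square-root work factor translates into simple arithmetic: Grover's algorithm needs about √(2^k) = 2^(k/2) evaluations against a k-bit key, so the effective quantum security level is half the key length, and doubling the key restores the original level.

```python
def grover_effective_security(key_bits: int) -> int:
    # Grover's algorithm needs ~sqrt(2^k) = 2^(k/2) cipher evaluations,
    # halving the effective security level of a k-bit key.
    return key_bits // 2

assert grover_effective_security(128) == 64    # AES-128 -> ~64-bit quantum security
assert grover_effective_security(256) == 128   # a 256-bit key restores 128 bits
```

This is the reasoning behind the 256-bit recommendation above.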
Forward secrecy
A public-key system demonstrates a property referred to as perfect forward secrecy when it generates random public keys per session for the purposes of key agreement. This means that the compromise of one message cannot lead to the compromise of others, and also that there is not a single secret value which can lead to the compromise of multiple messages. Security experts recommend using cryptographic algorithms that support forward secrecy over those that do not.[67] The reason for this is that forward secrecy can protect against the compromise of long term private keys associated with public/private key pairs. This is viewed as a means of preventing mass surveillance by intelligence agencies.
Both the Ring-LWE key exchange and supersingular isogeny Diffie–Hellman (SIDH) key exchange can support forward secrecy in one exchange with the other party. Both Ring-LWE and SIDH can also be used without forward secrecy by creating an ElGamal-style encryption variant of Diffie–Hellman.
The other algorithms in this article, such as NTRU, do not support forward secrecy as is.
Any authenticated public key encryption system can be used to build a key exchange with forward secrecy.[68]
Open Quantum Safe project
The Open Quantum Safe (OQS) project was started in late 2016 with the goal of developing and prototyping quantum-resistant cryptography.[69][70] It aims to integrate current post-quantum schemes into one library: liboqs,[71] an open source C library for quantum-resistant cryptographic algorithms. It initially focused on key exchange algorithms but now includes several signature schemes. It provides a common API suitable for post-quantum key exchange algorithms and collects together various implementations. liboqs also includes a test harness and benchmarking routines to compare performance of post-quantum implementations. Furthermore, OQS provides integration of liboqs into OpenSSL.[72]
As of March 2023, the following key exchange algorithms are supported:[69]
As of August 2024, NIST has published three of the algorithms below as FIPS standards, with the fourth expected near the end of the year:[73]
| Algorithm | Type |
|---|---|
| FIPS-203: CRYSTALS-Kyber | ML-KEM:[74] Module Learning with Errors |
| Classic McEliece | Goppa codes |
| BIKE[75] | codes |
| HQC[76][77] | codes |
| Frodo[78][79] | Learning with errors |
| NTRU[80] | Lattice-based cryptography |
| FIPS-204: CRYSTALS-Dilithium[81][82] | ML-DSA:[83] Module Short Integer Solution |
| FIPS-206: Falcon | FN-DSA:[84] Short Integer Solution |
| FIPS-205: SPHINCS+ | SLH-DSA:[85] hash based |
Older supported versions that have been removed because of the progression of the NIST Post-Quantum Cryptography Standardization Project are:
| Algorithm | Type |
|---|---|
| BCNS15[86] | Ring learning with errors key exchange |
| NewHope[87][53] | Ring learning with errors key exchange |
| SIDH[88][89] | Supersingular isogeny key exchange |
| McBits[90] | Error-correcting codes |
Implementation
One of the main challenges in post-quantum cryptography is considered to be the implementation of potentially quantum-safe algorithms into existing systems. Tests have been done; for example, Microsoft Research has implemented PICNIC in a PKI using hardware security modules.[91] Test implementations of Google's NewHope algorithm have also been done by HSM vendors. In August 2023, Google released a FIDO2 security key implementation of an ECC/Dilithium hybrid signature scheme, done in partnership with ETH Zürich.[92]
The Signal Protocol uses Post-Quantum Extended Diffie–Hellman (PQXDH).[93]
On February 21, 2024, Apple announced that they were going to upgrade their iMessage protocol with a new PQC protocol called "PQ3", which will utilize ongoing keying.[94][95][96] Apple stated that, although quantum computers don't exist yet, they wanted to mitigate risks from future quantum computers as well as so-called "Harvest now, decrypt later" attack scenarios. Apple stated that they believe their PQ3 implementation provides protections that "surpass those in all other widely deployed messaging apps", because it utilizes ongoing keying. Apple intends to fully replace the existing iMessage protocol within all supported conversations with PQ3 by the end of 2024. Apple also defined a scale to make it easier to compare the security properties of messaging apps, with a scale represented by levels ranging from 0 to 3: 0 for no end-to-end by default, 1 for pre-quantum end-to-end by default, 2 for PQC key establishment only (e.g. PQXDH), and 3 for PQC key establishment and ongoing rekeying (PQ3).[94]
The Internet Engineering Task Force has prepared an Internet-Draft using PQC algorithms in Messaging Layer Security (MLS).[97] MLS will be used in RCS text messaging in Google Messages and Messages (Apple).
Other notable implementations include:
- bouncycastle[98]
- liboqs[99]
Hybrid encryption
Google has maintained the use of "hybrid encryption" in its use of post-quantum cryptography: whenever a relatively new post-quantum scheme is used, it is combined with a more proven, non-PQ scheme. This is to ensure that the data are not compromised even if the relatively new PQ algorithm turns out to be vulnerable to non-quantum attacks before Y2Q. This type of scheme is used in its 2016 and 2019 tests for post-quantum TLS,[100] and in its 2023 FIDO2 key.[92] Indeed, one of the algorithms used in the 2019 test, SIKE, was broken in 2022, but the non-PQ X25519 layer (already used widely in TLS) still protected the data.[100] Apple's PQ3 and Signal's PQXDH are also hybrid.[94]
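The core of a hybrid scheme is a key combiner: the session key is derived from both the classical and the post-quantum shared secrets, so an attacker must break both exchanges to recover it. The sketch below is illustrative only; the 32-byte stand-in secrets and the HMAC-SHA256 combiner are assumptions for the example, while real deployments (e.g. hybrid TLS) use carefully specified KDFs and transcripts.

```python
import hashlib
import hmac
import secrets

def combine_shared_secrets(ss_classical: bytes, ss_pq: bytes,
                           context: bytes) -> bytes:
    # Derive the session key from BOTH secrets: even if the newer
    # post-quantum scheme is later broken classically, the classical
    # secret still protects the key (and vice versa against Shor).
    return hmac.new(ss_classical + ss_pq, context, hashlib.sha256).digest()

# Stand-ins for, e.g., an X25519 shared secret and an ML-KEM shared secret.
ss_x25519 = secrets.token_bytes(32)
ss_mlkem = secrets.token_bytes(32)
session_key = combine_shared_secrets(ss_x25519, ss_mlkem, b"hybrid-demo")
```

This is the property that saved Google's 2019 test when SIKE was broken: the X25519 layer alone still protected the derived keys.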
The NSA and GCHQ argue against hybrid encryption, claiming that it adds complexity to implementation and transition. Daniel J. Bernstein, who supports hybrid encryption, argues that these claims are bogus.[100]
See also
- NIST Post-Quantum Cryptography Standardization
- Quantum cryptography – cryptography based on quantum mechanics
- Crypto-shredding – Deleting encryption keys
References
- ^ "Post-Quantum Cryptography: A New Security Paradigm for the Post-Quantum Era". Penta Security Inc. 2025-06-05. Retrieved 2025-07-10.
- ^ Shor, Peter W. (1997). "Polynomial-Time Algorithms for Prime Factorization and Discrete Logarithms on a Quantum Computer". SIAM Journal on Computing. 26 (5): 1484–1509. arXiv:quant-ph/9508027. Bibcode:1995quant.ph..8027S. doi:10.1137/S0097539795293172. S2CID 2337707.
- ^ a b c Bernstein, Daniel J. (2009). "Introduction to post-quantum cryptography" (PDF). Post-Quantum Cryptography.
- ^ Kramer, Anna (2023). "'Surprising and super cool'. Quantum algorithm offers faster way to hack internet encryption". Science. 381 (6664): 1270. doi:10.1126/science.adk9443. PMID 37733849. S2CID 262084525.
- ^ Regev, Oded (2025-02-28). "An Efficient Quantum Factoring Algorithm". Journal of the ACM. 72 (1): 1–13. arXiv:2308.06572. doi:10.1145/3708471. ISSN 0004-5411.
- ^ Gershon, Eric (2013-01-14). "New qubit control bodes well for future of quantum computing". phys.org.
- ^ Heger, Monica (2009-01-01). "Cryptographers Take On Quantum Computers". IEEE Spectrum.
- ^ a b "Q&A With Post-Quantum Computing Cryptography Researcher Jintai Ding". IEEE Spectrum. 2008-11-01.
- ^ "ETSI Quantum Safe Cryptography Workshop". ETSI Quantum Safe Cryptography Workshop. ETSI. October 2014. Archived from the original on 17 August 2016. Retrieved 24 February 2015.
- ^ Gasser, Linus (2023), Mulder, Valentin; Mermoud, Alain; Lenders, Vincent; Tellenbach, Bernhard (eds.), "Post-quantum Cryptography", Trends in Data Protection and Encryption Technologies, Cham: Springer Nature Switzerland, pp. 47–52, doi:10.1007/978-3-031-33386-6_10, ISBN 978-3-031-33386-6
- ^ Townsend, Kevin (2022-02-16). "Solving the Quantum Decryption 'Harvest Now, Decrypt Later' Problem". SecurityWeek. Retrieved 2023-04-09.
- ^ "Quantum-Safe Secure Communications" (PDF). UK National Quantum Technologies Programme. October 2021. Retrieved 2023-04-09.
- ^ Daniel J. Bernstein (2009-05-17). "Cost analysis of hash collisions: Will quantum computers make SHARCS obsolete?" (PDF).
- ^ Daniel J. Bernstein (2010-03-03). "Grover vs. McEliece" (PDF).
- ^ NIST Releases First 3 Finalized Post-Quantum Encryption Standards, NIST, August 13, 2024
- ^ "A Coordinated Implementation Roadmap for the Transition to Post-Quantum Cryptography". European Union. 2025-06-23.
- ^ "The PQC Migration Handbook". General Intelligence and Security Service. 2024-12-01.
- ^ Peikert, Chris (2014), Mosca, Michele (ed.), "Lattice Cryptography for the Internet" (PDF), Post-Quantum Cryptography, Lecture Notes in Computer Science, vol. 8772, Cham: Springer International Publishing, pp. 197–219, doi:10.1007/978-3-319-11659-4_12, ISBN 978-3-319-11658-7, retrieved 2025-07-24
- ^ a b c Güneysu, Tim; Lyubashevsky, Vadim; Pöppelmann, Thomas (2012), Prouff, Emmanuel; Schaumont, Patrick (eds.), "Practical Lattice-Based Cryptography: A Signature Scheme for Embedded Systems" (PDF), Cryptographic Hardware and Embedded Systems – CHES 2012, vol. 7428, Berlin, Heidelberg: Springer Berlin Heidelberg, pp. 530–547, doi:10.1007/978-3-642-33027-8_31, ISBN 978-3-642-33026-1, retrieved 2025-07-24
- ^ Zhang, Jiang; Zhang, Zhenfeng; Ding, Jintai; Snook, Michael; Dagdelen, Özgür (2015), Oswald, Elisabeth; Fischlin, Marc (eds.), "Authenticated Key Exchange from Ideal Lattices" (PDF), Advances in Cryptology - EUROCRYPT 2015, vol. 9057, Berlin, Heidelberg: Springer Berlin Heidelberg, pp. 719–751, doi:10.1007/978-3-662-46803-6_24, ISBN 978-3-662-46802-9, retrieved 2025-07-24
- ^ a b Ducas, Léo; Durmus, Alain; Lepoint, Tancrède; Lyubashevsky, Vadim (2013), Canetti, Ran; Garay, Juan A. (eds.), "Lattice Signatures and Bimodal Gaussians" (PDF), Advances in Cryptology – CRYPTO 2013, vol. 8042, Berlin, Heidelberg: Springer Berlin Heidelberg, pp. 40–56, doi:10.1007/978-3-642-40041-4_3, ISBN 978-3-642-40040-7, retrieved 2025-07-24
- ^ a b Lyubashevsky, Vadim; Peikert, Chris; Regev, Oded (2010), Gilbert, Henri (ed.), "On Ideal Lattices and Learning with Errors over Rings" (PDF), Advances in Cryptology – EUROCRYPT 2010, vol. 6110, Berlin, Heidelberg: Springer Berlin Heidelberg, pp. 1–23, doi:10.1007/978-3-642-13190-5_1, ISBN 978-3-642-13189-9, retrieved 2025-07-24
- ^ a b c d e f g Augot, Daniel; Batina, Lejla; Bernstein, Daniel J.; Bos, Joppe; Buchmann, Johannes; Castryck, Wouter; Dunkelman, Orr; Güneysu, Tim; Gueron, Shay; Hülsing, Andreas; Lange, Tanja; Mohamed, Mohamed Saied Emam; Rechberger, Christian; Schwabe, Peter; Sendrier, Nicolas; Vercauteren, Frederik; Yang, Bo-Yin (7 September 2015). "Initial recommendations of long-term secure post-quantum systems" (PDF). PQCRYPTO. Retrieved 13 September 2015.
- ^ Stehlé, Damien; Steinfeld, Ron (2013-01-01). "Making NTRUEncrypt and NTRUSign as Secure as Standard Worst-Case Problems over Ideal Lattices". Cryptology ePrint Archive.
- ^ Easttom, Chuck (2019-02-01). "An Analysis of Leading Lattice-Based Asymmetric Cryptographic Primitives". 2019 IEEE 9th Annual Computing and Communication Workshop and Conference (CCWC). pp. 0811–0818. doi:10.1109/CCWC.2019.8666459. ISBN 978-1-7281-0554-3. S2CID 77376310.
- ^ "NIST Releases First 3 Finalized Post-Quantum Encryption Standards". National Institute of Standards and Technology (NIST). 13 August 2024.
- ^ Ding, Jintai; Schmidt, Dieter (7 June 2005). "Rainbow, a New Multivariable Polynomial Signature Scheme". In Ioannidis, John (ed.). Applied Cryptography and Network Security. Lecture Notes in Computer Science. Vol. 3531. pp. 64–175. doi:10.1007/11496137_12. ISBN 978-3-540-26223-7. S2CID 6571152.
- ^ Buchmann, Johannes; Dahmen, Erik; Hülsing, Andreas (2011). "XMSS – A Practical Forward Secure Signature Scheme Based on Minimal Security Assumptions". Post-Quantum Cryptography. PQCrypto 2011. Lecture Notes in Computer Science. Vol. 7071. pp. 117–129. CiteSeerX 10.1.1.400.6086. doi:10.1007/978-3-642-25405-5_8. ISBN 978-3-642-25404-8. ISSN 0302-9743.
- ^ a b Bernstein, Daniel J.; Hopwood, Daira; Hülsing, Andreas; Lange, Tanja; Niederhagen, Ruben; Papachristodoulou, Louiza; Schneider, Michael; Schwabe, Peter; Wilcox-O'Hearn, Zooko (2015). "SPHINCS: Practical Stateless Hash-Based Signatures". In Oswald, Elisabeth; Fischlin, Marc (eds.). Advances in Cryptology -- EUROCRYPT 2015. Lecture Notes in Computer Science. Vol. 9056. Springer Berlin Heidelberg. pp. 368–397. CiteSeerX 10.1.1.690.6403. doi:10.1007/978-3-662-46800-5_15. ISBN 9783662467992.
- ^ Huelsing, A.; Butin, D.; Gazdag, S.; Rijneveld, J.; Mohaisen, A. (2018). "RFC 8391 – XMSS: eXtended Merkle Signature Scheme". tools.ietf.org. doi:10.17487/RFC8391.
- ^ Naor, M.; Yung, M. (1989). Universal one-way hash functions and their cryptographic applications. ACM Press. pp. 33–43. doi:10.1145/73007.73011. ISBN 978-0-89791-307-2.
- ^ Overbeck, Raphael; Sendrier (2009). "Code-based cryptography". In Bernstein, Daniel (ed.). Post-Quantum Cryptography. pp. 95–145. doi:10.1007/978-3-540-88702-7_4. ISBN 978-3-540-88701-0.
- ^ "NIST Selects HQC as Fifth Algorithm for Post-Quantum Encryption". National Institute of Standards and Technology (NIST). 11 March 2025.
- ^ Castryck, Wouter; Lange, Tanja; Martindale, Chloe; Panny, Lorenz; Renes, Joost (2018). "CSIDH: An Efficient Post-Quantum Commutative Group Action". In Peyrin, Thomas; Galbraith, Steven (eds.). Advances in Cryptology – ASIACRYPT 2018. Lecture Notes in Computer Science. Vol. 11274. Cham: Springer International Publishing. pp. 395–427. doi:10.1007/978-3-030-03332-3_15. hdl:1854/LU-8619033. ISBN 978-3-030-03332-3. S2CID 44165584.
- ^ De Feo, Luca; Kohel, David; Leroux, Antonin; Petit, Christophe; Wesolowski, Benjamin (2020). "SQISign: Compact Post-quantum Signatures from Quaternions and Isogenies". In Moriai, Shiho; Wang, Huaxiong (eds.). Advances in Cryptology – ASIACRYPT 2020. Lecture Notes in Computer Science. Vol. 12491. Cham: Springer International Publishing. pp. 64–93. doi:10.1007/978-3-030-64837-4_3. hdl:2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/318983. ISBN 978-3-030-64837-4. ISSN 0302-9743. S2CID 222265162.
- ^ Castryck, Wouter; Decru, Thomas (2023), Hazay, Carmit; Stam, Martijn (eds.), "An Efficient Key Recovery Attack on SIDH", Advances in Cryptology – EUROCRYPT 2023, vol. 14008, Cham: Springer Nature Switzerland, pp. 423–447, doi:10.1007/978-3-031-30589-4_15, ISBN 978-3-031-30588-7, S2CID 258240788, retrieved 2023-06-21
- ^ "Is SIKE broken yet?". Retrieved 2023-06-23.
- ^ Perlner, Ray A.; Cooper, David A. (2009-04-14). Quantum resistant public key cryptography: a survey. ACM. pp. 85–93. doi:10.1145/1527017.1527028. ISBN 978-1-60558-474-4.
- ^ Campagna, Matt; Hardjono, Thomas; Pintsov, Leon; Romansky, Brian; Yu, Taylor (2013). "Kerberos Revisited Quantum-Safe Authentication" (PDF). ETSI.
- ^ Akleylek, Sedat; Bindel, Nina; Buchmann, Johannes; Krämer, Juliane; Marson, Giorgia Azzurra (2016), Pointcheval, David; Nitaj, Abderrahmane; Rachidi, Tajjeeddine (eds.), "An Efficient Lattice-Based Signature Scheme with Provably Secure Instantiation", Progress in Cryptology – AFRICACRYPT 2016, vol. 9646, Cham: Springer International Publishing, pp. 44–60, doi:10.1007/978-3-319-31517-1_3, ISBN 978-3-319-31516-4, retrieved 2025-07-27
- ^ Nejatollahi, Hamid; Dutt, Nikil; Ray, Sandip; Regazzoni, Francesco; Banerjee, Indranil; Cammarota, Rosario (2019-02-27). "Post-Quantum Lattice-Based Cryptography Implementations: A Survey". ACM Computing Surveys. 51 (6): 1–41. doi:10.1145/3292548. ISSN 0360-0300. S2CID 59337649.
- ^ Bulygin, Stanislav; Petzoldt; Buchmann (2010). "Towards Provable Security of the Unbalanced Oil and Vinegar Signature Scheme under Direct Attacks". Progress in Cryptology – INDOCRYPT 2010. Lecture Notes in Computer Science. Vol. 6498. pp. 17–32. CiteSeerX 10.1.1.294.3105. doi:10.1007/978-3-642-17401-8_3. ISBN 978-3-642-17400-1.
- ^ Pereira, Geovandro; Puodzius, Cassius; Barreto, Paulo (2016). "Shorter hash-based signatures". Journal of Systems and Software. 116: 95–100. doi:10.1016/j.jss.2015.07.007.
- ^ Garcia, Luis. "On the security and the efficiency of the Merkle signature scheme" (PDF). Cryptology ePrint Archive. IACR. Retrieved 19 June 2013.
- ^ Blaum, Mario; Farrell; Tilborg (31 May 2002). Information, Coding and Mathematics. Springer. doi:10.1007/978-1-4757-3585-7. ISBN 978-1-4757-3585-7.
- ^ Wang, Yongge (2016). "Quantum resistant random linear code based public key encryption scheme RLCE". 2016 IEEE International Symposium on Information Theory (ISIT). pp. 2519–2523. arXiv:1512.08454. Bibcode:2015arXiv151208454W. doi:10.1109/ISIT.2016.7541753. ISBN 978-1-5090-1806-2.
- ^ Delfs, Christina; Galbraith, Steven D. (February 2016). "Computing isogenies between supersingular elliptic curves over F_p". Designs, Codes and Cryptography. 78 (2): 425–440. arXiv:1310.7789. doi:10.1007/s10623-014-0010-1. ISSN 0925-1022.
- ^ National Institute of Standards and Technology (2024-08-13). Module-Lattice-Based Digital Signature Standard (PDF) (Report). Gaithersburg, MD: National Institute of Standards and Technology. doi:10.6028/nist.fips.204.
- ^ a b Hirschhorn, Philip S.; Hoffstein, Jeffrey; Howgrave-Graham, Nick; Whyte, William (2009), Abdalla, Michel; Pointcheval, David; Fouque, Pierre-Alain; Vergnaud, Damien (eds.), "Choosing NTRUEncrypt Parameters in Light of Combined Lattice Reduction and MITM Approaches", Applied Cryptography and Network Security, vol. 5536, Berlin, Heidelberg: Springer Berlin Heidelberg, pp. 437–455, doi:10.1007/978-3-642-01957-9_27, ISBN 978-3-642-01956-2, retrieved 2025-08-07
- ^ a b Petzoldt, Albrecht; Bulygin; Buchmann (2010). "Selecting Parameters for the Rainbow Signature Scheme – Extended Version -" (PDF). Archived (PDF) from the original on 4 March 2016. Retrieved 12 May 2014.
- ^ Bernstein, Daniel J.; Dobraunig, Christoph; Eichlseder, Maria; Fluhrer, Scott; Gazdag, Stefan-Lukas; Hülsing, Andreas; Kampanakis, Panos; Kölbl, Stefan; Lange, Tanja; Lauridsen, Martin M.; Mendel, Florian; Niederhagen, Ruben; Rechberger, Christian; Rijneveld, Joost; Schwabe, Peter (November 30, 2017). "SPHINCS+: Submission to the NIST post-quantum project" (PDF).
- ^ Chopra, Arjun (2017). "GLYPH: A New Instantiation of the GLP Digital Signature Scheme". Cryptology ePrint Archive.
- ^ a b Alkim, Erdem; Ducas, Léo; Pöppelmann, Thomas; Schwabe, Peter (2015). "Post-quantum key exchange – a new hope" (PDF). Cryptology ePrint Archive, Report 2015/1092. Retrieved 1 September 2017.
- ^ Wang, Yongge (2017). "Revised Quantum Resistant Public Key Encryption Scheme RLCE and IND-CCA2 Security for McEliece Schemes". Cryptology ePrint Archive.
- ^ Misoczki, R.; Tillich, J. P.; Sendrier, N.; Barreto, P. S. L. M. (2013). "MDPC-McEliece: New McEliece variants from Moderate Density Parity-Check codes". 2013 IEEE International Symposium on Information Theory. pp. 2069–2073. CiteSeerX 10.1.1.259.9109. doi:10.1109/ISIT.2013.6620590. ISBN 978-1-4799-0446-4. S2CID 9485532.
- ^ Costello, Craig; Longa, Patrick; Naehrig, Michael (2016). "Efficient Algorithms for Supersingular Isogeny Diffie–Hellman" (PDF). Advances in Cryptology – CRYPTO 2016. Lecture Notes in Computer Science. Vol. 9814. pp. 572–601. doi:10.1007/978-3-662-53018-4_21. ISBN 978-3-662-53017-7.
- ^ a b Costello, Craig; Jao; Longa; Naehrig; Renes; Urbanik. "Efficient Compression of SIDH public keys". Retrieved 8 October 2016.
- ^ Ding, Jintai; Xie, Xiang; Lin, Xiaodong (2012-01-01). "A Simple Provably Secure Key Exchange Scheme Based on the Learning with Errors Problem". Cryptology ePrint Archive.
- ^ Peikert, Chris (2014-01-01). "Lattice Cryptography for the Internet". Cryptology ePrint Archive.
- ^ Singh, Vikram (2015). "A Practical Key Exchange for the Internet using Lattice Cryptography". Cryptology ePrint Archive. Retrieved 2015-04-18.
- ^ a b Zhang, Jiang; Zhang, Zhenfeng; Ding, Jintai; Snook, Michael; Dagdelen, Özgür (2015-04-26). "Authenticated Key Exchange from Ideal Lattices". In Oswald, Elisabeth; Fischlin, Marc (eds.). Advances in Cryptology – EUROCRYPT 2015. Lecture Notes in Computer Science. Vol. 9057. Springer Berlin Heidelberg. pp. 719–751. CiteSeerX 10.1.1.649.1864. doi:10.1007/978-3-662-46803-6_24. ISBN 978-3-662-46802-9.
- ^ Krawczyk, Hugo (2005-08-14). "HMQV: A High-Performance Secure Diffie–Hellman Protocol". In Shoup, Victor (ed.). Advances in Cryptology – CRYPTO 2005. Lecture Notes in Computer Science. Vol. 3621. Springer. pp. 546–566. doi:10.1007/11535218_33. ISBN 978-3-540-28114-6.
- ^ Naor, Dalit; Shenhav, Amir; Wool, Avishai (November 2006). "One-Time Signatures Revisited: Practical Fast Signatures Using Fractal Merkle Tree Traversal". 2006 IEEE 24th Convention of Electrical & Electronics Engineers in Israel. IEEE. pp. 255–259. doi:10.1109/EEEI.2006.321066. ISBN 978-1-4244-0229-8.
- ^ Barreto, Paulo S. L. M.; Biasi, Felipe Piazza; Dahab, Ricardo; López-Hernández, Julio César; Morais, Eduardo M. de; Oliveira, Ana D. Salina de; Pereira, Geovandro C. C. F.; Ricardini, Jefferson E. (2014). Koç, Çetin Kaya (ed.). A Panorama of Post-quantum Cryptography. Springer International Publishing. pp. 387–439. doi:10.1007/978-3-319-10683-0_16. ISBN 978-3-319-10682-3.
- ^ De Feo, Luca; Jao; Plut (2011). "Towards Quantum-Resistant Cryptosystems From Supersingular Elliptic Curve Isogenies" (PDF). Archived (PDF) from the original on 11 February 2014. Retrieved 12 May 2014.
- ^ Azarderakhsh, Reza; Jao, David; Kalach, Kassem; Koziel, Brian; Leonardi, Christopher. "Key Compression for Isogeny-Based Cryptosystems". eprint.iacr.org. Retrieved 2016-03-02.
- ^ Ristic, Ivan (2013-06-25). "Deploying Forward Secrecy". SSL Labs. Retrieved 14 June 2014.
- ^ "Does NTRU provide Perfect Forward Secrecy?". crypto.stackexchange.com.
- ^ a b "Open Quantum Safe". openquantumsafe.org.
- ^ Stebila, Douglas; Mosca, Michele. "Post-Quantum Key Exchange for the Internet and the Open Quantum Safe Project". Cryptology ePrint Archive, Report 2016/1017, 2016. Retrieved 9 April 2017.
- ^ "liboqs: C library for quantum-resistant cryptographic algorithms". 26 November 2017 – via GitHub.
- ^ "oqsprovider: Open Quantum Safe provider for OpenSSL (3.x)". 12 August 2024 – via GitHub.
- ^ "NIST Releases First 3 Finalized Post-Quantum Encryption Standards". NIST. 13 August 2024.
- ^ "Module-Lattice-Based Key-Encapsulation Mechanism Standard". 2024. doi:10.6028/NIST.FIPS.203.
- ^ "BIKE – Bit Flipping Key Encapsulation". bikesuite.org. Retrieved 2023-08-21.
- ^ "HQC". pqc-hqc.org. Retrieved 2023-08-21.
- ^ "Fast and Efficient Hardware Implementation of HQC" (PDF).
- ^ Bos, Joppe; Costello, Craig; Ducas, Léo; Mironov, Ilya; Naehrig, Michael; Nikolaenko, Valeria; Raghunathan, Ananth; Stebila, Douglas (2016-01-01). "Frodo: Take off the ring! Practical, Quantum-Secure Key Exchange from LWE". Cryptology ePrint Archive.
- ^ "FrodoKEM". frodokem.org. Retrieved 2023-08-21.
- ^ "NTRUOpenSourceProject/NTRUEncrypt". GitHub. Retrieved 2017-04-10.
- ^ Schwabe, Peter. "Dilithium". pq-crystals.org. Retrieved 2023-08-19.
- ^ "Cryptographic Suite for Algebraic Lattices, Digital Signature: Dilithium" (PDF).
- ^ "Module-Lattice-Based Digital Signature Standard". 2024. doi:10.6028/NIST.FIPS.204.
- ^ "NIST Releases First 3 Finalized Post-Quantum Encryption Standards". NIST. 13 August 2024.
- ^ "Stateless Hash-Based Digital Signature Standard". 2024. doi:10.6028/NIST.FIPS.205.
- ^ Stebila, Douglas (26 Mar 2018). "liboqs nist-branch algorithm datasheet: kem_newhopenist". GitHub. Retrieved 27 September 2018.
- ^ "Lattice Cryptography Library". Microsoft Research. 19 Apr 2016. Retrieved 27 September 2018.
- ^ "SIDH Library – Microsoft Research". Microsoft Research. Retrieved 2017-04-10.
- ^ Feo, Luca De; Jao, David; Plût, Jérôme (2011-01-01). "Towards quantum-resistant cryptosystems from supersingular elliptic curve isogenies". Cryptology ePrint Archive. PQCrypto 2011. Archived from the original on 2014-05-03.
- ^ Bernstein, Daniel J.; Chou, Tung; Schwabe, Peter (2015-01-01). "McBits: fast constant-time code-based cryptography". Cryptology ePrint Archive.
- ^ "Microsoft/Picnic" (PDF). GitHub. Retrieved 2018-06-27.
- ^ a b "Toward Quantum Resilient Security Keys". Google Online Security Blog. Retrieved 2023-08-19.
- ^ Ehren Kret, Rolfe Schmidt (September 19, 2023). "Quantum Resistance and the Signal Protocol".
- ^ a b c Apple Security Engineering and Architecture (SEAR) (February 21, 2024). "iMessage with PQ3: The new state of the art in quantum-secure messaging at scale". Apple Security Research. Apple Inc. Retrieved 2024-02-22. "With compromise-resilient encryption and extensive defenses against even highly sophisticated quantum attacks, PQ3 is the first messaging protocol to reach what we call Level 3 security — providing protocol protections that surpass those in all other widely deployed messaging apps."
- ^ Rossignol, Joe (February 21, 2024). "Apple Announces 'Groundbreaking' New Security Protocol for iMessage". MacRumors. Retrieved 2024-02-22.
- ^ Potuck, Michael (February 21, 2024). "Apple launching quantum computer protection for iMessage with iOS 17.4, here's what that means". 9to5Mac. Retrieved 2024-02-22.
- ^ Mahy, Rohan; Barnes, Richard (2025-03-03). ML-KEM and Hybrid Cipher Suites for Messaging Layer Security (Report). Internet Engineering Task Force.
- ^ "Bouncy Castle Betas".
- ^ "Open Quantum Safe".
- ^ a b c Bernstein, Daniel J. (2024-01-02). "Double encryption: Analyzing the NSA/GCHQ arguments against hybrids. #nsa #quantification #risks #complexity #costs".
Further reading
- The PQXDH Key Agreement Protocol Specification
- Post-Quantum Cryptography. Springer. 2008. p. 245. ISBN 978-3-540-88701-0.
- Isogenies in a Quantum World Archived 2014-05-02 at the Wayback Machine
- On Ideal Lattices and Learning With Errors Over Rings
- Kerberos Revisited: Quantum-Safe Authentication
- The picnic signature scheme
- Buchmann, Johannes A.; Butin, Denis; Göpfert, Florian; Petzoldt, Albrecht (2016). "Post-Quantum Cryptography: State of the Art". The New Codebreakers: Essays Dedicated to David Kahn on the Occasion of His 85th Birthday. Springer. pp. 88–108. doi:10.1007/978-3-662-49301-4_6. ISBN 978-3-662-49301-4.
- Bernstein, Daniel J.; Lange, Tanja (2017). "Post-quantum cryptography". Nature. 549 (7671): 188–194. Bibcode:2017Natur.549..188B. doi:10.1038/nature23461. PMID 28905891.
- Kumar, Manoj; Pattnaik, Pratap (2020). "Post Quantum Cryptography(PQC) - an overview: (Invited Paper)". 2020 IEEE High Performance Extreme Computing Conference (HPEC). pp. 1–9. doi:10.1109/HPEC43674.2020.9286147. ISBN 978-1-7281-9219-2.
- Campagna, Matt; LaMacchia, Brian; Ott, David (2021). "Post Quantum Cryptography: Readiness Challenges and the Approaching Storm". Computing Community Consortium. arXiv:2101.01269.
- Yalamuri, Gagan; Honnavalli, Prasad; Eswaran, Sivaraman (2022). "A Review of the Present Cryptographic Arsenal to Deal with Post-Quantum Threats". Procedia Computer Science. 215: 834–845. doi:10.1016/j.procs.2022.12.086.
- Bavdekar, Ritik; Chopde, Eashan Jayant; Bhatia, Ashutosh; Tiwari, Kamlesh; Daniel, Sandeep Joshua (2022). "Post Quantum Cryptography: Techniques, Challenges, Standardization, and Directions for Future Research". arXiv:2202.02826 [cs.CR].
- Joseph, David; Misoczki, Rafael; Manzano, Marc; Tricot, Joe; Pinuaga, Fernando Dominguez; Lacombe, Olivier; Leichenauer, Stefan; Hidary, Jack; Venables, Phil; Hansen, Royal (2022). "Transitioning organizations to post-quantum cryptography". Nature. 605 (7909): 237–243. Bibcode:2022Natur.605..237J. doi:10.1038/s41586-022-04623-2. PMID 35546191.
- Richter, Maximilian; Bertram, Magdalena; Seidensticker, Jasper; Tschache, Alexander (2022). "A Mathematical Perspective on Post-Quantum Cryptography". Mathematics. 10 (15): 2579. doi:10.3390/math10152579.
- Li, Silong; Chen, Yuxiang; Chen, Lin; Liao, Jing; Kuang, Chanchan; Li, Kuanching; Liang, Wei; Xiong, Naixue (2023). "Post-Quantum Security: Opportunities and Challenges". Sensors. 23 (21): 8744. Bibcode:2023Senso..23.8744L. doi:10.3390/s23218744. PMC 10648643. PMID 37960442.
- Dam, Duc-Thuan; Tran, Thai-Ha; Hoang, Van-Phuc; Pham, Cong-Kha; Hoang, Trong-Thuc (2023). "A Survey of Post-Quantum Cryptography: Start of a New Race". Cryptography. 7 (3): 40. doi:10.3390/cryptography7030040.
- Bavdekar, Ritik; Jayant Chopde, Eashan; Agrawal, Ankit; Bhatia, Ashutosh; Tiwari, Kamlesh (2023). "Post Quantum Cryptography: A Review of Techniques, Challenges and Standardizations". 2023 International Conference on Information Networking (ICOIN). pp. 146–151. doi:10.1109/ICOIN56518.2023.10048976. ISBN 978-1-6654-6268-6.
- Sood, Neerav (2024). "Cryptography in Post Quantum Computing Era". SSRN Electronic Journal. doi:10.2139/ssrn.4705470.
- Rawal, Bharat S.; Curry, Peter J. (2024). "Challenges and opportunities on the horizon of post-quantum cryptography". APL Quantum. 1 (2) 026110. doi:10.1063/5.0198344.
- Bagirovs, Emils; Provodin, Grigory; Sipola, Tuomo; Hautamäki, Jari (2024). "Applications of Post-quantum Cryptography". European Conference on Cyber Warfare and Security. 23 (1): 49–57. arXiv:2406.13258. doi:10.34190/eccws.23.1.2247.
- Mamatha, G S; Dimri, Namya; Sinha, Rasha (2024). "Post-Quantum Cryptography: Securing Digital Communication in the Quantum Era". arXiv:2403.11741 [cs.CR].
- Singh, Balvinder; Ahateshaam, Md; Lahiri, Abhisweta; Sagar, Anil Kumar (2024). "Future of Cryptography in the Era of Quantum Computing". Innovations in Electrical and Electronic Engineering. Lecture Notes in Electrical Engineering. Vol. 1115. pp. 13–31. doi:10.1007/978-981-99-8661-3_2. ISBN 978-981-99-8660-6.
Background
Definition and Motivation
Post-quantum cryptography refers to the development of cryptographic algorithms and protocols that remain secure against computational attacks by both classical and quantum computers. Unlike symmetric-key algorithms, which are only mildly affected by quantum speedups, the focus is primarily on public-key systems vulnerable to quantum algorithms, such as those relying on integer factorization or discrete logarithms. Classical public-key cryptography, including RSA introduced in 1978 and elliptic curve cryptography (ECC) proposed in 1985, enables secure key exchange and digital signatures without pre-shared secrets but lacks resistance to quantum threats.[1][2][13]

The roots of post-quantum cryptography trace back to early proposals predating widespread awareness of quantum risks, such as the McEliece cryptosystem introduced in 1978, which uses error-correcting codes for public-key encryption. The field formalized following Peter Shor's 1994 algorithm, which showed that quantum computers could efficiently solve problems underlying systems like RSA and ECC. The term "post-quantum cryptography" was coined by Daniel J. Bernstein in the early 2000s, spurring organized research efforts, including the first International Workshop on Post-Quantum Cryptography in 2006.[14][15][16]

The urgency stems from advancing quantum technology, which threatens to undermine decades of cryptographic infrastructure protecting communications, financial transactions, and national security data. Without migration to quantum-resistant alternatives, current encryption could be retroactively compromised, exemplified by the "harvest now, decrypt later" threat, where encrypted data is stored today for future quantum decryption. Standardization efforts, led by organizations like NIST since 2016, aim to ensure long-term security for information with extended confidentiality needs.[1][2]

Quantum Computing Threats
Shor's algorithm, developed by Peter Shor in 1994, provides a polynomial-time quantum solution to the integer factorization problem and the discrete logarithm problem, which underpin the security of widely used public-key cryptosystems. The algorithm begins by selecting a random number a and computing the function f(x) = aˣ mod N, whose period it seeks to find; this period-finding subroutine leverages quantum superposition to evaluate the function across multiple inputs simultaneously. A key step involves applying the quantum Fourier transform to the resulting quantum state, which efficiently extracts the period by identifying periodic patterns in the frequency domain, enabling the subsequent classical computation of factors or logarithms. Optimized versions of this approach achieve a complexity of approximately O(n² log n log log n) for an n-bit number, rendering it exponentially faster than the best known classical algorithms.[17][18]

The implications of Shor's algorithm are profound for current public-key cryptography: it can break RSA encryption by factoring large semiprime moduli, compromise Diffie-Hellman key exchange and DSA signatures by solving discrete logarithms in finite fields, and undermine elliptic curve cryptography (ECC) by addressing the elliptic curve discrete logarithm problem. For instance, a sufficiently powerful quantum computer running Shor's algorithm could decrypt RSA-2048 keys, which are standard in secure communications today. These vulnerabilities arise because the hardness assumptions of these systems—large-number factorization and discrete logarithms—collapse under quantum polynomial-time attacks.

Grover's algorithm, introduced by Lov Grover in 1996, offers a quadratic speedup for unstructured search problems, reducing the time complexity from O(N) to O(√N) queries on a database of size N.
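The classical scaffolding around the period-finding step can be sketched in a few lines. In this illustration a brute-force search stands in for the quantum subroutine; the factor-recovery arithmetic afterwards is the classical reduction Shor's algorithm relies on:

```python
from math import gcd

def find_period(a, N):
    """Brute-force period r of f(x) = a^x mod N -- the step a quantum
    computer performs efficiently via the quantum Fourier transform."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical_part(a, N):
    """Recover nontrivial factors of N from the period of a mod N."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g  # lucky guess: a already shares a factor
    r = find_period(a, N)
    if r % 2 == 1:
        return None  # odd period: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None  # trivial square root: retry with another a
    return gcd(y - 1, N), gcd(y + 1, N)

# Factor N = 15 with base a = 7: the period is 4, so
# gcd(7^2 - 1, 15) = 3 and gcd(7^2 + 1, 15) = 5.
print(shor_classical_part(7, 15))  # → (3, 5)
```

Only `find_period` needs a quantum computer at cryptographic sizes; everything else runs classically.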
In cryptographic contexts, this enables faster brute-force attacks on symmetric ciphers and hash functions by effectively halving their security levels; for example, AES-256, which provides 256 bits of classical security, would be reduced to approximately 128 bits against Grover's search, while SHA-256 hashing would face similar degradation for preimage attacks.

A specific application is to Bitcoin address preimage attacks, where Grover's algorithm provides a quadratic speedup for finding a public key that hashes to the target 20-byte RIPEMD-160(SHA-256(compressed public key)) value, reducing the classical brute-force requirement of ~2¹⁶⁰ operations to ~2⁸⁰ oracle queries, plus overhead for reversible quantum circuits simulating SHA-256 and RIPEMD-160. Once such a public key is found, Shor's algorithm could then derive the corresponding private key. However, this remains impractical due to the need for millions of logical qubits with error correction requiring billions of physical qubits, and the execution of 2⁸⁰ complex iterations on a fault-tolerant quantum computer, far beyond current projections; the effective security drops to ~80 bits for RIPEMD-160 but is still considered secure post-quantum given hardware bottlenecks in scale and coherence.

To mitigate these threats, the Bitcoin developer community is preparing upgrades via soft forks to incorporate post-quantum secure signatures based on NIST-standardized algorithms, though full migration in the decentralized network could take 5–10 years, requiring users to actively transfer coins to new quantum-resistant addresses; further details on these migration challenges are discussed in the Migration Challenges section.
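The security-halving rule is simple arithmetic: searching a keyspace of size 2^k takes about 2^(k/2) Grover iterations. A minimal sketch:

```python
import math

def grover_queries(keybits: int) -> int:
    """Approximate Grover iterations to search a space of 2**keybits keys:
    about the square root of the classical worst case."""
    return math.isqrt(2 ** keybits)

def post_quantum_bits(keybits: int) -> int:
    """Effective security level in bits against Grover's algorithm."""
    return keybits // 2

for cipher, k in [("AES-128", 128), ("AES-256", 256)]:
    print(f"{cipher}: classical 2^{k}, Grover ~2^{post_quantum_bits(k)} queries")
# AES-128: classical 2^128, Grover ~2^64 queries
# AES-256: classical 2^256, Grover ~2^128 queries
```

This is why doubling the key length (AES-128 to AES-256) restores the original 128-bit margin.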
Unlike Shor's exponential advantage, Grover's impact is more modest but still requires countermeasures, such as increasing key sizes to restore security margins.[19][20][21][22][23][24]

Breaking practical cryptosystems with these algorithms demands substantial quantum resources, highlighting the current gap between theory and implementation. Achieving reliable logical qubits, essential for fault tolerance, entails large overheads: earlier error-corrected estimates for RSA-2048 called for around 10,000 logical qubits, translating to 20 million or more noisy physical qubits under the error-correction schemes then assumed. More recent estimates indicate that factoring a 2048-bit RSA integer using an optimized version of Shor's algorithm could require around 1,500 logical qubits, achievable with fewer than one million noisy physical qubits using advanced codes like yoked surface codes, with over 100x fewer Toffoli gates than prior constructions and a runtime of less than one week, assuming a 0.1% gate error rate and surface-code error correction. Grover's algorithm for AES-256 key search requires only around 1,300 logical qubits but 2¹²⁸ iterations, demanding significantly more overall resources than Shor's algorithm due to the repeated executions, with runtimes estimated from days to over 100,000 years based on circuit depth and hardware cycle times.[25]

Progress toward these capabilities is accelerating, as evidenced by IBM's 2025 quantum roadmap, which emphasizes demonstrations of error-correction codes and hybrid quantum-HPC algorithms to scale toward fault-tolerant systems.
IBM plans to deploy architectures like Quantum Starling, targeting around 200 logical qubits by integrating advanced qLDPC codes that reduce physical qubit overhead by up to 90%, with milestones including real-time error handling on classical chips. Current quantum hardware, such as IBM's systems with hundreds of noisy qubits, remains far from cryptographically relevant scales, but projections suggest viable threats within a decade if scaling continues.[26][27]

Despite this progress, as of 2026 quantum computers are not expected to break SHA-256 or ECDSA. Current quantum hardware consists of noisy, intermediate-scale devices with ~1,000 physical qubits and high error rates, far from the fault-tolerant, large-scale systems needed. Shor's algorithm could theoretically break ECDSA but requires thousands of logical qubits (millions of physical qubits with error-correction overhead). Grover's algorithm offers only a quadratic speedup for SHA-256, leaving it practically secure (2¹²⁸ operations still infeasible). Expert consensus places cryptographically relevant quantum computers (CRQC) in the 2030s or later, with no credible projections for 2026.[1][6][25][16]

The primary risk from quantum computing falls on public-key systems, which Shor's algorithm can shatter entirely, necessitating immediate migration to post-quantum alternatives for protocols like TLS. Symmetric cryptography faces a lesser but actionable threat from Grover's algorithm, where doubling key lengths (e.g., from AES-128 to AES-256) can mitigate the quadratic speedup and maintain security. Hash functions require analogous adjustments for collision and preimage resistance, underscoring the need for proactive hardening across cryptographic primitives.[6]

Cryptographic Primitives
Public-Key Encryption and Key Encapsulation
Public-key encryption schemes in post-quantum cryptography (PQC) are designed to ensure confidentiality of data transmitted over public channels, protecting against decryption by adversaries equipped with quantum computers. These schemes replace classical public-key encryption methods, such as RSA, which are vulnerable to Shor's algorithm, which efficiently factors large integers and solves discrete logarithms on quantum hardware.[1] In PQC, encryption primitives maintain semantic security while accommodating the computational threats posed by quantum algorithms.

Key encapsulation mechanisms (KEMs) serve a complementary role in PQC by enabling secure key exchange between parties, facilitating the establishment of shared symmetric keys for protocols like TLS without direct transmission of the key material. KEMs encapsulate a randomly generated secret key within a ciphertext that only the intended recipient can decapsulate using their private key, ensuring forward secrecy and resistance to quantum attacks on classical key exchange methods like ECDH.[28] This is critical for hybrid cryptographic systems where PQC KEMs integrate with existing symmetric ciphers to upgrade internet security standards.[29]

PQC public-key encryption typically achieves IND-CCA2 security, which protects against adaptive chosen-ciphertext attacks by ensuring that even if an adversary can query a decryption oracle (except for the target ciphertext), they cannot learn the plaintext. For KEMs, the base construction often provides IND-CPA security, which is then elevated to IND-CCA security via the Fujisaki-Okamoto (FO) transform, a hybrid encryption technique that combines the public-key scheme with a symmetric cipher and random oracle to bind the message and randomness securely.[30] The FO transform, originally proposed in 1999 and refined in subsequent works, allows conversion of weakly secure primitives into strongly secure ones, making it a foundational tool for PQC KEM standardization.
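The FO pattern can be illustrated end to end with a deliberately tiny Regev-style LWE scheme. All parameters here are toy values chosen for readability, not drawn from any standard, and the scheme is insecure at this size; real KEMs such as ML-KEM use structured module lattices and also bind the ciphertext into the key derivation:

```python
import hashlib, secrets

# Toy LWE parameters (illustration only -- far too small to be secure).
N_DIM, M_DIM, Q = 16, 32, 257

def _dot(u, v):
    return sum(x * y for x, y in zip(u, v)) % Q

def keygen():
    A = [[secrets.randbelow(Q) for _ in range(N_DIM)] for _ in range(M_DIM)]
    s = [secrets.randbelow(Q) for _ in range(N_DIM)]
    e = [secrets.randbelow(3) - 1 for _ in range(M_DIM)]  # small noise in {-1,0,1}
    b = [(_dot(row, s) + ei) % Q for row, ei in zip(A, e)]
    return (A, b), s  # public key (A, b = A*s + e), secret key s

def enc_bit(pk, bit, coins):
    """Deterministic encryption of one bit given explicit coins
    (a binary subset-selection vector), as the FO transform requires."""
    A, b = pk
    u = [sum(A[i][j] * coins[i] for i in range(M_DIM)) % Q for j in range(N_DIM)]
    v = (sum(b[i] * coins[i] for i in range(M_DIM)) + bit * (Q // 2)) % Q
    return u, v

def dec_bit(sk, ct):
    u, v = ct
    d = (v - _dot(u, sk)) % Q          # = noise + bit * Q//2
    return 1 if Q // 4 < d < 3 * Q // 4 else 0

def _coins(seed, i):
    h = hashlib.sha256(seed + bytes([i])).digest()
    return [(h[k // 8] >> (k % 8)) & 1 for k in range(M_DIM)]

def encaps(pk):
    """FO-style encapsulation: coins and key are derived from m, so the
    recipient can re-encrypt and check the ciphertext."""
    m = secrets.token_bytes(1)  # toy 8-bit message
    seed = hashlib.sha256(b"coins" + m).digest()
    bits = [(m[0] >> i) & 1 for i in range(8)]
    cts = [enc_bit(pk, bit, _coins(seed, i)) for i, bit in enumerate(bits)]
    return cts, hashlib.sha256(b"key" + m).digest()

def decaps(pk, sk, cts):
    bits = [dec_bit(sk, ct) for ct in cts]
    m = bytes([sum(b << i for i, b in enumerate(bits))])
    seed = hashlib.sha256(b"coins" + m).digest()
    # FO re-encryption check: reject any ciphertext not honestly generated.
    if [enc_bit(pk, b, _coins(seed, i)) for i, b in enumerate(bits)] != cts:
        return None
    return hashlib.sha256(b"key" + m).digest()

pk, sk = keygen()
ct, K1 = encaps(pk)
assert decaps(pk, sk, ct) == K1  # both sides share the same 256-bit key
```

Decryption works because v − u·s equals the small accumulated noise plus bit·(Q//2), and the noise (at most 32 here) stays below Q//4, so rounding recovers the bit.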
One of the primary challenges in deploying PQC public-key encryption and KEMs is their increased computational overhead and resource demands compared to classical counterparts. These schemes often require larger public key sizes—typically several kilobytes—to achieve equivalent security levels, leading to higher bandwidth usage and storage needs in protocols like TLS handshakes. Additionally, encryption and decapsulation operations are slower due to the mathematical complexity of underlying problems, such as lattice reductions, which demand more CPU cycles on standard hardware. Despite these trade-offs, ongoing optimizations aim to mitigate performance impacts while preserving IND-CCA security.[1]

A representative example of a PQC KEM is the lattice-based approach, where the recipient's public key consists of structured vectors in a high-dimensional lattice space and the sender encapsulates the shared key by adding noise to a multiple of that public key, allowing decapsulation through noise removal via the private key. This general structure, as instantiated in schemes like Kyber, relies on the hardness of learning with errors (LWE) problems, providing efficient key exchange with provable security reductions.

Digital Signatures
Digital signatures serve as a core cryptographic primitive in post-quantum cryptography (PQC), enabling the verification of message authenticity, integrity, and non-repudiation while resisting forgery by quantum adversaries. These schemes aim to provide existential unforgeability under chosen-message attacks (EUF-CMA) security in models that account for quantum computation, such as the quantum random oracle model (QROM). Unlike classical digital signatures based on integer factorization or discrete logarithms, PQC signatures must withstand attacks from algorithms like Shor's, which efficiently solve these problems on large-scale quantum computers, thereby rendering traditional schemes such as RSA and ECDSA insecure.[1][10]

PQC digital signature designs generally adapt classical paradigms, such as the Fiat-Shamir transform, which converts secure identification protocols into signatures via hash functions assumed secure against quantum attacks. A prominent distinction arises in hash-based constructions between stateful and stateless mechanisms. Stateful schemes require the signer to maintain and monotonically update an internal counter or state after each signature generation to prevent key reuse, which could otherwise enable forgery if nonces or subkeys are repeated. In contrast, stateless schemes generate each signature independently using randomized components, avoiding the need for persistent state.

These design choices involve inherent trade-offs in performance and practicality. Stateful approaches typically yield smaller public keys, signatures, and faster generation and verification times—often by factors of 2 to 10 compared to stateless alternatives—due to their efficient use of one-time signature primitives extended via structures like Merkle trees. However, they demand robust state management to ensure sequential signing and prevent errors in distributed or multi-threaded environments, potentially complicating deployment.
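The one-time signature primitive that such Merkle-tree structures extend can be illustrated with Lamport's hash-based construction, shown here for toy 8-bit messages (real schemes sign 256-bit digests and use Winternitz chaining to compress keys):

```python
import hashlib, secrets

H = lambda x: hashlib.sha256(x).digest()
BITS = 8  # toy size; real schemes sign a 256-bit message digest

def keygen():
    """Secret key: two random preimages per bit. Public key: their hashes."""
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(BITS)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def sign(sk, msg_byte):
    """Reveal one preimage per message bit. ONE-TIME: signing a second
    message with the same key leaks preimages for both bit values."""
    return [sk[i][(msg_byte >> i) & 1] for i in range(BITS)]

def verify(pk, msg_byte, sig):
    return all(H(sig[i]) == pk[i][(msg_byte >> i) & 1] for i in range(BITS))

sk, pk = keygen()
sig = sign(sk, 0b10110001)
assert verify(pk, 0b10110001, sig)
assert not verify(pk, 0b10110000, sig)  # flipped bit: wrong preimage revealed
```

A Merkle tree over many such public keys yields a many-time scheme, which is exactly where the stateful/stateless distinction above arises: the signer must track which leaf keys have been used.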
Stateless designs mitigate these management risks, making them more robust for concurrent use cases, but at the expense of larger signatures (frequently 10–50 kB) and higher computational overhead from additional hashing and randomization steps.[31]

PQC-specific challenges for digital signatures include ensuring resistance to quantum-enhanced forgery attacks, where adversaries may exploit superposition to query the signing oracle in parallel, necessitating security reductions in the QROM rather than the classical random oracle model. Moreover, the underlying quantum-resistant hard problems often result in substantially larger cryptographic objects than classical equivalents, with signature sizes scaling to several kilobytes to achieve comparable security levels (e.g., 128 bits post-quantum), which poses bandwidth and storage constraints in bandwidth-limited applications like IoT or blockchain. These issues underscore the need for optimized implementations balancing security, efficiency, and usability.[1]

Among PQC signature families, hash-based schemes construct security from the second-preimage resistance of quantum-secure hash functions, employing one-time signatures (e.g., via Winternitz or Lamport primitives) aggregated through tree-based methods to support multiple signatures from a single key pair without revealing the private key. Lattice-based signatures, meanwhile, derive security from problems like the short integer solution or learning with errors over lattices, typically using zero-knowledge proofs with rejection mechanisms to ensure signatures are statistically close to uniform distributions, thereby hiding information about the secret key. These approaches provide diverse options, with hash-based schemes emphasizing simplicity and provable security under minimal assumptions, while lattice-based schemes offer the potential for smaller sizes through advanced algebraic structures.[1]

Symmetric-Key Adaptations
Grover's algorithm poses a threat to symmetric cryptography by providing a quadratic speedup for brute-force key searches, effectively halving the security level of a key of length n bits to approximately n/2 bits in terms of quantum computational effort.[14] To maintain equivalent post-quantum security, symmetric key lengths must be doubled; for instance, AES-128, which offers 128 bits of classical security, requires upgrading to AES-256 to achieve 128 bits of post-quantum security against Grover's attack.[14] NIST has confirmed that existing AES variants with 128-, 192-, or 256-bit keys remain suitable for post-quantum use when appropriately sized, without needing entirely new primitives.[6]

Adaptations for hash functions follow a similar principle, as Grover's algorithm also accelerates preimage attacks, necessitating doubled output sizes to preserve security margins. For example, SHA-256 should be replaced by SHA-512 to retain 128-bit post-quantum preimage resistance.[14] Sponge-based constructions like SHA-3 exhibit inherent resistance to such attacks due to their design, which absorbs input into a state and squeezes out fixed-length outputs, but larger variants (e.g., SHA-3-512) are recommended for heightened security levels.[14] These adjustments ensure that hash-based applications, such as key derivation, maintain integrity against quantum adversaries without overhauling foundational algorithms.
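The doubling rule is simple enough to state in code. The sketch below (standard-library Python, illustrative parameters only) checks the halved security levels and computes a doubled-length SHA-512 digest alongside an HMAC-SHA-512 tag of the kind used for quantum-resistant message authentication; the all-zero key is a placeholder, not a recommendation.

```python
import hashlib
import hmac

def post_quantum_security_bits(classical_bits: int) -> int:
    # Grover's quadratic speedup halves the effective key-search exponent.
    return classical_bits // 2

assert post_quantum_security_bits(128) == 64   # AES-128 falls short of 128-bit PQ security
assert post_quantum_security_bits(256) == 128  # AES-256 retains 128-bit PQ security

# Doubled-length hash for 128-bit post-quantum preimage resistance,
# plus an HMAC-SHA-512 tag as a quantum-resistant MAC.
digest = hashlib.sha512(b"message").hexdigest()
tag = hmac.new(key=b"\x00" * 64, msg=b"message", digestmod=hashlib.sha512).hexdigest()
assert len(bytes.fromhex(digest)) == 64  # 512-bit output
assert len(bytes.fromhex(tag)) == 64
```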
In post-quantum protocols, symmetric cryptography serves as an efficient core for bulk data protection, often integrated into hybrid schemes where quantum-resistant key establishment feeds into symmetric encryption for confidentiality.[6] NIST's standardized key encapsulation mechanisms, for example, pair with AES-GCM for authenticated encryption, leveraging symmetric primitives' performance while relying on post-quantum methods for initial key exchange.[6] For message authentication, quantum-resistant MACs like HMAC, when constructed with doubled-length hashes (e.g., HMAC-SHA-512), provide robust integrity protection, as their security reduces to the underlying hash function's post-quantum strength.[32] This approach balances efficiency and security in resource-constrained environments.
Algorithm Families
Lattice-Based Cryptography
Lattice-based cryptography is based on the hardness of mathematical problems involving lattices, which can be visualized as infinite grids of points in high-dimensional space. Key problems include the Shortest Vector Problem (SVP), which involves finding the shortest non-zero vector in a lattice, and the Learning With Errors (LWE) problem, which deals with recovering a secret from noisy linear equations over finite fields. These problems are believed to be resistant to attacks by both classical and quantum computers because no efficient algorithms are known to solve them in high dimensions, even with quantum speedups.[33] Formally, lattices are discrete subgroups of Euclidean space generated by integer linear combinations of basis vectors, and the problems defined over them provide a foundation for quantum-resistant cryptographic primitives because they resist attacks by quantum algorithms like Shor's. The field originated in the 1990s with Miklós Ajtai's seminal work demonstrating worst-case to average-case hardness reductions for lattice problems, enabling the construction of secure hash functions and one-way functions based on the difficulty of approximating the shortest vector problem (SVP) in lattices. This breakthrough laid the groundwork for public-key cryptography, with early efficiency improvements realized in the NTRU cryptosystem, which uses polynomial rings to achieve compact keys and fast operations.

The core hard problem in lattice-based cryptography is the Learning With Errors (LWE) problem, introduced by Oded Regev in 2005, which posits that it is difficult to recover a secret vector s from samples of the form (a, ⟨a, s⟩ + e mod q), where the vectors a are drawn uniformly at random from Z_q^n, the errors e are small values drawn from a discrete Gaussian distribution, and q is a modulus.[34] In matrix form, the search-LWE problem involves finding s given (A, b = As + e mod q), where A is a uniformly random m × n matrix over Z_q, e has small entries, and the number of samples m is polynomial in n.
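A toy instantiation makes the sample form concrete. The parameters below are far too small to be secure, and Python's random module stands in for proper discrete Gaussian sampling; each pair (A[i], b[i]) is one LWE sample.

```python
import random

n, q, m = 8, 97, 16  # toy dimension, modulus, and sample count (insecure)

s = [random.randrange(q) for _ in range(n)]            # secret vector in Z_q^n
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]  # uniform rows a
e = [random.choice([-1, 0, 1]) for _ in range(m)]      # small errors (toy noise)

# b_i = <a_i, s> + e_i mod q: the noisy inner products an attacker sees.
b = [(sum(ai * si for ai, si in zip(row, s)) + ei) % q for row, ei in zip(A, e)]

# Recovering s from (A, b) is the search-LWE problem, believed hard
# at cryptographic dimensions even for quantum computers.
assert all(0 <= bi < q for bi in b)
```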
The decision variant, which asks an adversary to distinguish LWE samples from uniformly random ones, underpins most constructions and benefits from quantum worst-case hardness reductions to lattice approximation problems like SVP and the shortest independent vectors problem (SIVP).[34] To enhance efficiency, variants like Ring-LWE and Module-LWE restrict the domain to structured rings. Ring-LWE, proposed by Lyubashevsky, Peikert, and Regev in 2010, operates over a ring of polynomials such as Z_q[x]/(x^n + 1), replacing matrix-vector products with polynomial multiplication to shrink key sizes and cut computation from quadratic to quasi-linear time via fast Fourier transforms, while preserving average-case hardness tied to ideal lattice problems. Module-LWE, defined by Langlois and Stehlé in 2012, generalizes this to modules over polynomial rings, offering a balance between the flexibility of plain LWE and the efficiency of Ring-LWE by using module rank greater than 1, which supports parallelizable operations and has been shown hard under worst-case module lattice assumptions.[35]

Lattice-based constructions leverage these problems for versatile primitives. Public-key encryption schemes, such as Regev's original LWE-based scheme, use the secret vector s as the private key, with the public key comprising A and b = As + e; encryption encodes a message bit m by adding a random combination of the public samples to the encoding m⌊q/2⌋, and decryption recovers the message by rounding after subtracting the inner product of the ciphertext with s, succeeding if the accumulated noise remains below q/4.[34] Digital signatures are derived via the Fiat-Shamir with aborts paradigm, introduced by Lyubashevsky in 2009, which transforms zero-knowledge identification protocols over LWE into non-interactive signatures by hashing commitments and challenges, with aborts to mask the secret and ensure uniform rejection sampling. These methods extend to key encapsulation mechanisms (KEMs) for secure key exchange.
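A Regev-style round trip for a single bit can be sketched end to end. The toy parameters below are chosen so the worst-case noise bound (m < q/4) guarantees correct decryption; this sketch omits all real-world hardening, such as proper noise distributions and chosen-ciphertext protections.

```python
import random

n, q, m = 32, 1 << 13, 64  # toy parameters; noise bound m = 64 < q/4 = 2048

def keygen():
    s = [random.randrange(q) for _ in range(n)]                 # private key
    A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
    e = [random.choice([-1, 0, 1]) for _ in range(m)]           # small errors
    b = [(sum(x * y for x, y in zip(row, s)) + ei) % q for row, ei in zip(A, e)]
    return s, (A, b)                                            # pk = (A, b = As + e)

def encrypt(pk, bit):
    A, b = pk
    r = [random.randrange(2) for _ in range(m)]                 # random 0/1 combination
    c1 = [sum(A[i][j] * r[i] for i in range(m)) % q for j in range(n)]
    c2 = (sum(b[i] * r[i] for i in range(m)) + bit * (q // 2)) % q
    return c1, c2

def decrypt(s, ct):
    c1, c2 = ct
    # c2 - <c1, s> = sum(r_i * e_i) + bit * q/2 (mod q); round to 0 or q/2.
    v = (c2 - sum(x * y for x, y in zip(c1, s))) % q
    return 1 if q // 4 < v < 3 * q // 4 else 0

s, pk = keygen()
for bit in (0, 1):
    assert decrypt(s, encrypt(pk, bit)) == bit
```

Because each r_i is 0 or 1 and each error is at most 1 in absolute value, the accumulated noise never exceeds m = 64, comfortably below q/4, so decryption here is always correct.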
The advantages of lattice-based approaches include their broad applicability across encryption, signatures, and KEMs, as well as robust worst-case to average-case reductions that guarantee security even against the hardest lattice instances, unlike average-case assumptions in other paradigms.[34] For instance, the NIST-selected Kyber KEM is based on Module-LWE, providing efficient post-quantum key exchange.[36]
Code-Based Cryptography
Code-based cryptography encompasses public-key schemes whose security is predicated on the computational difficulty of decoding linear error-correcting codes, particularly in the presence of errors. The foundational hard problem is the syndrome decoding problem (SDP), which involves finding a low-weight error vector that produces a given syndrome for a linear code; this problem was proven NP-complete for binary linear codes in 1978. The McEliece problem generalizes this by requiring the recovery of a plaintext from a ciphertext generated using a hidden linear code disguised as a general one, assuming the underlying decoding hardness holds.[37] These assumptions provide resistance against quantum attacks, including Shor's algorithm, as no efficient quantum algorithm is known to solve SDP for random codes.[38] The seminal construction is the McEliece cryptosystem, introduced in 1978, which uses binary Goppa codes to enable public-key encryption: the public key consists of a scrambled generator matrix of the Goppa code, and encryption encodes the message as a codeword of this matrix and adds a low-weight error vector.[37] For digital signatures, code-based schemes often employ quasi-cyclic (QC) codes to achieve compact keys and efficiency; these leverage the algebraic structure of QC codes for syndrome-based verification, as in the Fiat-Shamir with aborts paradigm adapted to decoding challenges.[39] In March 2025, NIST selected HQC, a code-based KEM using quasi-cyclic moderate-density parity-check codes, for standardization as a backup to the primary lattice-based ML-KEM.[11] Classic McEliece, a modern instantiation using Goppa codes, remains a secure candidate in the NIST process for key encapsulation mechanisms, though its standardization has been delayed due to large key sizes.[1] Key advantages of code-based schemes include their long-standing resistance to cryptanalysis (over 45 years without structural breaks for properly chosen codes) and their reliance on well-studied problems in coding theory.[38]

However, a primary challenge is the large public key sizes, ranging from roughly 260 kB at the AES-128-equivalent level to around 1 MB at higher security levels, due to the need for dense generator or parity-check matrices to hide code structure.[38] Variants address these issues: the Niederreiter formulation reframes encryption as syndrome computation with an error vector as the message, using a parity-check matrix for potentially smaller keys in certain settings.[40] Additionally, random linear code encryption (RLCE) schemes use truly random codes without special structure, enhancing security proofs against generic attacks while maintaining efficiency through systematic forms.[41]
Multivariate Cryptography
Multivariate cryptography relies on the hardness of solving systems of multivariate quadratic equations over finite fields, primarily the Multivariate Quadratic (MQ) problem, which involves finding a solution to a set of quadratic polynomials in several variables.[42] The MQ problem is NP-complete, providing a foundation for cryptographic primitives resistant to both classical and quantum attacks, as no efficient quantum algorithms are known to solve it.[43] A complementary hard problem is the Isomorphism of Polynomials (IP), which asks whether two sets of multivariate polynomials are equivalent under an invertible affine transformation of variables; this serves as a basis for trapdoor constructions in multivariate schemes.[44] General constructions in multivariate cryptography use trapdoors based on these problems to enable efficient private operations while keeping public evaluation hard. For digital signatures, the Unbalanced Oil and Vinegar (UOV) scheme introduces a trapdoor by partitioning variables into "oil" and "vinegar" sets, where the private key allows solving a structured system that appears random publicly; this was proposed to resist certain algebraic attacks.[45] Encryption variants employ similar layered trapdoors, akin to those in Rainbow-like structures, where the public key is a composition of affine transformations around a central quadratic map, enabling decryption via inversion of the trapdoor.[46] These schemes offer advantages such as fast signature generation and verification, making them suitable for resource-constrained environments in post-quantum settings.[47] However, drawbacks include large public key sizes due to the need to specify numerous polynomial coefficients, and vulnerability to recent attacks; for instance, the Rainbow signature scheme, a prominent multivariate candidate, was broken in 2022 by a key-recovery attack running in under 53 hours on a single-core laptop.[48] Multivariate cryptography originated in the late 1980s with the Matsumoto-Imai scheme, an early public-key encryption scheme based on quadratic polynomials over finite fields. Following attacks on encryption variants in the 1990s, research shifted focus to signature schemes like UOV, which have proven more resilient.[49]
Hash-Based Cryptography
Hash-based cryptography relies on the security of cryptographic hash functions, which are one-way functions designed to be computationally infeasible to invert or to find collisions for under classical and quantum attacks. In the context of post-quantum cryptography, these schemes primarily focus on digital signatures, building on one-time signatures (OTS) and on the second-preimage resistance of hash functions. An OTS allows signing a single message securely but becomes insecure if reused, so hash-based methods extend this capability using tree structures to enable multiple signatures from a single key pair while maintaining security. Second-preimage resistance ensures that, given a hash output, it is hard to find a different input producing the same output, which is crucial for preventing forgery in these constructions.[50][51] The foundational general construction is the Merkle signature scheme, introduced in 1987, which combines OTS with Merkle trees (binary trees whose leaf nodes represent OTS public keys and whose root serves as the overall public key) to allow multiple message signatures without revealing the entire secret key set. In this scheme, signing a message involves generating an OTS for it and providing the authentication path (sibling hashes) from the corresponding leaf to the root, enabling verification against the public root.
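The authentication-path mechanism can be sketched in a few lines. This toy example (SHA-256, eight leaves standing in for hashes of one-time-signature public keys) builds the tree, extracts a path, and verifies a leaf against the root.

```python
import hashlib

H = lambda *parts: hashlib.sha256(b"".join(parts)).digest()

# Leaves stand in for hashes of one-time-signature public keys.
leaves = [H(bytes([i])) for i in range(8)]

def build_tree(nodes):
    # levels[0] is the leaf layer; each level hashes adjacent pairs.
    levels = [nodes]
    while len(levels[-1]) > 1:
        lvl = levels[-1]
        levels.append([H(lvl[i], lvl[i + 1]) for i in range(0, len(lvl), 2)])
    return levels  # levels[-1][0] is the public root

def auth_path(levels, index):
    # Sibling hashes from the leaf up to (but not including) the root.
    path = []
    for lvl in levels[:-1]:
        path.append(lvl[index ^ 1])
        index //= 2
    return path

def verify(leaf, index, path, root):
    node = leaf
    for sibling in path:
        # Even index: current node is the left child; odd: the right child.
        node = H(node, sibling) if index % 2 == 0 else H(sibling, node)
        index //= 2
    return node == root

levels = build_tree(leaves)
root = levels[-1][0]
path = auth_path(levels, 5)
assert verify(leaves[5], 5, path, root)
```

A signature in the Merkle scheme bundles the one-time signature itself with such a path; the verifier needs only the root, which is why a single short public key can authenticate many one-time keys.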
Modern variants distinguish between stateful and stateless schemes: stateful ones, like the eXtended Merkle Signature Scheme (XMSS), track used keys to avoid reuse and support a fixed number of signatures (e.g., up to 2^60), while stateless schemes, such as SPHINCS, generate signatures randomly from the full key space without state management, trading efficiency for simplicity in deployment.[52][53][54] These schemes offer provable security reductions to the underlying hash function's properties, such as collision resistance and second preimage resistance, without depending on number-theoretic assumptions vulnerable to quantum algorithms like Shor's. They are quantum-resistant provided the hash function remains secure against quantum attacks; for instance, using SHA-3 with doubled output size mitigates Grover's algorithm, which provides a quadratic speedup for searching preimages. However, hash-based signatures typically produce large outputs—often tens of kilobytes due to authentication paths—limiting their use in bandwidth-constrained environments, and they do not naturally extend to encryption or key encapsulation mechanisms.[50][51][53]
Isogeny-Based Cryptography
Isogeny-based cryptography relies on the mathematical structure of isogenies, which are rational maps preserving the group law between elliptic curves defined over finite fields, to construct cryptographic primitives resistant to quantum attacks. Unlike classical elliptic curve cryptography, which is vulnerable to Shor's algorithm for solving the elliptic curve discrete logarithm problem, isogeny-based schemes derive their security from the presumed hardness of computing isogenies between supersingular elliptic curves. These curves have complex endomorphism rings that enable non-commutative group actions, providing a foundation for post-quantum security.[55] The core hard problems in this family include the Supersingular Isogeny Diffie-Hellman (SIDH) problem and the more general computational Diffie-Hellman problem on isogeny graphs. In SIDH, given a starting supersingular elliptic curve and two public isogenies from it, the challenge is to compute a secret isogeny connecting the codomains of those public isogenies. Isogeny graphs model the connections between elliptic curves via isogenies of fixed degrees, forming expander graphs where finding short paths (isogeny walks) is computationally difficult, underpinning the security of related protocols.[55][56] Key exchange protocols like SIDH and its encapsulation variant SIKE enable secure shared key generation by having parties compute complementary isogenies along secret paths in the supersingular isogeny graph, revealing only the resulting curves publicly. For digital signatures, constructions such as SQISign use isogeny walks to commit to messages via structured paths in these graphs, allowing verification through reconstruction without revealing the private path. 
These schemes were developed primarily in the 2010s, with SIDH introduced in 2011 as a quantum-resistant alternative to traditional Diffie-Hellman.[55][57][58] Isogeny-based systems offer advantages in compactness, with public keys and ciphertexts typically under 1 kB for security levels comparable to AES-128, making them suitable for resource-constrained environments. However, in July 2022, a classical polynomial-time attack by Castryck and Decru broke SIKE by exploiting torsion points and glue-and-split techniques on polarized products of supersingular curves, invalidating its security claims and leading to its withdrawal from NIST standardization. This breakthrough has shifted research toward hardened variants, such as those incorporating commutative isogeny actions or alternative graph structures, to restore quantum resistance while preserving small key sizes.[59]
Standardization Efforts
NIST Process and Timeline
The National Institute of Standards and Technology (NIST) launched its post-quantum cryptography (PQC) standardization process in December 2016 through a formal call for proposals aimed at identifying public-key algorithms secure against both classical and quantum computing threats.[60] This initiative sought submissions for key encapsulation mechanisms (KEMs) and digital signature schemes, with evaluations spanning multiple rounds to assess candidates rigorously.[1] The process emphasizes algorithms that maintain compatibility with existing cryptographic infrastructures while addressing potential vulnerabilities from quantum algorithms like Shor's. This collaborative effort, involving industry, academia, and the broader cryptographic community, facilitates the sharing of threats and fixes, thereby accelerating the development and adoption of robust post-quantum solutions.[61] Evaluation criteria include security levels categorized by resistance to quantum attacks (e.g., NIST security levels 1–5, corresponding to at least 128 bits of security against quantum adversaries), computational efficiency (runtime and resource demands across hardware platforms), and implementation characteristics such as algorithmic simplicity, randomness requirements, and resilience to side-channel attacks.[60] NIST formed an internal panel and solicited expert feedback to prioritize candidates balancing these factors, with a focus on practical deployment in constrained environments like embedded systems.[62] Submissions were required to provide detailed security analyses, performance benchmarks, and open-source implementations for testing.[63] The timeline began with the April 2016 release of NISTIR 8105, a report outlining PQC needs, followed by the December 2016 call. By November 2017, 82 proposals were received, with 69 advancing to Round 1 evaluation through 2018. 
In January 2019, 26 candidates (17 KEMs and 9 signatures) proceeded to Round 2, which concluded with a July 2020 report advancing 15 algorithms (7 finalists and 8 alternates) to Round 3. Round 3, running from 2020 to 2022, narrowed finalists through extensive cryptanalysis, culminating in a July 2022 report selecting initial candidates for standardization. Round 4, initiated in 2022 for additional KEM diversity, evaluated remaining candidates like HQC, BIKE, and Classic McEliece amid ongoing security reviews.[65] In the separate call for additional digital signatures, NIST advanced 14 candidates to Round 2 in July 2025, with evaluation continuing and a conference held in September 2025.[64]

On August 13, 2024, NIST published the first three Federal Information Processing Standards (FIPS): FIPS 203 for ML-KEM (key encapsulation), FIPS 204 for ML-DSA (signatures), and FIPS 205 for SLH-DSA (signatures).[10] In March 2025, HQC was selected as an additional KEM for standardization, as detailed in NISTIR 8545, to provide a code-based alternative enhancing portfolio robustness. Standardization of further candidates, including signature alternates, continues into late 2025.[1] To support adoption, NIST issued SP 1800-38 in September 2025, a multi-volume guide from the National Cybersecurity Center of Excellence outlining preparation steps for PQC migration, including inventory tools and integration frameworks. This complements a September 2025 draft white paper (CSWP 48) mapping PQC migration activities to the NIST Cybersecurity Framework (CSF) and SP 800-53 security controls, aiding federal and organizational risk management. These resources underscore the urgency of migrating to PQC algorithms before "Q-Day," defined as the advent of cryptographically relevant quantum computers capable of breaking current public-key cryptography.
This migration is integral to post-quantum readiness efforts, which encompass global initiatives to inventory cryptographic assets, develop migration roadmaps, and implement quantum-safe algorithms to ensure timely upgrades across systems.[66] Global efforts involving organizations like ETSI and ISO/IEC facilitate these widespread system updates.[67] Internationally, efforts align with NIST through ETSI's March 2025 standard for quantum-safe hybrid key exchanges and ISO/IEC initiatives to incorporate PQC into global standards such as ISO/IEC 19790 and 24759, ensuring interoperability.[68][69]
Selected Algorithms
In August 2024, the National Institute of Standards and Technology (NIST) finalized its initial post-quantum cryptography standards through Federal Information Processing Standards (FIPS) 203, 204, and 205, selecting algorithms designed to provide security equivalent to AES-128, AES-192, and AES-256 levels against quantum attacks.[10] These include ML-KEM for key encapsulation, and ML-DSA and SLH-DSA for digital signatures, with parameter sets tailored to specific security levels. In March 2025, NIST selected HQC as an additional key encapsulation mechanism to serve as a backup option, with a draft standard anticipated in 2026 and finalization by 2027.[11] ML-KEM, standardized in FIPS 203, is a module-lattice-based key encapsulation mechanism derived from the CRYSTALS-Kyber algorithm, intended primarily for secure key exchange in protocols like TLS.[70] It supports three parameter sets corresponding to NIST security levels 1, 3, and 5 (equivalent to AES-128, AES-192, and AES-256 classical security):
| Parameter Set | Security Level | Public Key Size (bytes) | Private Key Size (bytes) | Ciphertext Size (bytes) |
|---|---|---|---|---|
| ML-KEM-512 | 1 | 800 | 1,632 | 768 |
| ML-KEM-768 | 3 | 1,184 | 2,400 | 1,088 |
| ML-KEM-1024 | 5 | 1,568 | 3,168 | 1,568 |
ML-DSA, standardized in FIPS 204, is a module-lattice-based digital signature scheme derived from the CRYSTALS-Dilithium algorithm; its three parameter sets are:
| Parameter Set | Security Level | Public Key Size (bytes) | Private Key Size (bytes) | Signature Size (bytes) |
|---|---|---|---|---|
| ML-DSA-44 | 2 | 1,312 | 2,560 | 2,420 |
| ML-DSA-65 | 3 | 1,952 | 4,032 | 3,309 |
| ML-DSA-87 | 5 | 2,592 | 4,896 | 4,627 |
HQC, the code-based KEM selected in March 2025 as a backup to ML-KEM, uses the following parameter sets:
| Parameter Set | Security Level | Public Key Size (bytes) | Private Key Size (bytes) | Ciphertext Size (bytes) |
|---|---|---|---|---|
| HQC-128 | 1 | 2,249 | 2,305 | 4,433 |
| HQC-192 | 3 | 4,522 | 4,586 | 8,978 |
| HQC-256 | 5 | 7,245 | 7,317 | 14,421 |
Security Analysis
Reductions to Hard Problems
In post-quantum cryptography, the security of proposed schemes is frequently established through provable reductions that connect the scheme's security properties, such as indistinguishability under chosen-ciphertext attack (IND-CCA) for encryption or existential unforgeability for signatures, to the computational hardness of specific mathematical problems. These reductions show that an efficient adversary breaking the scheme could be transformed into an efficient algorithm solving the underlying hard problem, often with only a polynomial loss in security parameters. For instance, many lattice-based encryption schemes reduce IND-CCA security to the hardness of the Learning With Errors (LWE) problem.[74] A prominent example in lattice-based cryptography is the reduction from worst-case lattice problems to average-case instances of LWE, originally established by Regev using a quantum algorithm that preserves hardness across these settings. This reduction underpins the security of schemes based on Ring-LWE, a variant over polynomial rings, including NTRU encryption and BLISS signatures; specifically, the decision NTRU problem has been shown to be at least as hard as search Ring-LWE under certain parameter distributions. These connections rely on Regev's quantum reduction lemma, which ensures that quantum algorithms solving average-case problems imply solutions to worst-case lattice approximation problems like GapSVP. For the Dilithium signature scheme (ML-DSA), security reduces to the hardness of Module-LWE and Module-SIS, providing existential unforgeability under chosen-message attacks (EUF-CMA) security.[34][75][76] In code-based cryptography, the McEliece cryptosystem's security reduces to the syndrome decoding problem for binary linear codes, which involves recovering an error vector from a given syndrome and is proven NP-complete, ensuring that no efficient worst-case algorithm exists unless P = NP.
Multivariate schemes like Unbalanced Oil and Vinegar (UOV) reduce unforgeability to the multivariate quadratic (MQ) problem, where finding a solution to a random system of quadratic equations over finite fields is assumed intractable; this reduction holds under the MinRank assumption for the trapdoor structure. Hash-based signatures, such as the Merkle signature scheme, reduce to the collision resistance of the underlying hash function combined with the security of one-time signatures like Winternitz one-time signatures (WOTS), where forging a signature would imply finding hash collisions or reusing one-time keys. Isogeny-based schemes like Supersingular Isogeny Diffie-Hellman (SIDH) base their security on the computational difficulty of finding isogeny walks between supersingular elliptic curves, specifically the Supersingular Computational Diffie-Hellman (SSCDH) assumption, which posits that an adversary cannot compute shared secrets without knowing the private isogeny paths in the supersingular isogeny graph. Key types of reductions in post-quantum cryptography include worst-case to average-case transformations, particularly in lattices, where Regev's quantum reduction links hard worst-case problems (e.g., approximating shortest vectors) to solvable average-case LWE instances via a quantum sampling procedure. Another type is quantum non-malleability in reductions, ensuring that quantum adversaries cannot malleate queries to preserve classical proof structures in the quantum random oracle model (QROM).[34][77] Despite these advances, some post-quantum schemes face limitations in their reductions; for example, hash-based signatures often rely on the random oracle model (ROM), which idealizes hash functions as truly random but introduces gaps when instantiated with concrete functions like SHA-3, potentially weakening provable security against non-ideal attacks. 
Tightness issues also arise in quantum reductions, where security loss factors can be super-polynomial without careful parameter tuning.[78]
Efficiency and Security Comparisons
Post-quantum cryptography (PQC) algorithms are evaluated based on their efficiency metrics, including public key sizes, ciphertext or signature sizes, and computational costs measured in clock cycles or milliseconds on standard hardware, alongside security levels defined by NIST as equivalents to AES-128 (level 1), AES-192 (level 3), and AES-256 (level 5) against both classical and quantum adversaries.[1] These metrics reveal trade-offs: lattice-based schemes like ML-KEM and ML-DSA offer balanced sizes and speeds suitable for general use, while code-based systems like Classic McEliece provide strong security but with significantly larger keys, and hash-based signatures like SLH-DSA excel in verification speed at the cost of larger signatures. Benchmarks from the Open Quantum Safe (OQS) project, using the liboqs library, provide standardized performance data across platforms, showing lattice algorithms typically require 10^5 to 10^6 cycles for key operations on x86-64 CPUs, compared to 10^7 or more for code-based key generation.[79] Recent 2025 hardware accelerations, such as GPU implementations for lattice operations, have reduced encapsulation times by up to 10x for ML-KEM on NVIDIA devices.[80] The following table summarizes representative sizes for selected PQC algorithms at NIST security level 1 (AES-128 equivalent), highlighting family differences; higher levels scale sizes proportionally larger for increased security. Lattice-based public keys remain under 2 KB, enabling compatibility with existing protocols, whereas code-based keys exceed 250 KB, posing challenges for bandwidth-constrained environments like IoT.[70][71][72]
| Algorithm Family | Algorithm | Public Key Size (bytes) | Secret Key Size (bytes) | Ciphertext/Signature Size (bytes) | Citation |
|---|---|---|---|---|---|
| Lattice-based (KEM) | ML-KEM-512 | 800 | 1,632 | 768 (ciphertext) | [70] |
| Lattice-based (Signature) | ML-DSA-44 | 1,312 | 2,560 | 2,420 (signature) | [71] |
| Hash-based (Signature) | SLH-DSA-SHA2-128s | 32 | 64 | 7,856 (signature) | [72] |
| Code-based (KEM) | Classic McEliece mceliece348864 | 261,120 | 6,492 | 128 (ciphertext) | |
| Multivariate (Signature, historical) | Rainbow (pre-breakage) | ~50,000 | ~50,000 | ~2,000 (signature) | |
| Isogeny-based (KEM, historical) | SIKE (pre-breakage) | 564 | 48 | 564 (ciphertext) | |
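The trade-offs summarized in the table can be quantified directly; the snippet below simply restates the level-1 public-key sizes from the table above and computes the lattice-versus-code-based gap.

```python
# Public-key sizes in bytes at NIST security level 1, from the table above.
pk_sizes = {
    "ML-KEM-512": 800,
    "ML-DSA-44": 1_312,
    "SLH-DSA-SHA2-128s": 32,
    "Classic McEliece (mceliece348864)": 261_120,
}

# Lattice-based keys stay under 2 KB; code-based keys exceed 250 KB.
assert pk_sizes["ML-KEM-512"] < 2_048
assert pk_sizes["Classic McEliece (mceliece348864)"] > 250_000

# The code-based public key is more than 300x larger than the lattice KEM's.
ratio = pk_sizes["Classic McEliece (mceliece348864)"] / pk_sizes["ML-KEM-512"]
assert ratio > 300
```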
