Key (cryptography)
A key in cryptography is a piece of information, usually a string of numbers or letters stored in a file, which, when processed through a cryptographic algorithm, can encode or decode cryptographic data. Depending on the method used, keys come in different sizes and varieties, but in all cases the strength of the encryption relies on keeping the key secure. A key's security strength depends on its algorithm, its size, how it was generated, and the key exchange process.
Scope
The key is what is used to encrypt data from plaintext to ciphertext.[1] There are different methods for using keys in encryption.
Symmetric cryptography
Symmetric cryptography refers to the practice of the same key being used for both encryption and decryption.[2]
Asymmetric cryptography
Asymmetric cryptography has separate keys for encrypting and decrypting.[3][4] These keys are known as the public and private keys, respectively.[5]
Purpose
Since the key protects the confidentiality and integrity of the system, it must be kept secret from unauthorized parties. With public key cryptography, only the private key must be kept secret; with symmetric cryptography, the single shared key must remain confidential. Kerckhoffs's principle states that the entire security of the cryptographic system should rely on the secrecy of the key alone.[6]
Key sizes
Key size is the number of bits in the key defined by the algorithm. This size defines the upper bound of the cryptographic algorithm's security.[7] The larger the key size, the longer it will take before the key can be compromised by a brute-force attack. Since perfect secrecy is not feasible for practical key algorithms, research now focuses on computational security.
In the past, keys were required to be a minimum of 40 bits in length; however, as technology advanced, these keys were broken ever more quickly. In response, minimum symmetric key sizes were raised.
Currently, 2048-bit RSA[8] is commonly used and is sufficient for current systems. However, current RSA key sizes could all be cracked quickly by a sufficiently powerful quantum computer.[9]
"The keys used in public key cryptography have some mathematical structure. For example, public keys used in the RSA system are the product of two prime numbers. Thus public key systems require longer key lengths than symmetric systems for an equivalent level of security. 3072 bits is the suggested key length for systems based on factoring and integer discrete logarithms which aim to have security equivalent to a 128 bit symmetric cipher."[10]
Key generation
To prevent a key from being guessed, keys need to be generated randomly and contain sufficient entropy. The problem of how to safely generate random keys is difficult and has been addressed in many ways by various cryptographic systems. A key can be generated directly by using the output of a Random Bit Generator (RBG), a system that generates a sequence of unpredictable and unbiased bits.[11] An RBG can be used to directly produce either a symmetric key or the random output for an asymmetric key pair generation. Alternatively, a key can also be created indirectly during a key-agreement transaction, from another key, or from a password.[12]
Some operating systems include tools for "collecting" entropy from the timing of unpredictable operations such as disk drive head movements. For the production of small amounts of keying material, ordinary dice provide a good source of high-quality randomness.
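As a concrete illustration (not part of the original article), Python's secrets module exposes an operating-system CSPRNG of the kind described above and can produce a symmetric key directly:

```python
import secrets

# Draw a 256-bit (32-byte) symmetric key from the OS's
# cryptographically secure random source.
key = secrets.token_bytes(32)

# Keys are commonly serialized as hexadecimal for storage or transport.
key_hex = key.hex()

# Two independently drawn keys are overwhelmingly unlikely to collide.
assert key != secrets.token_bytes(32)
```

Unlike the random module, secrets is explicitly intended for security-sensitive randomness.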
Establishment scheme
The security of a key depends on how it is exchanged between parties. Establishing a secure communication channel is necessary so that outsiders cannot obtain the key. A key establishment scheme (or key exchange) is used to transfer an encryption key among entities. Key agreement and key transport are the two types of key exchange scheme used to exchange keys remotely between entities. In a key agreement scheme, a secret key, which is used between the sender and the receiver to encrypt and decrypt information, is set up to be sent indirectly: all parties exchange information (the shared secret) that permits each party to derive the secret key material. In a key transport scheme, encrypted keying material chosen by the sender is transported to the receiver. Either symmetric-key or asymmetric-key techniques can be used in both schemes.[12]
The Diffie–Hellman key exchange and Rivest–Shamir–Adleman (RSA) are the two most widely used key exchange algorithms.[13] In 1976, Whitfield Diffie and Martin Hellman constructed the Diffie–Hellman algorithm, which was the first public key algorithm. The Diffie–Hellman key exchange protocol allows key exchange over an insecure channel by electronically generating a shared key between two parties. RSA, by contrast, is a form of asymmetric key system consisting of three steps: key generation, encryption, and decryption.[13]
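The Diffie–Hellman exchange can be sketched with a deliberately tiny textbook group (p = 23, g = 5); this is for intuition only, since real deployments use authenticated groups of 2048 bits or more:

```python
import secrets

# Textbook Diffie-Hellman over a tiny group -- illustrative only.
p, g = 23, 5

# Each party picks a private exponent and publishes g^x mod p.
a = secrets.randbelow(p - 2) + 1          # Alice's private value
b = secrets.randbelow(p - 2) + 1          # Bob's private value
A = pow(g, a, p)                          # Alice's public value
B = pow(g, b, p)                          # Bob's public value

# Each side combines its own private value with the other's public value:
# (g^b)^a = (g^a)^b = g^(ab) mod p, so both derive the same secret.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob
```

An eavesdropper sees only p, g, A and B; recovering the shared secret from those requires solving the discrete logarithm problem.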
Key confirmation provides assurance between the key confirmation recipient and provider that the shared keying materials are correct and established. The National Institute of Standards and Technology recommends integrating key confirmation into a key establishment scheme to validate its implementation.[12]
Management
Key management concerns the generation, establishment, storage, usage and replacement of cryptographic keys. A key management system (KMS) typically covers three steps: establishing, storing and using keys. The security of key generation, storage, distribution, use and destruction depends on successful key management protocols.[14]
Key vs password
A password is a memorized series of characters, including letters, digits, and other special symbols, that is used to verify identity. It is often produced by a human user or password-management software to protect personal and sensitive information or to generate cryptographic keys. Passwords are often created to be memorized by users and may contain non-random information such as dictionary words.[12] A key, on the other hand, can strengthen password protection by adding a cryptographic algorithm that is difficult to guess, or it can replace the password altogether. A key is generated from random or pseudorandom data and is often unreadable to humans.[15]
A password is less safe than a cryptographic key because of its low entropy and randomness and its human-readable form. However, in some applications, such as securing information on storage devices, the password may be the only secret available to the cryptographic algorithm. Thus, a deterministic algorithm called a key derivation function (KDF) uses the password to generate secure cryptographic keying material that compensates for the password's weaknesses. Various methods, such as adding a salt or key stretching, may be used in the generation.[12]
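The KDF-with-salt idea can be sketched with PBKDF2-HMAC-SHA256 from Python's standard library; the password, salt size, and iteration count below are illustrative choices:

```python
import hashlib
import os

# Derive a 256-bit key from a password with PBKDF2-HMAC-SHA256.
# The random salt defeats precomputed-table attacks, and the high
# iteration count (key stretching) slows brute-force guessing.
password = b"correct horse battery staple"   # illustrative password
salt = os.urandom(16)
key = hashlib.pbkdf2_hmac("sha256", password, salt, 600_000, dklen=32)

# The same password and salt always derive the same key...
assert key == hashlib.pbkdf2_hmac("sha256", password, salt, 600_000, dklen=32)

# ...while a fresh salt yields an unrelated key from the same password.
other = hashlib.pbkdf2_hmac("sha256", password, os.urandom(16), 600_000, dklen=32)
assert key != other
```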
See also
- Cryptographic key types
- Diceware
- EKMS
- Group key
- Keyed hash algorithm
- Key authentication
- Key derivation function
- Key distribution center
- Key escrow
- Key exchange
- Key generation
- Key management
- Key schedule
- Key server
- Key signature (cryptography)
- Key signing party
- Key stretching
- Key-agreement protocol
- glossary
- Password psychology
- Public key fingerprint
- Random number generator
- Session key
- Tripcode
- Machine-readable paper key
- Weak key
References
- ^ Piper, Fred (2002), "Cryptography", Encyclopedia of Software Engineering, doi:10.1002/0471028959.sof070, ISBN 978-0-471-02895-6, retrieved 2021-04-09
- ^ "What is a cryptographic key? | Keys and SSL encryption".
- ^ "Asymmetric-Key Cryptography". cs.cornell.edu. Retrieved 2021-04-02.
- ^ Chandra, S.; Paira, S.; Alam, S. S.; Sanyal, G. (2014). "A comparative survey of Symmetric and Asymmetric Key Cryptography". 2014 International Conference on Electronics, Communication and Computational Engineering (ICECCE). pp. 83–93. doi:10.1109/ICECCE.2014.7086640. ISBN 978-1-4799-5748-4. S2CID 377667.
- ^ Kumar, M. G. V.; Ragupathy, U. S. (March 2016). "A Survey on current key issues and status in cryptography". 2016 International Conference on Wireless Communications, Signal Processing and Networking (WiSPNET). pp. 205–210. doi:10.1109/WiSPNET.2016.7566121. ISBN 978-1-4673-9338-6. S2CID 14794991.
- ^ Mrdovic, S.; Perunicic, B. (September 2008). "Kerckhoffs' principle for intrusion detection". Networks 2008 - the 13th International Telecommunications Network Strategy and Planning Symposium. Vol. Supplement. pp. 1–8. doi:10.1109/NETWKS.2008.6231360. ISBN 978-963-8111-68-5.
- ^ "What is Key Length? - Definition from Techopedia". Techopedia.com. 16 November 2011. Retrieved 2021-05-01.
- ^ Hellman, Martin. "An Overview of Public Key Cryptography" (PDF). IEEE Communications Magazine.
- ^ "Toward a code-breaking quantum computer". MIT News | Massachusetts Institute of Technology. 2024-08-23. Retrieved 2025-05-14.
- ^ "Anatomy of a change – Google announces it will double its SSL key sizes". Naked Security. 2013-05-27. Archived from the original on 8 September 2023. Retrieved 2021-04-09.
- ^ Dang, Quynh (August 2012). "Recommendation for Applications Using Approved Hash Algorithms" (PDF). Retrieved 2021-04-02.
- ^ Turan, M. S.; Barker, E. B.; Burr, W. E.; Chen, L. (2010). Recommendation for password-based key derivation (PDF) (Report). doi:10.6028/NIST.SP.800-132. S2CID 56801929.
- ^ Yassein, M. B.; Aljawarneh, S.; Qawasmeh, E.; Mardini, W.; Khamayseh, Y. (2017). "Comprehensive study of symmetric key and asymmetric key encryption algorithms". 2017 International Conference on Engineering and Technology (ICET). pp. 1–7. doi:10.1109/ICEngTechnol.2017.8308215. ISBN 978-1-5386-1949-0. S2CID 3781693.
- ^ Barker, Elaine (January 2016). "Recommendation for Key Management" (PDF). Retrieved 2021-04-02.
- ^ Khillar, Sagar (29 April 2020). "Difference Between Encryption and Password Protection | Difference Between". Retrieved 2021-04-02.
Key (cryptography)
Fundamentals
Definition
A cryptographic key is a value used to control cryptographic operations, such as encryption, decryption, signature generation, or signature verification.[2] Typically, it consists of a fixed-length string of bits or bytes that serves as a parameter for a cryptographic algorithm.[4] In symmetric encryption, the key K is used in the encryption function to transform plaintext P into ciphertext, C = E(K, P), and in the decryption function to recover the original plaintext, P = D(K, C).[5] This notation illustrates the key's role in reversible data transformation within symmetric schemes, though keys are also employed in asymmetric cryptography for related purposes.[6]

Symmetric keys and private keys in asymmetric systems must remain secret to prevent unauthorized access to protected data, as their disclosure compromises the security of the associated algorithms.[4] They require sufficient length to resist brute-force attacks (for instance, providing at least 112 bits of security strength for many applications) and must be generated with high entropy to ensure unpredictability and randomness.[7][4] Unlike non-cryptographic keys, such as API keys used primarily for access control and authentication without transforming data via algorithms, cryptographic keys are integral to mathematical operations that secure or verify information.[8]
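The reversibility D(K, E(K, P)) = P can be illustrated with a toy XOR ("one-time pad") cipher in Python; this is a sketch for intuition only, since a real system would use a vetted algorithm such as AES:

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR is its own inverse: (P ^ K) ^ K == P, so the same
    # function serves as both E and D when the key matches.
    return bytes(d ^ k for d, k in zip(data, key))

plaintext = b"attack at dawn"
key = secrets.token_bytes(len(plaintext))  # key as long as the message

ciphertext = xor_cipher(key, plaintext)    # C = E(K, P)
recovered = xor_cipher(key, ciphertext)    # P = D(K, C)
assert recovered == plaintext
```

A one-time-pad key must be truly random, as long as the message, and never reused; violating any of these conditions breaks its security.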
Historical Development
The concept of a cryptographic key originated in ancient times with simple substitution ciphers, such as the Caesar cipher employed by Julius Caesar in the 1st century BCE, where the key was a fixed numerical shift in the alphabet, typically by three positions, to encode military messages.[9] This monoalphabetic approach relied on secrecy of the shift value as the key, marking an early instance of keys as secret parameters for encryption.[10] By the 16th century, advancements led to polyalphabetic ciphers like the Vigenère cipher, introduced by Blaise de Vigenère, which used a repeating keyword to select different Caesar shifts for each letter, enhancing security through a longer effective key length compared to single-shift methods.[11][12]

During World War II, mechanical rotor machines represented a significant evolution in key usage, exemplified by the German Enigma machine, which employed sets of rotating wheels (rotors) configurable daily with specific wiring, initial positions, and plugboard settings as the shared key among operators to scramble messages across millions of possible configurations.[13] These settings functioned as the key, changed frequently to maintain secrecy, and enabled the encryption of vast military communications until Allied cryptanalysts exploited patterns in key management to break the system.[14]

The advent of digital computing in the 1970s shifted cryptographic keys toward algorithmic standards, beginning with the Data Encryption Standard (DES) adopted by the U.S. National Bureau of Standards in 1977, which utilized fixed 56-bit keys for symmetric encryption of 64-bit blocks, balancing computational feasibility with security for early government and commercial applications.[15] Concurrently, public-key cryptography emerged with the Diffie–Hellman key exchange protocol published in 1976, introducing asymmetric keys where public parameters facilitated secure key agreement without prior shared secrets, followed by the RSA algorithm in 1977, which employed large prime-based key pairs for both encryption and digital signatures.[16][17]

In the modern era, the Advanced Encryption Standard (AES), standardized by NIST in 2001, superseded DES with variable key sizes of 128, 192, or 256 bits for symmetric block encryption, addressing growing computational threats and becoming the de facto global standard for data protection.[18] Key lengths evolved further amid export restrictions; early Secure Sockets Layer (SSL) implementations in the 1990s were often limited to 40-bit keys for international use, proving vulnerable to brute-force attacks and prompting transitions to stronger sizes.[19] By the 2010s, concerns over quantum computing threats intensified following Peter Shor's 1994 algorithm, which demonstrated the potential to factor large numbers efficiently on quantum hardware, endangering asymmetric keys like RSA and driving research into quantum-resistant alternatives.[20] As of 2024, NIST guidelines approve AES with key sizes of 128, 192, or 256 bits, stating that even 128-bit keys provide sufficient security against known quantum attacks for the foreseeable future.[21] In August 2024, NIST released the first three finalized post-quantum cryptographic standards, FIPS 203 (ML-KEM for key encapsulation), FIPS 204 (ML-DSA for digital signatures), and FIPS 205 (SLH-DSA for digital signatures), providing quantum-resistant alternatives to traditional asymmetric keys.[22]
Types of Cryptographic Keys
Symmetric Keys
Symmetric keys are cryptographic keys utilized in symmetric cryptography, where a single secret key is shared between communicating parties and employed for both encryption and decryption processes. This key must remain confidential and is not disclosed publicly to ensure security.[23] Symmetric key algorithms apply the same key to perform an operation and its inverse, such as transforming plaintext to ciphertext and vice versa.[24] These keys excel in scenarios requiring high-speed processing, making them ideal for bulk data encryption where efficiency is paramount. However, their use necessitates robust mechanisms for secure key distribution, as the shared nature of the key introduces risks if intercepted.[25]

Common implementations include block ciphers like the Advanced Encryption Standard (AES), a Federal Information Processing Standard that processes data in 128-bit blocks using keys of 128, 192, or 256 bits, often in modes such as Cipher Block Chaining (CBC) for confidentiality or Galois/Counter Mode (GCM) for authenticated encryption.[18][26][27] Stream ciphers, such as ChaCha20, produce a continuous keystream from the key to XOR with the plaintext, offering strong performance in software implementations.[28] An example of a deprecated stream cipher is RC4, which was widely used but prohibited in protocols like TLS after 2015 due to identified weaknesses.[29]

From a security perspective, symmetric keys are susceptible to compromise; if the key is exposed during sharing or storage, an attacker can decrypt all associated data, undermining the entire protection scheme. They also lack inherent non-repudiation, as possession of the key allows any holder to create or verify messages without proving origin. To mitigate distribution vulnerabilities, symmetric keys are frequently integrated into hybrid cryptosystems, where asymmetric key pairs enable secure initial key exchange.
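The stream-cipher pattern described here, a key-driven keystream XORed with the plaintext, can be sketched with a toy construction built from SHA-256 in counter mode; the hash-based keystream is an illustrative stand-in, not a real cipher such as ChaCha20:

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy keystream generator: hash key || nonce || counter into
    pseudorandom blocks. Illustrative only -- real systems use a
    vetted stream cipher such as ChaCha20 or AES in CTR/GCM mode."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    # XOR plaintext with the keystream; the identical call decrypts.
    ks = keystream(key, nonce, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

key, nonce = b"\x01" * 32, b"\x02" * 12
msg = b"stream ciphers XOR a keystream with the plaintext"
ct = encrypt(key, nonce, msg)
assert encrypt(key, nonce, ct) == msg  # same key and nonce invert it
```

As with real stream ciphers, reusing a (key, nonce) pair across messages would leak the XOR of the plaintexts.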
Asymmetric Key Pairs
Asymmetric key pairs, also known as public-key pairs, form the core of public-key cryptography, consisting of two mathematically related components: a public key that can be freely distributed to anyone, and a private key that must remain secret with its owner.[30] These keys are generated together using algorithms grounded in computationally difficult mathematical problems, such as the integer factorization problem for RSA or the elliptic curve discrete logarithm problem for schemes like ECDSA.[31][32] The linkage ensures that operations performed with one key cannot be easily reversed without the other, allowing secure communication without the need to share a secret beforehand.[33]

In typical usage, the public key is employed by others to encrypt messages intended for the key owner or to verify digital signatures created by that owner, while the private key is used solely by the owner to decrypt those messages or to generate the signatures.[30] This separation of roles enables non-repudiation and confidentiality without direct secret sharing.[33] Prominent examples include the RSA algorithm, introduced in 1977 by Ron Rivest, Adi Shamir, and Leonard Adleman, which relies on the difficulty of factoring the product of two large primes and typically uses key sizes of 2048 bits or larger for adequate security.[31][34] Another key example is elliptic curve cryptography (ECC), proposed in the 1980s but widely adopted in the 2000s following endorsements like the NSA's Suite B standards in 2005, where a 256-bit key provides security comparable to a 3072-bit RSA key due to the efficiency of elliptic curve mathematics.[35][36]

The primary advantages of asymmetric key pairs lie in their ability to facilitate secure key exchange for symmetric encryption, such as establishing shared session keys over insecure channels, and to support digital signatures for authentication and integrity verification.[33] They serve as the foundational building block for public key infrastructure (PKI), which manages the distribution, validation, and revocation of public keys through certificates to enable scalable trust in digital systems.[37] As quantum computing advances threaten classical schemes like RSA and ECC, post-quantum asymmetric key pairs based on lattice problems (e.g., ML-KEM for key encapsulation and ML-DSA for signatures), hash functions (e.g., SLH-DSA for signatures), or code-based problems (e.g., HQC for key encapsulation) are emerging, with NIST finalizing initial standards in August 2024 and selecting additional algorithms such as HQC for standardization in March 2025 to ensure long-term security.[38][3]
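The mathematical relationship between the two keys can be seen in a textbook RSA sketch with deliberately tiny primes (61 and 53, the classic worked example); real RSA uses 2048-bit or larger moduli plus padding such as OAEP:

```python
# Textbook RSA with tiny primes -- this shows the key-pair
# mathematics only, and is trivially breakable at this size.
p, q = 61, 53
n = p * q                  # public modulus: 3233
phi = (p - 1) * (q - 1)    # Euler totient: 3120
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent: 2753, since e*d = 1 (mod phi)

m = 65                     # message encoded as an integer < n
c = pow(m, e, n)           # encrypt with the public key (n, e)
recovered = pow(c, d, n)   # decrypt with the private key (n, d)
assert recovered == m
```

Security rests on the fact that deriving d from (n, e) requires factoring n, which is infeasible at real key sizes.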
Applications and Purposes
In Encryption and Decryption
Cryptographic keys play a central role in ensuring confidentiality by enabling the transformation of plaintext data into ciphertext through encryption, and the reversal of this process via decryption, thereby protecting sensitive information from unauthorized access whether stored at rest or transmitted over networks.[26] In symmetric encryption, a single shared key is used for both encrypting the data at the sender's end and decrypting it at the recipient's end, facilitating efficient bulk data protection in scenarios requiring mutual secrecy.[39] For instance, in the Transport Layer Security (TLS) protocol, symmetric keys, such as those used with AES, are generated during the handshake to encrypt session data, ensuring secure communication over the internet.[39]

In asymmetric encryption, also known as public-key cryptography, a public key is employed to encrypt the data, while the corresponding private key is used solely by the recipient for decryption, allowing secure transmission without prior key sharing.[40] This approach is exemplified in Pretty Good Privacy (PGP) for email encryption, where the sender uses the recipient's public key to encrypt a symmetric session key, which then encrypts the message body, combining efficiency with secure key distribution.[40]

Keys interact with block cipher modes of operation to process data in fixed-size blocks, addressing issues like data expansion or patterns in encryption output; for example, the Electronic Codebook (ECB) mode applies the key independently to each block, while Counter (CTR) mode uses the key to generate a keystream for XORing with plaintext, providing better security against block repetition attacks without requiring padding in all cases.[26] Padding schemes, such as PKCS#7, may be applied in certain modes to ensure plaintext aligns with block sizes when using the key for encryption.[26]

In practical applications, full-volume encryption tools like Microsoft's BitLocker utilize symmetric AES keys in XTS mode to protect data on storage devices, rendering files inaccessible without the decryption key.[41] Similarly, Virtual Private Networks (VPNs) employing IPsec with the Encapsulating Security Payload (ESP) protocol rely on symmetric keys to encrypt IP packets, safeguarding data in transit across untrusted networks.[42]
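The PKCS#7 padding rule mentioned above is simple enough to sketch directly: append n bytes, each of value n, so the plaintext fills whole blocks. This is a minimal sketch; production code should use a maintained library and constant-time unpadding to avoid padding-oracle leaks:

```python
def pkcs7_pad(data: bytes, block_size: int = 16) -> bytes:
    """Append n bytes of value n so len(result) is a multiple of the
    block size; an already-aligned input gains a full padding block."""
    n = block_size - (len(data) % block_size)
    return data + bytes([n]) * n

def pkcs7_unpad(data: bytes) -> bytes:
    """Strip and validate PKCS#7 padding."""
    n = data[-1]
    if n < 1 or n > len(data) or data[-n:] != bytes([n]) * n:
        raise ValueError("invalid PKCS#7 padding")
    return data[:-n]

padded = pkcs7_pad(b"YELLOW SUBMARINE!")      # 17 bytes -> 32 bytes
assert len(padded) % 16 == 0
assert pkcs7_unpad(padded) == b"YELLOW SUBMARINE!"
```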
In Authentication and Signing
In authentication and signing, cryptographic keys play a pivotal role in verifying the authenticity, integrity, and origin of data, ensuring that messages or entities cannot be forged or altered without detection. Asymmetric keys, in particular, enable digital signatures where the private key signs a message digest (typically a hash of the data), producing a signature that the corresponding public key can verify, thus providing assurance that the data has not been tampered with and originates from the claimed signer.[43] This mechanism contrasts with symmetric approaches but complements the hybrid systems referenced in broader protocols.

Digital signatures rely on algorithms such as the Digital Signature Algorithm (DSA), which uses a private key to generate a signature over a hash of the message, verifiable by the public key to confirm integrity and authenticity.[44] Although DSA is now deprecated for generating new signatures under current standards, it exemplifies the process in which the signer's private key produces a pair of integers (r, s) from the hash, and verification recomputes and matches them using the public key.[43] More modern variants include the Edwards-curve Digital Signature Algorithm (EdDSA), which signs a hash with the private key (derived from a seed and hashed for security) and verifies with the public key point on the Edwards curve, offering high speed and resistance to side-channel attacks.[45]

For message authentication without full non-repudiation, symmetric keys are employed in constructs like the Hash-based Message Authentication Code (HMAC), where a shared secret key is combined with a hash function (e.g., SHA-256) to produce a tag authenticating the message's integrity and origin against an attacker lacking the key.[46] The HMAC computation involves inner and outer padded hashes of the key and message, ensuring that any alteration invalidates the tag upon recomputation and verification with the same key.[46]

In Public Key Infrastructure (PKI), keys within X.509 certificates bind a public key to an entity's identity via a digital signature from a trusted Certificate Authority (CA), enabling entity authentication by verifying the certificate chain and using the public key for subsequent signature validations.[47] The certificate's subjectPublicKeyInfo field specifies the public key and algorithm, while extensions like keyUsage indicate permitted uses such as digitalSignature for authentication, ensuring the key's role in verifying signatures during protocols like TLS handshakes.[47]

Practical examples illustrate these roles: in Secure Shell (SSH) logins, a client authenticates using a key pair in which the private key signs a challenge from the server, and the server verifies the signature with the pre-registered public key to grant access without passwords.[48] Similarly, code signing in software distribution, such as Apple's notarization process, uses a developer's private key to sign binaries with a digital signature over their hash, allowing verifiers to confirm integrity and origin via the public key in a Developer ID certificate before execution.[49]

Asymmetric keys in these contexts provide non-repudiation: a valid digital signature proves the signer's identity and intent, preventing denial of authorship, since only the private key holder could have produced it, as validated by third-party verification.[50] This property is foundational in legal and secure communications, supported by the mathematical separation of the signing key from the verification key.[51]
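The HMAC flow described above maps directly onto Python's standard hmac module; the key and message below are illustrative placeholders:

```python
import hashlib
import hmac
import secrets

key = secrets.token_bytes(32)            # shared secret key
message = b"amount=100&to=alice"

# The sender computes a tag over the message with the shared key.
tag = hmac.new(key, message, hashlib.sha256).digest()

# The receiver recomputes the tag and compares in constant time.
assert hmac.compare_digest(tag, hmac.new(key, message, hashlib.sha256).digest())

# Any alteration of the message invalidates the tag.
tampered = b"amount=999&to=mallory"
assert not hmac.compare_digest(tag, hmac.new(key, tampered, hashlib.sha256).digest())
```

compare_digest is used instead of == to avoid leaking information through comparison timing.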
Security Considerations
Key Sizes and Strength
In cryptography, the size of a key is typically measured in bits, representing the length of the binary string used in the algorithm. For symmetric encryption algorithms like AES, a larger key size exponentially increases the computational effort required for a brute-force attack, where an adversary exhaustively tries all possible keys; for instance, breaking a 128-bit key would demand approximately 2^128 operations, which is infeasibly large with current technology.[25] Similarly, in asymmetric cryptography, key size affects resistance to attacks like integer factorization for RSA, though the relationship is not linear due to the underlying mathematical hardness assumptions.[25] The strength of a cryptographic key extends beyond its size to encompass the algorithm's overall resistance to cryptanalytic attacks, including weaknesses in the design that could allow faster exploitation than brute force. For example, the Data Encryption Standard (DES) with its 56-bit key was considered insecure by the late 1990s; in 1998, the Electronic Frontier Foundation's DES cracker hardware broke a DES-encrypted message in under three days by brute force, highlighting how short keys combined with maturing hardware render algorithms obsolete.[52] In contrast, modern standards like AES-256 maintain high strength even against advanced attacks, providing 256 bits of security for symmetric operations against classical computers, far surpassing DES.[25] Key strength is often quantified in "bits of security," a metric estimating the resources needed to break the key, accounting for the most efficient known attacks rather than just the nominal bit length.[25] Current recommendations from the National Institute of Standards and Technology (NIST), based on classical security assessments in SP 800-57 Revision 5 (2020) and SP 800-131A Revision 2 (2019), specify minimum key sizes to achieve desired security levels against classical threats through at least 2030. 
For symmetric algorithms, NIST advises a minimum of 128 bits (e.g., AES-128), with 256 bits recommended for long-term confidentiality; for RSA, at least 2048 bits (providing 112 bits of security) until 2030, transitioning to 3072 bits or more (128 bits of security) thereafter; and for elliptic curve cryptography (ECC), 256 bits (128 bits of security).[25][53] These guidelines derive from assessments of attack costs, where 128 bits of security is deemed adequate for most applications against classical computers.[25] However, with the finalization of post-quantum cryptography (PQC) standards in August 2024 (FIPS 203 for ML-KEM key encapsulation, FIPS 204 for ML-DSA signatures, and FIPS 205 for SLH-DSA signatures), NIST recommends transitioning away from classical asymmetric algorithms like RSA and ECC, which are vulnerable to Shor's algorithm, toward PQC alternatives by 2035 to maintain security against quantum threats. For symmetric keys, Grover's algorithm halves the effective security, so AES-256 provides 128 bits of post-quantum security, while AES-128 offers only 64 bits and should be avoided for long-term use. PQC key sizes vary; for example, ML-KEM at security level 1 (equivalent to 128 bits) uses an 800-byte public key.[38][3]

Key strength must be evaluated against various attack models, including brute force (exhaustive search), dictionary attacks (targeting predictable keys), and side-channel attacks (exploiting implementation leaks such as timing or power consumption).
Effective bits of security adjust the nominal size based on these vulnerabilities; for example, a 2048-bit RSA key offers approximately 112 bits of security against the best-known factorization methods, such as the general number field sieve.[25][21] Emerging quantum threats, particularly Grover's algorithm for symmetric keys, necessitate adjustments like using AES-256 for 128-bit post-quantum security, while Shor's algorithm renders current asymmetric keys like RSA vulnerable, prompting a shift to post-quantum algorithms such as those in FIPS 203-205. High-quality randomness during key generation is essential to realize these strength levels, as poor entropy can reduce effective security regardless of size.[21][25]

The following table compares recommended minimum key sizes across common algorithms for achieving at least 128 bits of classical security, per NIST guidelines (SP 800-57 Rev. 5 and SP 800-131A Rev. 2); post-quantum equivalents are noted separately:

| Algorithm Type | Example Algorithm | Minimum Key Size (bits) | Effective Security (bits, classical) | Notes |
|---|---|---|---|---|
| Symmetric | AES | 256 | 256 | 128 bits minimum for short-term; AES-256 recommended for post-quantum (128 bits quantum security via resistance to Grover's algorithm).[25][21] |
| Asymmetric (RSA) | RSA | 3072 | 128 | 2048 bits (112 bits) acceptable until 2030; deprecation recommended post-2030 and with PQC transition by 2035.[53] |
| Asymmetric (DH) | Diffie-Hellman | 3072 | 128 | Finite-field variant; ECC alternative preferred; both vulnerable to quantum, transition to PQC key agreement (e.g., ML-KEM, public key 800 bytes for level 1).[25][38] |
| Elliptic Curve | ECDSA/ECDH | 256 | 128 | Curve like NIST P-256; more efficient than RSA but vulnerable to Shor's; PQC alternatives like ML-DSA preferred.[25] |
| Post-Quantum (KEM) | ML-KEM (Kyber) | N/A (byte-based) | 128 | Level 1: public key 800 bytes; standardized in FIPS 203 (2024).[38] |
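The scale behind these figures is easy to check with back-of-envelope arithmetic; the assumed attacker rate of 10^18 trials per second is a hypothetical value chosen only for illustration:

```python
# Back-of-envelope brute-force cost under an assumed attacker capability.
TRIALS_PER_SECOND = 10**18
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_search(bits: int) -> float:
    """Expected years to find a key by exhaustive search
    (on average, half the keyspace must be tried)."""
    expected_trials = 2 ** (bits - 1)
    return expected_trials / TRIALS_PER_SECOND / SECONDS_PER_YEAR

# 56-bit DES falls in a fraction of a second at this rate, while a
# 128-bit key takes on the order of 10**12 years.
print(years_to_search(56), years_to_search(128))
```

Each additional bit doubles the cost, which is why the jump from 56 to 128 bits moves the search from trivial to astronomically infeasible.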
Entropy and Randomness Requirements
In cryptography, entropy quantifies the degree of randomness or unpredictability in a data source, typically measured in bits, where higher values indicate greater uncertainty about the data's value.[54] For cryptographic keys, full entropy is essential, meaning the minimum entropy of the key material must equal its bit length to ensure it cannot be feasibly guessed or predicted by an adversary.[55] For instance, an AES-256 key requires 256 bits of full entropy to provide its intended security level, as lower entropy would reduce the effective key space and increase vulnerability to brute-force attacks.[56] This requirement aligns with key sizes serving as minimum entropy thresholds, where the security strength of a key is bounded by its entropy rather than just its length.

Entropy sources for key generation include hardware random number generators (RNGs), such as Intel's RDRAND instruction, which leverages on-chip thermal noise to produce high-quality random bits suitable for cryptographic use.[57] Software-based sources, like Linux's /dev/urandom, draw from system entropy pools seeded by events such as disk I/O and interrupts, providing a convenient interface for generating pseudorandom bits when properly initialized.[58] However, predictable seeds must be avoided in all cases, as they can compromise the output's randomness and lead to key reuse or predictability across sessions.[55]

Standards such as NIST Special Publication 800-90A, released in 2012, specify requirements for deterministic random bit generators (DRBGs) and emphasize that entropy inputs must meet or exceed the security strength needed for the application.[59] This standard highlights risks from weak RNGs, exemplified by the 2008 Debian OpenSSL vulnerability, where a code change reduced the entropy pool to just 15-18 bits, enabling attackers to predict SSH and SSL keys generated during that period.[60] To validate RNG outputs, test suites like Dieharder, which includes over 30 statistical tests for detecting non-random patterns, and the NIST Statistical Test Suite (STS), comprising 15 tests for cryptographic randomness, are commonly employed.[61][62]

For enhanced true randomness, quantum random number generators (QRNGs) exploit quantum phenomena like photon detection statistics, offering provably unpredictable entropy immune to classical prediction.[63] Commercial adoption of QRNGs, such as those from ID Quantique, accelerated post-2010 with integrations into secure systems for high-stakes applications like key generation in financial and government sectors.[64]
Methods for Generating Keys
Cryptographic keys are typically generated using cryptographically secure pseudorandom number generators (CSPRNGs) to ensure the high entropy and unpredictability essential for security.[7] The process begins by seeding the CSPRNG from a high-entropy source, such as hardware noise or system events, followed by expansion to the desired key length through iterative cryptographic operations such as hashing or block-cipher modes.[7] Notable CSPRNG designs include Yarrow, which combines SHA-1 hashing for entropy accumulation with Triple DES encryption in counter mode for output generation, and was originally developed for software implementations.[65] Fortuna, an evolution of Yarrow, improves resilience by pooling entropy from multiple sources and reseeding periodically to resist attacks based on biased inputs.
Hardware-based methods provide additional security by isolating key generation within tamper-resistant environments. Trusted Platform Modules (TPMs) are dedicated secure processors that generate keys using internal random number generators compliant with standards such as NIST SP 800-90, ensuring keys never leave the module unprotected.[66] Similarly, smart cards use on-board cryptographic coprocessors to produce keys, often incorporating physical entropy sources such as ring oscillators, which prevents exposure of the key during creation.
Software libraries facilitate key generation in application contexts while leveraging the underlying system CSPRNG. The OpenSSL toolkit's rand command, for instance, draws from platform-specific entropy pools to produce random bytes suitable for keys, as in openssl rand -hex 32 for a 256-bit AES key.[67] Python's secrets module offers functions such as secrets.token_bytes(32) for generating secure random data; it is explicitly designed for cryptographic use, unlike the weaker random module.[68]
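As a minimal sketch of the library calls named above, the following Python fragment draws a 256-bit key from the operating system's CSPRNG via the secrets module; the hex form is comparable in shape to the output of openssl rand -hex 32:

```python
import secrets

# 32 random bytes = a 256-bit key, e.g. for AES-256.
key = secrets.token_bytes(32)
assert len(key) == 32

# Hex-encoded form: 32 random bytes rendered as 64 lowercase hex digits,
# comparable to the output of `openssl rand -hex 32`.
key_hex = secrets.token_hex(32)
assert len(key_hex) == 64
assert all(c in "0123456789abcdef" for c in key_hex)
```

Both calls ultimately read from the platform's entropy source (e.g. /dev/urandom on Linux), so no explicit seeding is needed or possible.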
Best practices emphasize generating keys in isolated, secure environments, such as dedicated hardware or virtualized sandboxes, to minimize side-channel risks, and immediately zeroizing temporary buffers after use so that key material does not linger in memory.[7] Direct incorporation of user-provided input as entropy should be avoided because of its potential predictability; system-collected entropy sources should be relied on instead. Keys must possess adequate entropy, typically at least the bit length of the key itself, to resist brute-force attacks.[7]
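The zeroization practice can be sketched in Python as follows; with_ephemeral_key is a hypothetical helper, not a standard-library function, and in CPython this is best-effort only, since the interpreter may make internal copies of the data:

```python
import secrets

def with_ephemeral_key(use, nbytes=32):
    # Hold the key in a mutable bytearray so it can be overwritten in place;
    # immutable bytes objects cannot be scrubbed from memory after use.
    key = bytearray(secrets.token_bytes(nbytes))
    try:
        # The callback must not retain a copy of the key material.
        return use(key)
    finally:
        # Best-effort zeroization: overwrite the buffer before releasing it.
        for i in range(len(key)):
            key[i] = 0

# Example: hand the key to a consumer that only reports its length.
result = with_ephemeral_key(lambda k: len(k))
assert result == 32
```

Languages with manual memory management (e.g. C with explicit_bzero) can give stronger guarantees than this sketch.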
A practical example is the generation of AES keys for disk encryption in the Linux Unified Key Setup (LUKS) format, where a random 256-bit master key is created using the kernel's CSPRNG during device formatting with cryptsetup luksFormat, enabling secure storage of encrypted data.[69]
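The LUKS design can be illustrated with a simplified Python sketch (not the real on-disk format): a random master key protects the data, and the master key itself is wrapped under a key derived from the user's passphrase, so the passphrase can be changed without re-encrypting the disk. The XOR wrap below is a stand-in for the authenticated encryption LUKS actually uses, and the passphrase and iteration count are illustrative values:

```python
import hashlib
import os
import secrets

# 256-bit master key drawn from the CSPRNG, as cryptsetup does at format time.
master_key = secrets.token_bytes(32)

# Key derived from the user's passphrase with PBKDF2 (LUKS1 uses PBKDF2;
# LUKS2 defaults to Argon2).
passphrase = b"example passphrase"
salt = os.urandom(16)
unlock_key = hashlib.pbkdf2_hmac("sha256", passphrase, salt, 100_000, dklen=32)

# Wrap the master key under the passphrase-derived key.
# (XOR stands in for real authenticated encryption here.)
wrapped = bytes(a ^ b for a, b in zip(master_key, unlock_key))

# Unlocking: re-derive the key from the passphrase and unwrap the master key.
recovered = bytes(a ^ b for a, b in zip(wrapped, unlock_key))
assert recovered == master_key
```

The indirection is the point: changing the passphrase only re-wraps the 32-byte master key, never the encrypted data itself.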
