Code signing
Code signing is the process of digitally signing executables and scripts to confirm the software author and guarantee that the code has not been altered or corrupted since it was signed. The process employs a cryptographic hash to validate authenticity and integrity.[1] Code signing was invented in 1995 by Michael Doyle as part of the Eolas WebWish browser plug-in, which enabled the use of public-key cryptography to sign downloadable Web app program code with a private key, so the plug-in's code interpreter could then use the corresponding public key to authenticate the code before allowing it access to the interpreter's APIs.[2][3]
Code signing can provide several valuable features. The most common use of code signing is to provide security when deploying; in some programming languages, it can also be used to help prevent namespace conflicts. Almost every code signing implementation will provide some sort of digital signature mechanism to verify the identity of the author or build system, and a checksum to verify that the object has not been modified. It can also be used to provide versioning information about an object or to store other metadata about an object.[4]
The efficacy of code signing as an authentication mechanism for software depends on the security of underpinning signing keys. As with other public key infrastructure (PKI) technologies, the integrity of the system relies on publishers securing their private keys against unauthorized access. Keys stored in software on general-purpose computers are susceptible to compromise. Therefore, it is more secure, and best practice, to store keys in secure, tamper-proof, cryptographic hardware devices known as hardware security modules or HSMs.[5]
Providing security
Many code signing implementations provide a way to sign code using a pair of keys, one public and one private, similar to the process employed by TLS or SSH. For example, in the case of .NET, the developer uses a private key to sign their libraries or executables each time they build. This key is unique to a developer or group, or sometimes to a single application or object. The developer can either generate this key on their own or obtain one from a trusted certificate authority (CA).[6]
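As a concrete illustration of the pair-of-keys scheme, the sketch below signs a blob of code with a toy RSA private key and verifies it with the matching public key. The key size is deliberately far too small for real use; production signing would rely on a vetted cryptographic library or an HSM, never hand-rolled arithmetic like this.

```python
import hashlib

# Toy hash-then-sign scheme with textbook RSA, for illustration only.
# Real code signing uses 2048-bit+ keys via a vetted crypto library or HSM.
p, q = 1000003, 1000033            # small primes (insecure by design)
n = p * q                          # public modulus
e = 65537                          # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent

def sign(code: bytes) -> int:
    digest = int.from_bytes(hashlib.sha256(code).digest(), "big") % n
    return pow(digest, d, n)       # transform the hash with the private key

def verify(code: bytes, signature: int) -> bool:
    digest = int.from_bytes(hashlib.sha256(code).digest(), "big") % n
    return pow(signature, e, n) == digest   # recover and compare the hash

binary = b"example executable bytes"
sig = sign(binary)
print(verify(binary, sig))              # True: untouched code verifies
print(verify(binary + b"\x00", sig))    # False: any modification breaks it
```

The same shape underlies real implementations: hash the artifact, transform the hash with the private key, and let any holder of the public key check the result.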
Code signing is particularly valuable in distributed environments, where the source of a given piece of code may not be immediately evident; examples include Java applets, ActiveX controls, and other active web and browser scripting code. Another important usage is to safely provide updates and patches to existing software.[7] Windows, Mac OS X, and most Linux distributions provide updates using code signing to ensure that it is not possible for others to maliciously distribute code via the patch system. It allows the receiving operating system to verify that an update is legitimate, even if it was delivered by third parties or on physical media (disks).[8]
Code signing is used on Windows and Mac OS X to authenticate software on first run, ensuring that it has not been maliciously tampered with by a third-party distributor or download site. This form of code signing is not used on Linux because of that platform's decentralized nature: the package manager is the predominant mode of distribution for all forms of software (not just updates and patches), and the open-source model allows direct inspection of the source code if desired. Debian-based Linux distributions (among others) validate downloaded packages using public-key cryptography.[9]
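The repository-style validation used by Debian-like systems can be sketched as follows. The idea (in the spirit of SecureApt) is that the repository index listing each package's expected hash is what gets signed by the distribution's key; verifying a package then reduces to a hash comparison against the trusted index. The package name and data here are made up, and only the comparison step is shown.

```python
import hashlib

# Sketch of repository-style package validation: the index maps package
# names to expected hashes, and the index itself is assumed to have been
# verified against the repository's signature beforehand.
package = b"...package contents..."
signed_index = {
    "example-tool_1.0.deb": hashlib.sha256(package).hexdigest(),
}

def package_ok(name: str, data: bytes, index: dict) -> bool:
    return hashlib.sha256(data).hexdigest() == index.get(name)

print(package_ok("example-tool_1.0.deb", package, signed_index))             # True
print(package_ok("example-tool_1.0.deb", package + b"tamper", signed_index)) # False
```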
Trusted identification using a certificate authority (CA)
The public key used to authenticate the code signature should be traceable back to a trusted root certificate authority (CA), preferably using a secure public key infrastructure (PKI). This does not ensure that the code itself can be trusted, only that it comes from the stated source (or more explicitly, from a particular private key).[10] A CA provides a root trust level and is able to assign trust to others by proxy. If a user trusts a CA, then the user can presumably trust the legitimacy of code that is signed with a key generated by that CA or one of its proxies. Many operating systems and frameworks contain built-in trust for one or more certification authorities. It is also commonplace for large organizations to implement a private CA, internal to the organization, which provides the same features as public CAs but is trusted only within the organization.
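The "traceable back to a trusted root" rule amounts to walking the issuer chain. The sketch below models certificates as plain dicts and walks from a leaf certificate up to a pre-trusted root; all names are hypothetical, and a real verifier would additionally check each link's signature, validity dates, and revocation status.

```python
# Minimal sketch of certificate-chain trust walking. Certificates are
# modeled as dicts with "subject" and "issuer" fields only.
TRUSTED_ROOTS = {"Example Root CA"}

def chain_to_trusted_root(cert: dict, store: dict) -> bool:
    """Follow issuer links until a trusted root (or a dead end) is reached."""
    seen = set()
    while cert["issuer"] not in seen:       # guard against issuer loops
        if cert["issuer"] in TRUSTED_ROOTS:
            return True
        seen.add(cert["issuer"])
        cert = store.get(cert["issuer"])    # fetch the issuing certificate
        if cert is None:
            return False
    return False

store = {
    "Example Intermediate CA": {"subject": "Example Intermediate CA",
                                "issuer": "Example Root CA"},
}
leaf = {"subject": "ACME Software Inc.", "issuer": "Example Intermediate CA"}
print(chain_to_trusted_root(leaf, store))   # True: leaf -> intermediate -> root
```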
Extended validation (EV) code signing
Extended validation (EV) code signing certificates are subject to additional validation and technical requirements. These guidelines are based on the CA/B Forum's Baseline Requirements and Extended Validation Guidelines. In addition to validation requirements specific to EV, the EV code signing guidelines stipulate that "the Subscriber's private key is generated, stored and used in a crypto module that meets or exceeds the requirements of FIPS 140-2 level 2."[11]
Certain applications, such as signing Windows 10 kernel-mode drivers, require an EV code signing certificate.[12] Additionally, Microsoft's IEBlog states that Windows programs "signed by an EV code signing certificate can immediately establish reputation with SmartScreen reputation services even if no prior reputation exists for that file or publisher."[13]
Sample EV code signing certificate
This is an example of a decoded EV code signing certificate used by SSL.com to sign software. SSL.com EV Code Signing Intermediate CA RSA R3 is shown as the Issuer's commonName, identifying this as an EV code signing certificate. The certificate's Subject field describes SSL Corp as an organization. Code Signing is shown as the sole X509v3 Extended Key Usage.
Certificate:
    Data:
        Version: 3 (0x2)
        Serial Number:
            59:4e:2d:88:5a:2c:b0:1a:5e:d6:4c:7b:df:35:59:7d
        Signature Algorithm: sha256WithRSAEncryption
        Issuer:
            commonName          = SSL.com EV Code Signing Intermediate CA RSA R3
            organizationName    = SSL Corp
            localityName        = Houston
            stateOrProvinceName = Texas
            countryName         = US
        Validity
            Not Before: Aug 30 20:29:13 2019 GMT
            Not After : Nov 12 20:29:13 2022 GMT
        Subject:
            1.3.6.1.4.1.311.60.2.1.3 = US
            1.3.6.1.4.1.311.60.2.1.2 = Nevada
            streetAddress       = 3100 Richmond Ave Ste 503
            businessCategory    = Private Organization
            postalCode          = 77098
            commonName          = SSL Corp
            serialNumber        = NV20081614243
            organizationName    = SSL Corp
            localityName        = Houston
            stateOrProvinceName = Texas
            countryName         = US
        Subject Public Key Info:
            Public Key Algorithm: rsaEncryption
                Public-Key: (2048 bit)
                Modulus:
                    00:c3:e9:ae:be:d7:a2:6f:2f:24 ...
                Exponent: 65537 (0x10001)
        X509v3 extensions:
            X509v3 Authority Key Identifier:
                keyid:36:BD:49:FF:31:2C:EB:AF:6A:40:FE:99:C0:16:ED:BA:FC:48:DD:5F
            Authority Information Access:
                CA Issuers - URI:http://www.ssl.com/repository/SSLcom-SubCA-EV-CodeSigning-RSA-4096-R3.crt
                OCSP - URI:http://ocsps.ssl.com
            X509v3 Certificate Policies:
                Policy: 2.23.140.1.3
                Policy: 1.2.616.1.113527.2.5.1.7
                Policy: 1.3.6.1.4.1.38064.1.3.3.2
                    CPS: https://www.ssl.com/repository
            X509v3 Extended Key Usage:
                Code Signing
            X509v3 CRL Distribution Points:
                Full Name:
                    URI:http://crls.ssl.com/SSLcom-SubCA-EV-CodeSigning-RSA-4096-R3.crl
            X509v3 Subject Key Identifier:
                EC:6A:64:06:26:A7:7A:69:E8:CC:06:D5:6F:FA:E1:C2:9A:29:79:DE
            X509v3 Key Usage: critical
                Digital Signature
    Signature Algorithm: sha256WithRSAEncryption
        17:d7:a1:26:58:31:14:2b:9f:3b ...
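One field of interest in dumps like the one above is the Validity window. The sketch below pulls the Not Before/Not After dates out of a small fragment of such text output and checks whether the certificate is currently valid; parsing textual `openssl x509 -text` output this way is fragile, and real tooling should use a proper X.509/ASN.1 parser instead.

```python
from datetime import datetime, timezone

# Illustrative check of the Validity window from a decoded certificate.
dump = """\
Validity
    Not Before: Aug 30 20:29:13 2019 GMT
    Not After : Nov 12 20:29:13 2022 GMT
"""

fields = {}
for line in dump.splitlines():
    if ":" in line:
        key, value = line.split(":", 1)   # split only at the first colon
        fields[key.strip()] = value.strip()

fmt = "%b %d %H:%M:%S %Y GMT"
not_before = datetime.strptime(fields["Not Before"], fmt).replace(tzinfo=timezone.utc)
not_after = datetime.strptime(fields["Not After"], fmt).replace(tzinfo=timezone.utc)

# The sample certificate expired in November 2022, so this prints False today.
print(not_before <= datetime.now(timezone.utc) <= not_after)
```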
Alternative to CAs
The other model is trust on first use, in which developers can choose to provide their own self-generated key. In this scenario, the user would normally have to obtain the public key in some fashion directly from the developer to verify the object is from them for the first time. Many code signing systems store the public key inside the signature. Some software frameworks and operating systems that check the code's signature before executing allow the user to choose to trust that developer from that point on, after the first run. An application developer can provide a similar system by including the public keys with the installer. The key can then be used to ensure that any subsequent objects that need to run, such as upgrades, plugins, or another application, are all verified as coming from that same developer.
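The trust-on-first-use behavior described above reduces to key pinning: record a publisher's key fingerprint the first time their code is seen, then reject anything later presented under a different key. The database layout and function name in this sketch are illustrative.

```python
import hashlib

# Trust-on-first-use sketch: pin a publisher's key fingerprint on first
# sight, then require every later object to match the pinned key.
def check_publisher(trust_db: dict, publisher: str, public_key: bytes) -> bool:
    fingerprint = hashlib.sha256(public_key).hexdigest()
    if publisher not in trust_db:
        trust_db[publisher] = fingerprint       # first run: pin this key
        return True
    return trust_db[publisher] == fingerprint   # later runs: must match the pin

db = {}
print(check_publisher(db, "example-dev", b"key-material-v1"))   # True (pinned)
print(check_publisher(db, "example-dev", b"key-material-v1"))   # True (same key)
print(check_publisher(db, "example-dev", b"a-different-key"))   # False (changed)
```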
Time-stamping
Time-stamping was designed to circumvent the trust warning that will appear in the case of an expired certificate. In effect, time-stamping extends the code trust beyond the validity period of a certificate.[14]
In the event that a certificate has to be revoked due to a compromise, a specific date and time of the compromising event will become part of the revocation record. In this case, time-stamping helps establish whether the code was signed before or after the certificate was compromised.[14]
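The two rules above can be sketched as simple date comparisons, using illustrative dates: a time-stamped signature remains verifiable after the certificate expires, provided the timestamp falls inside the validity window; and after a key compromise, only signatures time-stamped before the compromise date remain trusted.

```python
from datetime import datetime, timezone

# Sketch of time-stamp trust rules; all dates are illustrative.
not_before = datetime(2019, 8, 30, tzinfo=timezone.utc)    # cert Not Before
not_after = datetime(2022, 11, 12, tzinfo=timezone.utc)    # cert Not After
compromised_at = datetime(2022, 6, 1, tzinfo=timezone.utc) # revocation record

def timestamp_trusted(ts: datetime) -> bool:
    # Inside the certificate's validity window AND before the compromise.
    return not_before <= ts <= not_after and ts < compromised_at

ok = datetime(2021, 5, 1, tzinfo=timezone.utc)    # signed before the breach
bad = datetime(2022, 8, 2, tzinfo=timezone.utc)   # signed after the breach
print(timestamp_trusted(ok))    # True: trusted even though the cert has expired
print(timestamp_trusted(bad))   # False: signed after the key was compromised
```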
Developers need to sign their iOS and tvOS apps before running them on any real device and before uploading them to the App Store. This is needed to prove that the developer owns a valid Apple Developer ID. An application needs a valid profile or certificate so that it can run on the devices.[15]
Problems
Like any security measure, code signing can be defeated. Users can be tricked into running unsigned code, or even into running code that fails validation, and the system remains secure only as long as the private key remains private.[16][17]
It is also important to note that code signing does not protect the end user from malicious activity or unintentional software bugs by the software author; it merely ensures that the software has not been modified by anyone other than the author. Sometimes, sandbox systems do not accept certificates because of an incorrect time-stamp or excessive RAM usage.
Implementations
Microsoft implements a form of code signing (based on Authenticode) for Microsoft-tested drivers. Since drivers run in the kernel, they can destabilize the system or open it to security holes. For this reason, Microsoft tests drivers submitted to its WHQL program. After a driver has passed, Microsoft signs that version of the driver as being safe. On 32-bit systems only, installing drivers that have not been validated with Microsoft is possible after agreeing to allow the installation at a prompt warning the user that the code is unsigned. For .NET (managed) code, there is an additional mechanism called Strong Name Signing that uses public/private keys and a SHA-1 hash, as opposed to certificates. However, Microsoft discourages reliance on Strong Name Signing as a replacement for Authenticode.[18]
The Code Signing Working Group of the CA/Browser Forum decided that starting June 1, 2023, all code signing certificates (not only the EV ones) would mandate private key storage in physical hardware, such as a hardware crypto module conforming to at least FIPS 140-2 Level 2 or Common Criteria EAL 4+.[19] The CAs subsequently issued announcements on compliance with the decision.[20][21][22][23][24][25][26]
Unsigned code in gaming and consumer devices
In the context of consumer devices such as game consoles, the term "unsigned code" often refers to an application that has not been signed with the cryptographic key normally required for software to be accepted and executed. Most console games have to be signed with a secret key held by the console maker or the game will not load on the console (both to enforce vendor lock-in and to combat software piracy). There are several methods of getting unsigned code to execute, including software exploits, the use of a modchip, a technique known as the swap trick, or running a softmod.
It may not initially seem obvious why simply copying a signed application onto another DVD does not allow it to boot. On the Xbox, the reason for this is that the Xbox executable file (XBE) contains a media-type flag, which specifies the type of media that the XBE is bootable from. On nearly all Xbox software, this is set such that the executable will only boot from factory-produced discs, so simply copying the executable to burnable media is enough to stop the execution of the software.
However, since the executable is signed, simply changing the value of the flag is not possible as this alters the signature of the executable, causing it to fail validation when checked.
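The reason flag-flipping fails can be shown with a hash comparison: changing even one byte of a signed executable changes its hash, so the signature no longer matches. The header layout, offsets, and flag value below are made up purely for demonstration and do not reflect the real XBE format.

```python
import hashlib

# Illustration of why editing the media-type flag defeats the signature:
# the hash covered by the signature changes with any single-byte edit.
executable = bytearray(b"XBEH" + bytes(60) + b"\x01" + b"...code...")
signed_hash = hashlib.sha256(executable).hexdigest()   # hash covered by signature

executable[64] = 0xFF   # attacker flips the (hypothetical) media-type flag byte
tampered_hash = hashlib.sha256(executable).hexdigest()

print(signed_hash == tampered_hash)   # False: signature validation now fails
```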
References
[edit]- ^ "Introduction to Code Signing (Windows)". learn.microsoft.com. August 15, 2017. Archived from the original on February 6, 2024. Retrieved March 13, 2024.
- ^ "WebWish: Our Wish is Your Command".
- ^ Schroeder, H and Doyle, M. “Interactive Web Applications with Tcl/Tk”. Academic Professional, Boston, 1998, p. 14, ISBN 0122215400.
- ^ Hendric, William (2015). "A Complete overview of Trusted Certificates - CABForum" (PDF). Archived (PDF) from the original on 2019-04-22. Retrieved 2015-02-26.
- ^ "Securing your Private Keys as Best Practice for Code Signing Certificates" (PDF).
- ^ Hendric, William (17 June 2011). "What is Code Signing?". Archived from the original on 20 June 2018. Retrieved 26 February 2015.
- ^ "Digital Signatures and Windows Installer - Win32 apps". learn.microsoft.com. January 7, 2021. Archived from the original on January 30, 2024. Retrieved March 13, 2024.
- ^ windows-driver-content (2022-05-18). "Windows Secure Boot Key Creation and Management Guidance". learn.microsoft.com. Archived from the original on 2023-10-30. Retrieved 2023-09-22.
- ^ "SecureApt - Debian Wiki". wiki.debian.org. Archived from the original on 2019-05-07. Retrieved 2019-05-07.
- ^ "Code signing" (PDF). 2014-02-26. Archived (PDF) from the original on 2014-02-26. Retrieved 2014-02-21.
- ^ "Guidelines For The Issuance And Management Of Extended Validation Code Signing Certificates" (PDF). CA/Browser Forum. Archived (PDF) from the original on 27 November 2019. Retrieved 4 December 2019.
- ^ "Driver Signing Policy". Microsoft. Archived from the original on 9 December 2019. Retrieved 9 December 2019.
- ^ "Microsoft SmartScreen & Extended Validation (EV) Code Signing Certificates". Microsoft. 14 August 2012. Archived from the original on 9 December 2019. Retrieved 9 December 2019.
- ^ a b Morton, Bruce. "Code Signing" (PDF). CASC. Archived (PDF) from the original on 26 February 2014. Retrieved 21 February 2014.
- ^ "Distributing your app to registered devices". Apple Developer Documentation. Archived from the original on 2024-03-13. Retrieved 2024-01-15.
- ^ "Fake antivirus solutions increasingly have stolen code-signing certificates". 9 January 2014. Archived from the original on 16 April 2014. Retrieved 14 April 2014.
- ^ "Why Private Keys Are the Achilles' Heel of Code Signing Security".
- ^ ".NET Security Blog". learn.microsoft.com. August 6, 2021. Archived from the original on January 19, 2024. Retrieved March 13, 2024.
- ^ "Baseline Requirements for the Issuance and Management of Publicly-Trusted Code Signing Certificates" (PDF). CA/Browser Forum. 2024. p. 10. Archived (PDF) from the original on March 13, 2024. Retrieved March 22, 2024.
(Section 1.2.2) [...] Effective June 1, 2023, for Code Signing Certificates, CAs SHALL ensure that the Subscriber's Private Key is generated, stored, and used in a suitable Hardware Crypto Module that meets or exceeds the requirements specified in section 6.2.7.4.1 using one of the methods in 6.2.7.4.2.
- ^ "Code Signing - Storage of Private Keys | SignPath". SignPath - Code Signing Simple and Secure. Archived from the original on 2024-03-08. Retrieved 2024-03-13.
- ^ "Code signing changes in 2021". knowledge.digicert.com. Archived from the original on 2023-12-10. Retrieved 2024-03-13.
- ^ "DigiCert timeline: Code signing's new private key storage requirement". knowledge.digicert.com. Archived from the original on 2023-12-08. Retrieved 2024-03-13.
- ^ "New private key storage requirement for Code Signing certificates". knowledge.digicert.com. Archived from the original on 2024-02-19. Retrieved 2024-03-13.
- ^ "[CSCWG-public] Voting results Ballot CSCWG-17: Subscriber Private Key Extension". 26 September 2022. Archived from the original on 2022-12-05. Retrieved 2024-03-13.
- ^ "Code Signing Key Storage Requirements Will Change on June 1, 2023". Archived from the original on October 2, 2023. Retrieved March 13, 2024.
- ^ "🥇 New private key storage requirement for all Code Signing certificates - June 2023 (Update)". SSLPOINT. September 23, 2022. Archived from the original on September 22, 2023. Retrieved March 13, 2024.
Code signing
Fundamentals
Definition and Purpose
Code signing is a security process in which software developers attach a digital signature to executables, binaries, or scripts, employing public-key cryptography to verify the software's origin and ensure it has not been altered or tampered with since signing.[4] This mechanism allows end-users and systems to confirm that the code originates from a legitimate source, thereby distinguishing trusted software from potentially malicious alterations during distribution.[5] The primary purposes of code signing are to guarantee software integrity by detecting any post-signing modifications, authenticate the developer's identity to establish provenance, and foster trust in software distribution channels by preventing malware from masquerading as legitimate applications.[6] By embedding this cryptographic assurance, code signing mitigates risks associated with unverified code execution, such as the introduction of vulnerabilities or unauthorized changes.[7]

Code signing emerged in the mid-1990s alongside the development of digital signature standards like PKCS#7, which was published by RSA Security in the early 1990s.[8] It gained widespread adoption in the late 1990s for enterprise software distribution, driven by the need to secure executable content in growing networked environments, with technologies like Microsoft's Authenticode introduced in 1996.[9] This process typically relies on digital certificates issued by trusted certificate authorities to bind the signature to a verified identity.[10] Among its key benefits, code signing significantly reduces the risk of executing malicious or compromised code by providing verifiable proof of unaltered software.[11] It also enables operating systems and platforms to implement execution policies, such as restricting or blocking the running of unsigned applications to enhance overall system security.[12]

Technical Mechanism
The technical mechanism of code signing relies on asymmetric cryptography to ensure the integrity and authenticity of software executables, scripts, or other code artifacts. Developers begin by generating a public-private key pair using established cryptographic libraries, where the private key remains secret and the public key is associated with a digital certificate.[13][4] The code is then processed through a hashing algorithm to produce a fixed-size digest representing its contents; for example, SHA-256 is commonly used to generate a 256-bit hash value that uniquely identifies the unaltered code.[4][14] This hash is encrypted with the developer's private key to create a digital signature, which serves as proof that the code has not been modified since signing.[15] The signature, along with the associated public key certificate, is embedded into the code's metadata structure, such as the Portable Executable (PE) format for Windows binaries, forming a self-contained signed package.[13]

Verification occurs at runtime or during installation: the receiving system decrypts the embedded signature with the signer's public key to recover the original hash, recomputes the hash of the current code, and compares the two. If the hashes match, the code's integrity is established.[4] The process then validates the public key's certificate chain, tracing back through intermediate certificates to a trusted root certificate authority (CA) to ensure the signer's identity is authentic and the certificate has not expired or been revoked.[4] This chain validation relies on standards like X.509, which defines the structure for public key certificates, including fields for the subject's name, public key, validity period, and issuer signature.[16]

Code signing employs standardized formats to encapsulate signatures and certificates, primarily the Cryptographic Message Syntax (CMS) as specified in RFC 5652, which evolved from PKCS#7 and supports signed data structures with multiple signers, digest algorithms, and optional attributes.[17] In CMS, the SignedData content type includes the encapsulated content info, certificates, and signer infos, where each signer info contains the signature value computed over the digest and signed attributes.[17] Hash algorithms have evolved to address security vulnerabilities; early implementations used MD5 (128-bit) and later SHA-1 (160-bit), but due to collision attacks, modern code signing mandates stronger algorithms like SHA-256 from the SHA-2 family or SHA-3 for enhanced resistance to cryptanalytic attacks.[14]

Practical implementation involves tools for generating and applying signatures. OpenSSL, an open-source cryptography library, provides command-line utilities like openssl cms for creating CMS/PKCS#7 signatures on arbitrary data, enabling custom code signing workflows.[4] These tools integrate with build systems such as Maven or CMake, allowing automated signing during compilation to embed signatures without manual intervention.[4]
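The digest sizes behind the hash-algorithm evolution described above can be verified directly. MD5 and SHA-1 appear in this sketch only to show why they were retired: shorter digests plus practical collision attacks make forged signatures feasible against them.

```python
import hashlib

# Digest sizes of the hash algorithms discussed above.
code = b"example executable bytes"
for name in ("md5", "sha1", "sha256", "sha3_256"):
    bits = hashlib.new(name, code).digest_size * 8
    print(f"{name}: {bits}-bit digest")
# md5: 128-bit digest
# sha1: 160-bit digest
# sha256: 256-bit digest
# sha3_256: 256-bit digest
```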
Security Features
Certificate Authorities and Trusted Identification
Certificate authorities (CAs) serve as trusted third-party entities that verify the identity of software developers or organizations before issuing X.509 digital certificates for code signing. Examples include DigiCert and Sectigo, which act as independent validators to ensure that only legitimate entities receive these certificates.[18][10] The primary role of a CA in this context is to perform due diligence on the applicant's identity, thereby establishing a foundation of trust that allows end-users and systems to authenticate the origin and integrity of signed code without direct knowledge of the signer.[19][20] The trust model underpinning code signing certificates relies on a hierarchical chain within the public key infrastructure (PKI). An end-user code signing certificate is digitally signed by an intermediate CA, which is itself signed by higher-level intermediates or ultimately by a root CA. Root CAs are pre-trusted, with their public keys embedded in operating system and application trust stores, such as those in Microsoft Windows or Apple ecosystems.[21][22] This chain enables verifiers to recursively validate each certificate against the issuer's public key, culminating in confirmation against the trusted root, thus preventing forgery or impersonation in the signing process.[23] The issuance process begins with the developer generating a key pair and submitting a certificate signing request (CSR) along with proof of identity to the CA. For organizations, this typically includes business registration documents, tax IDs, or addresses; for individuals, government-issued photo identification such as passports or driver's licenses is required. 
Per CA/B Forum Baseline Requirements, effective June 1, 2023, the private key must be generated, stored, and used exclusively within a cryptographic module certified to FIPS 140-2 Level 2 or Common Criteria EAL 4+ to protect against compromise.[24] The CA then conducts organization validation (OV), which involves confirming the entity's legal existence, operational address, and operational control through independent sources like public records or phone verification.[25][26][27] Upon approval, the CA issues the X.509 certificate, which embeds the developer's public key, distinguished name, serial number, and a validity period: historically up to 39 months, though the CA/Browser Forum has mandated a reduction to a maximum of 460 days for certificates issued after March 1, 2026.[28][29][30]

To address compromised or invalid certificates, CAs implement revocation mechanisms that allow real-time or periodic checks of certificate status. Certificate Revocation Lists (CRLs) are digitally signed files published by the CA at regular intervals, listing the serial numbers of revoked certificates along with revocation reasons and dates. Alternatively, the Online Certificate Status Protocol (OCSP) enables on-demand queries to the CA's server for the status of a specific certificate, providing responses such as "good," "revoked," or "unknown."[31][32] These tools ensure that systems can detect and reject signatures from invalidated certificates, maintaining the overall security of the code signing ecosystem.[33]

Extended Validation Certificates
Extended Validation (EV) certificates for code signing represent a high-assurance standard established by the CA/B Forum, requiring certificate authorities to perform thorough identity vetting of the applicant organization. This process verifies legal existence by confirming registration with the relevant incorporating or registration agency in the subject's jurisdiction, physical existence through validation of a business presence at a specified address, and operational existence to ensure active business operations as of the issuance date. The vetting, which involves document review, database checks, and potential phone verification, typically spans several days to a week or more, depending on the applicant's responsiveness and the complexity of the organization.[28][34][35] In contrast to Organization Validated (OV) or Domain Validated (DV) certificates, which rely on less stringent checks like basic domain control or organizational details, EV certificates mandate audited compliance with CA/B Forum guidelines, including ongoing CA process audits for reliability. This results in certificates featuring unique identifiers, such as the EV policy Object Identifier (OID) 2.23.140.1.1, enabling operating systems to recognize and afford elevated trust to EV-signed code. Key fields in these X.509 certificates include the subject organization name, serial number for uniqueness, and additional attributes like jurisdiction of incorporation and physical address components, all encoded to provide verifiable transparency without including domain names.[28][36][37] EV-signed executables in Microsoft Windows environments display the verified organization name as the publisher in User Account Control (UAC) prompts, replacing generic "unknown" warnings with identifiable details, while also receiving immediate positive reputation from Microsoft SmartScreen to minimize or eliminate download and execution alerts. 
This visual and behavioral trust enhancement helps users confidently identify legitimate software publishers.[38][39][40] Adoption of EV code signing certificates is common in enterprise software development, where they are often required for distribution through platforms like the Microsoft Store or for compliance in regulated industries to demonstrate rigorous identity assurance. Certificate authorities such as Entrust and GlobalSign provide these certificates, with annual pricing typically ranging from $300 to $500, reflecting the intensive validation and hardware security module requirements.[41][42][43]

Time-Stamping Protocols
Time-stamping protocols in code signing attach a trusted timestamp to a digital signature, proving that the signature was created at a specific point in time and enabling verification even after the signing certificate expires. This is achieved through a Time-Stamping Authority (TSA), a trusted third party that generates time-stamp tokens using a reliable time source, as defined in the Internet X.509 Public Key Infrastructure Time-Stamp Protocol (TSP) outlined in RFC 3161.[44] Per CA/B Forum Baseline Requirements, effective April 15, 2025, TSA private keys for Root and Subordinate CA certificates (with validity over 72 months) must be protected in a hardware cryptographic module certified to FIPS 140-2 Level 3 or Common Criteria EAL 4+, maintained in a high-security zone. Examples of TSAs include free services like FreeTSA.org, which provides RFC 3161-compliant timestamps without cost for basic use, and commercial providers such as Sectigo (formerly Comodo), which offers timestamping via http://timestamp.sectigo.com.

The process begins after the code is signed with a private key; the signer submits a hash of the signature (typically using SHA-256 in modern implementations) to the TSA via an HTTP or TCP request formatted according to RFC 3161.[44][45] The TSA verifies the request, appends the current UTC time from a trusted source (such as NTP-synchronized clocks), signs the hash with its own certificate, and returns a TimeStampToken containing the timestamp information, including a serial number for uniqueness and the hashing algorithm used.[44] This token is then embedded into the signature envelope, often as an unsigned attribute in CMS/PKCS #7 structures, ensuring the timestamp is cryptographically bound to the original signature.[44]

These protocols
provide several benefits for code signing security. By establishing the exact creation time of the signature, time-stamping prevents replay attacks, as verifiers can check that the timestamp aligns with the expected temporal context and detect any attempts to reuse outdated signatures.[46] It also supports long-term validity, allowing signatures to be verified after certificate expiration as long as the timestamp falls within the certificate's validity period and the TSA's certificate chain remains trustworthy, which is crucial for archival integrity of software artifacts.[44][47]

Integration of time-stamping is seamless in common tools; for instance, Microsoft's SignTool.exe automates the process using the /tr option to specify a TSA URL, such as http://timestamp.sectigo.com, and supports SHA-256 hashing for requests without additional configuration for basic services.[48] Many TSAs, including FreeTSA.org and Sectigo, default to SHA-256 for compatibility and security, offering no-cost options for non-commercial or low-volume use while ensuring compliance with RFC 3161 standards.[49][45]

Alternatives to Certificate Authorities
Self-signed certificates represent a basic alternative to traditional Certificate Authorities (CAs) in code signing, where developers generate their own public-private key pair and certificate using tools like OpenSSL or PowerShell's New-SelfSignedCertificate cmdlet.[50] These certificates are suitable for internal tools, development, or testing environments, as they allow signing without external validation, but they inherently lack third-party trust since no CA vouches for the issuer's identity.[51] Verification depends on manual distribution of the public key to recipients, who must explicitly trust it by importing it into their local certificate store, such as the Trusted People store on Windows.[50]

Web of trust models, inspired by Pretty Good Privacy (PGP), provide a decentralized approach where users mutually vouch for each other's public keys through signatures, forming chains of trust without a central authority.[52] In open-source projects, this is implemented via tools like GnuPG, with keys distributed through keyservers or repositories; for instance, the Linux kernel community uses PGP signatures on Git tags and tarballs, relying on the web of trust to verify maintainer identities after the 2011 kernel.org compromise.[53] Trust levels are calculated based on signature paths from known trusted keys, enabling collaborative verification in ecosystems like Linux distributions, where developers sign each other's keys to build collective assurance.[54]

Decentralized options extend this further by leveraging distributed technologies for identity and verification, bypassing CA hierarchies altogether.
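The signature-path trust calculation used in web-of-trust models amounts to reachability in a graph of key signatures: a key is trusted if some chain of signatures leads to it from a directly trusted key. The signature graph and key names below are made up for illustration, and real tools like GnuPG additionally weight paths by owner-trust levels and path length.

```python
from collections import deque

# Sketch of PGP-style trust-path calculation as breadth-first reachability.
signatures = {            # signer -> set of keys that signer has signed
    "alice": {"bob"},
    "bob": {"carol"},
    "carol": set(),
    "mallory": {"eve"},
}

def trusted_via_path(start_keys: set, target: str) -> bool:
    queue, seen = deque(start_keys), set(start_keys)
    while queue:
        key = queue.popleft()
        if key == target:
            return True
        for signed in signatures.get(key, ()):
            if signed not in seen:
                seen.add(signed)
                queue.append(signed)
    return False

print(trusted_via_path({"alice"}, "carol"))   # True: alice -> bob -> carol
print(trusted_via_path({"alice"}, "eve"))     # False: no signature path
```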
Projects like Sigstore enable keyless code signing through OpenID Connect (OIDC) providers for identity proof, issuing short-lived certificates via Fulcio and logging signatures in the tamper-evident Rekor transparency log for public auditability.[55][56] Blockchain-based methods, such as anchoring code hashes or signatures to Ethereum for timestamping, provide immutable proof of existence and integrity without centralized issuance, often combined with smart contracts for verification.[57] Hardware Security Modules (HSMs) support these by securely generating and storing keys in tamper-resistant hardware, facilitating self-signed or decentralized signing while ensuring private keys never leave the device.[58] These alternatives offer significant trade-offs compared to CA-based systems: they reduce costs and accelerate issuance by eliminating vetting processes, making them ideal for open-source or internal use, as seen in Git's support for GPG-signed commits where developers verify authenticity via personal keyrings. However, they increase risks of impersonation due to the absence of independent identity validation, requiring robust key distribution and user diligence to mitigate potential supply chain threats.[51][58]

Challenges and Limitations
Common Security Problems
One major vulnerability in code signing arises from the theft or compromise of private keys associated with code signing certificates. When attackers gain access to these keys, they can sign malicious code as if it originated from a trusted entity, bypassing verification mechanisms and enabling widespread distribution of malware.[59] A prominent example is the 2011 breach of DigiNotar, a Dutch certificate authority, where intruders compromised the CA's systems and issued over 500 fraudulent certificates, including code signing ones, affecting millions of users primarily through man-in-the-middle attacks on services like Gmail in Iran. This incident led to the revocation of DigiNotar's root certificates across major trust stores and the company's bankruptcy.[60][61] Similarly, the 2020 SolarWinds supply chain attack involved Russian state-sponsored actors injecting malware into legitimate software updates, which were then signed with SolarWinds' legitimate code signing certificate after the attackers compromised the build process, affecting thousands of organizations including U.S. government agencies.[62][63] In 2023, attackers stole encrypted code signing certificates from GitHub, including those for GitHub Desktop and Atom, potentially allowing malicious software to be signed as legitimate GitHub releases; GitHub revoked the certificates and advised users to update affected software.[64] Algorithmic weaknesses in hashing functions used for code signing signatures further exacerbate risks. Deprecated algorithms like SHA-1 are susceptible to collision attacks, where attackers generate two different files with identical hashes, allowing substitution of malicious code without invalidating the signature.
The 2017 SHAttered attack demonstrated the first practical collision for SHA-1, producing two distinct PDFs with the same hash, highlighting its vulnerability for digital signatures including code signing; despite transitions to stronger hashes like SHA-256, legacy SHA-1-signed code remains in use, delaying full mitigation.[65][66] Timestamping failures can undermine the long-term validity of code signatures by failing to provide reliable proof of signing time relative to certificate expiration or revocation. Outages or connectivity issues with Time-Stamping Authorities (TSAs) prevent acquisition of valid timestamps during signing, rendering signatures time-bound to the certificate's validity period and potentially invalidating them prematurely. For instance, the 2019 expiration of Comodo's TSA certificate (timestamp.comodoca.com) caused widespread errors and outages in timestamped code validation across various environments. Additionally, use of untrusted or compromised TSAs allows attackers to forge timestamps; in one described scenario, an adversary intercepts timestamp requests and supplies a response from a non-trustworthy TSA, leading verifiers to accept invalid signatures.[67][68][69][70] Other systemic issues include signature stripping in repackaged malware, where attackers decompile legitimate signed applications, remove the original digital signature, inject malicious payloads, and redistribute the altered unsigned or re-signed binaries to evade detection. Over-reliance on centralized trust stores amplifies risks from root CA compromises; the 2015 Symantec incidents involved multiple misissuances of rogue certificates, including an unauthorized Extended Validation certificate for google.com issued without proper validation, prompting employee terminations and widespread distrust of Symantec roots by browsers like Chrome. 
These events exposed how flaws in CA operations can propagate untrusted certificates into trust stores, enabling fake code signing.[71][72][73][74][75]

Mitigation Strategies
Mitigation strategies for code signing vulnerabilities focus on proactive measures to protect private keys, ensure cryptographic robustness, integrate verification into development workflows, and enable rapid detection and response to compromises. These practices help developers and organizations minimize risks such as unauthorized code distribution and supply chain attacks by emphasizing secure handling, standards compliance, and ongoing monitoring.[76] Key management is a cornerstone of code signing security, beginning with the use of Hardware Security Modules (HSMs) for private key storage to prevent unauthorized access and extraction. HSMs provide tamper-resistant environments that isolate keys from software-based threats, ensuring that signing operations occur within protected hardware.[76] Regular key rotation, typically every one to two years or after potential exposure, limits the impact of a compromised key by reducing its lifespan and validity period.[77] Additionally, enabling multi-factor authentication (MFA) for Certificate Authority (CA) accounts and key access controls adds layers of identity verification, thwarting credential-based attacks.[78] Updating cryptographic algorithms addresses evolving threats to hashing integrity; transitioning to SHA-256 or stronger variants became imperative after the 2017 SHAttered collision attack on SHA-1 demonstrated practical forgery risks for code signing.
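A periodic audit of certificate digest algorithms, as recommended here, can be sketched with OpenSSL; the certificate below is a throwaway self-signed one generated purely for illustration:

```shell
# Create a throwaway SHA-256-signed certificate to audit (illustrative).
openssl req -x509 -newkey rsa:2048 -sha256 -days 30 -nodes \
  -subj "/CN=Audit Demo" -keyout demo.key -out demo.crt

# Inspect the signature algorithm; a legacy certificate would report
# "sha1WithRSAEncryption" here and should be reissued with SHA-2.
openssl x509 -in demo.crt -noout -text | grep "Signature Algorithm"
```

The same one-line inspection applies to any PEM certificate already deployed, making it easy to script across a certificate inventory.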
The National Institute of Standards and Technology (NIST) deprecated SHA-1 for digital signatures in 2013 and plans to retire it fully by December 31, 2030, urging immediate adoption of the SHA-2 and SHA-3 families to maintain collision resistance.[79] Microsoft accelerated this by deprecating SHA-1 code signing support in 2017, requiring SHA-256 for new certificates to align with browser and OS enforcement.[80] Organizations should monitor NIST guidelines and conduct periodic audits to ensure compliance with these post-2017 standards.[81] Enhancing verification involves embedding code signing policies directly into Continuous Integration/Continuous Deployment (CI/CD) pipelines to automate integrity checks during builds and deployments. Tools like Cosign, developed by the Sigstore project, facilitate container image signing and verification without long-term key management, using short-lived keys and transparency logs for reproducible attestations.[82] This integration ensures that only signed artifacts proceed to production, reducing the window for tampering in automated workflows.[83] For incident response, continuous monitoring of Online Certificate Status Protocol (OCSP) responders and Certificate Revocation Lists (CRLs) is essential to detect and enforce revocations promptly, as OCSP provides real-time status queries while CRLs offer batch updates for offline validation.[31] Supply chain risk audits, guided by the Supply-chain Levels for Software Artifacts (SLSA) framework introduced in 2021, evaluate build provenance and integrity controls to identify weaknesses before deployment.[84] SLSA's tiered levels promote verifiable builds and signed artifacts, enabling organizations to respond to breaches by revoking affected certificates and tracing impacted distributions.[84]

Implementations
Apple Ecosystems
In Apple's ecosystems, code signing is a mandatory security mechanism for distributing and executing software on macOS and iOS platforms, ensuring that applications originate from verified developers and remain untampered. It integrates deeply with the App Store distribution model, where all submitted apps must be signed using Apple-issued certificates to pass review and installation checks. For macOS, Gatekeeper enforces signing by verifying Developer ID certificates on downloaded apps, preventing execution of unsigned or tampered code outside the App Store. Similarly, iOS requires signed apps bundled with provisioning profiles to install on devices, tying code to specific developer identities and device capabilities.[85][2][86] Certificates for code signing are issued by the Apple Worldwide Developer Relations (WWDR) Certification Authority, an intermediate authority under Apple's public key infrastructure that validates developer identities through the Apple Developer Program. Developers generate certificate signing requests via Keychain Access or Xcode, then obtain identities such as development certificates for testing, distribution certificates for App Store releases, or ad-hoc certificates for limited device installations without App Store involvement. These certificates embed the developer's Team ID in the subject organizational unit field, enabling the system to enforce trust chains during validation. For non-App Store macOS distribution, Developer ID Application or Installer certificates allow direct downloads while complying with Gatekeeper, requiring membership in the Apple Developer Program.[87][88][89] Xcode provides built-in code signing during the build process, automatically embedding signatures; the codesign command-line tool is available for manual operations, applying cryptographic hashes and certificates to binaries, bundles, and frameworks.
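A manual signing and verification pass with codesign might look like the following sketch; the identity name and the MyApp.app path are placeholders, and the guard lets the snippet exit cleanly on systems without the tool or the bundle:

```shell
if command -v codesign >/dev/null 2>&1 && [ -d MyApp.app ]; then
  # Sign the bundle with a Developer ID identity (placeholder name),
  # enabling the hardened runtime and requesting a secure timestamp.
  codesign --sign "Developer ID Application: Example Corp" \
           --options runtime --timestamp MyApp.app
  # Verify the signature strictly, then display its details.
  codesign --verify --deep --strict --verbose=2 MyApp.app
  codesign --display --verbose=2 MyApp.app
else
  echo "codesign or MyApp.app not available; skipping (macOS-only example)"
fi
```

In practice Xcode performs the signing step automatically during builds; the manual commands are mainly useful for re-signing, scripting release pipelines, or diagnosing signature failures.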
Entitlements, defined in a .entitlements property list file, grant apps specific permissions like access to the camera or sandboxing, and Xcode merges these during signing to match provisioning profiles. For debugging, Xcode generates .dSYM files containing symbol information tied to the signed build, enabling symbolication of crash reports without exposing source code. Provisioning profiles, which are signed property lists combining certificates, app IDs, and device UDIDs, are essential for iOS and extend to macOS for capabilities like push notifications; they support development (for registered devices), ad-hoc (for limited distribution), and distribution types.[90][91][92]
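A minimal .entitlements file, written here from a shell heredoc, might look like the following; the two keys shown (app sandbox and camera access) are common examples rather than a complete set, and Xcode normally manages this file itself:

```shell
# Write a minimal entitlements property list (illustrative keys only).
cat > Example.entitlements <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>com.apple.security.app-sandbox</key>
    <true/>
    <key>com.apple.security.device.camera</key>
    <true/>
</dict>
</plist>
EOF
```

During signing, these requested permissions are baked into the signature, so they cannot be altered after the fact without invalidating it.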
Enforcement occurs at multiple levels: on macOS, System Integrity Protection (SIP) restricts modifications to system files and blocks loading of unsigned kernel extensions (kexts), requiring them to be signed with a Developer ID Kexts certificate and approved by users via System Preferences. Gatekeeper scans downloads for valid signatures and, since macOS Catalina (10.15) in 2019, mandates notarization, a cloud-based Apple review process that staples a ticket to signed apps and confirms the absence of malware before Gatekeeper allows execution. On iOS, the system rejects at installation any app that is unsigned or carries a mismatched provisioning profile, ensuring only authorized code runs on devices. This mandatory code signing requirement prevents persistent malware by ensuring all code is cryptographically signed with Apple-issued certificates; unsigned or malicious code cannot run persistently outside the app sandbox, and there is no easy path to achieving kernel-level or boot-level persistence without a jailbreak or a zero-day exploit chain.[93][94][95][96]
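Whether Gatekeeper would accept a given app, and whether a notarization ticket is stapled to it, can be checked from the command line; the MyApp.app path is a placeholder and the guard keeps the sketch portable:

```shell
if command -v spctl >/dev/null 2>&1 && [ -d MyApp.app ]; then
  # Ask Gatekeeper whether it would allow this app to execute.
  spctl --assess --type execute --verbose MyApp.app
  # Validate the notarization ticket stapled to the bundle.
  xcrun stapler validate MyApp.app
else
  echo "spctl not available; skipping (macOS-only example)"
fi
```

These checks are useful before distributing a build outside the App Store, since a failed assessment is exactly what end users would see as a Gatekeeper warning.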
