Code signing
from Wikipedia

Code signing is the process of digitally signing executables and scripts to confirm the software author and guarantee that the code has not been altered or corrupted since it was signed. The process employs a cryptographic hash to validate authenticity and integrity.[1] Code signing was invented in 1995 by Michael Doyle, as part of the Eolas WebWish browser plug-in, which enabled the use of public-key cryptography to sign downloadable Web app program code using a secret key, so the plug-in code interpreter could then use the corresponding public key to authenticate the code before allowing it access to the code interpreter's APIs.[2][3]

Code signing can provide several valuable features. The most common use of code signing is to provide security when deploying; in some programming languages, it can also be used to help prevent namespace conflicts. Almost every code signing implementation will provide some sort of digital signature mechanism to verify the identity of the author or build system, and a checksum to verify that the object has not been modified. It can also be used to provide versioning information about an object or to store other metadata about an object.[4]
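The integrity-checking half of code signing reduces to comparing cryptographic digests: the publisher records the digest of the artifact at signing time, and any later modification changes the digest. A minimal Python sketch (the file contents are hypothetical):

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Hex SHA-256 digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

# The publisher records the artifact's digest at signing time...
original = b"print('hello world')\n"
published_digest = sha256_digest(original)

# ...and the verifier recomputes it over the received copy.
received = b"print('hello world')\n"
assert sha256_digest(received) == published_digest   # unmodified: accepted

tampered = b"print('hello world')  # injected\n"
assert sha256_digest(tampered) != published_digest   # modified: detected
```

Real implementations additionally sign the digest with a private key, so that the published digest itself cannot be swapped out along with the code.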

The efficacy of code signing as an authentication mechanism for software depends on the security of underpinning signing keys. As with other public key infrastructure (PKI) technologies, the integrity of the system relies on publishers securing their private keys against unauthorized access. Keys stored in software on general-purpose computers are susceptible to compromise. Therefore, it is more secure, and best practice, to store keys in secure, tamper-proof, cryptographic hardware devices known as hardware security modules or HSMs.[5]

Providing security


Many code signing implementations will provide a way to sign the code using a system involving a pair of keys, one public and one private, similar to the process employed by TLS or SSH. For example, in the case of .NET, the developer uses a private key to sign their libraries or executables each time they build. This key will be unique to a developer or group or sometimes per application or object. The developer can either generate this key on their own or obtain one from a trusted certificate authority (CA).[6]

Code signing is particularly valuable in distributed environments, where the source of a given piece of code may not be immediately evident; for example, Java applets, ActiveX controls, and other active web and browser scripting code. Another important usage is to safely provide updates and patches to existing software.[7] Windows, Mac OS X, and most Linux distributions provide updates using code signing to ensure that it is not possible for others to maliciously distribute code via the patch system. It allows the receiving operating system to verify that the update is legitimate, even if the update was delivered by third parties or physical media (disks).[8]

Code signing is used on Windows and Mac OS X to authenticate software on first run, ensuring that the software has not been maliciously tampered with by a third-party distributor or download site. This form of code signing is not used on Linux because of that platform's decentralized nature, the package manager being the predominant mode of distribution for all forms of software (not just updates and patches), as well as the open-source model allowing direct inspection of the source code if desired. Debian-based Linux distributions (among others) validate downloaded packages using public key cryptography.[9]

Trusted identification using a certificate authority (CA)


The public key used to authenticate the code signature should be traceable back to a trusted root authority CA, preferably using a secure public key infrastructure (PKI). This does not ensure that the code itself can be trusted, only that it comes from the stated source (or more explicitly, from a particular private key).[10] A CA provides a root trust level and is able to assign trust to others by proxy. If a user trusts a CA, then the user can presumably trust the legitimacy of code that is signed with a key generated by that CA or one of its proxies. Many operating systems and frameworks contain built-in trust for one or more certification authorities. It is also commonplace for large organizations to implement a private CA, internal to the organization, which provides the same features as public CAs, but it is only trusted within the organization.
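Tracing a signature back to a trusted root amounts to walking the chain of issuers until one of them appears in the platform's trust store. The sketch below uses the subject and intermediate names from the sample certificate later in this article; the root name and the lookup-table structure are simplified assumptions, not the real X.509 chain-building algorithm:

```python
# Hypothetical simplified view: certificate subject -> issuer subject.
issuer_of = {
    "SSL Corp": "SSL.com EV Code Signing Intermediate CA RSA R3",
    "SSL.com EV Code Signing Intermediate CA RSA R3": "SSL.com Root CA RSA",
}
# Roots pre-installed in the platform trust store (root name is invented).
trusted_roots = {"SSL.com Root CA RSA"}

def chains_to_trusted_root(subject: str) -> bool:
    """Follow issuer links until a trust-store root or a dead end."""
    seen = set()
    while subject not in trusted_roots:
        if subject in seen or subject not in issuer_of:
            return False          # loop or unknown issuer: untrusted
        seen.add(subject)
        subject = issuer_of[subject]
    return True

assert chains_to_trusted_root("SSL Corp")
assert not chains_to_trusted_root("Self-Signed Publisher")
```

Real chain validation also checks each certificate's signature, validity period, key-usage extensions, and revocation status at every hop.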

Extended validation (EV) code signing


Extended validation (EV) code signing certificates are subject to additional validation and technical requirements. These guidelines are based on the CA/B Forum's Baseline Requirements and Extended Validation Guidelines. In addition to validation requirements specific to EV, the EV code signing guidelines stipulate that "the Subscriber's private key is generated, stored and used in a crypto module that meets or exceeds the requirements of FIPS 140-2 level 2."[11]

Certain applications, such as signing Windows 10 kernel-mode drivers, require an EV code signing certificate.[12] Additionally, Microsoft's IEBlog states that Windows programs "signed by an EV code signing certificate can immediately establish reputation with SmartScreen reputation services even if no prior reputation exists for that file or publisher."[13]

Sample EV code signing certificate


This is an example of a decoded EV code signing certificate used by SSL.com to sign software. SSL.com EV Code Signing Intermediate CA RSA R3 is shown as the Issuer's commonName, identifying this as an EV code signing certificate. The certificate's Subject field describes SSL Corp as an organization. Code Signing is shown as the sole X509v3 Extended Key Usage.

Certificate:
    Data:
        Version: 3 (0x2)
        Serial Number:
            59:4e:2d:88:5a:2c:b0:1a:5e:d6:4c:7b:df:35:59:7d
    Signature Algorithm: sha256WithRSAEncryption
        Issuer:
            commonName                = SSL.com EV Code Signing Intermediate CA RSA R3
            organizationName          = SSL Corp
            localityName              = Houston
            stateOrProvinceName       = Texas
            countryName               = US
        Validity
            Not Before: Aug 30 20:29:13 2019 GMT
            Not After : Nov 12 20:29:13 2022 GMT
        Subject:
            1.3.6.1.4.1.311.60.2.1.3 = US
            1.3.6.1.4.1.311.60.2.1.2 = Nevada
            streetAddress             = 3100 Richmond Ave Ste 503
            businessCategory          = Private Organization
            postalCode                = 77098
            commonName                = SSL Corp
            serialNumber              = NV20081614243
            organizationName          = SSL Corp
            localityName              = Houston
            stateOrProvinceName       = Texas
            countryName               = US
        Subject Public Key Info:
            Public Key Algorithm: rsaEncryption
                Public-Key: (2048 bit)
                Modulus:
                    00:c3:e9:ae:be:d7:a2:6f:2f:24 ...
                Exponent: 65537 (0x10001)
        X509v3 extensions:
            X509v3 Authority Key Identifier: 
                keyid:36:BD:49:FF:31:2C:EB:AF:6A:40:FE:99:C0:16:ED:BA:FC:48:DD:5F
                
            Authority Information Access: 
                CA Issuers - URI:http://www.ssl.com/repository/SSLcom-SubCA-EV-CodeSigning-RSA-4096-R3.crt
                OCSP - URI:http://ocsps.ssl.com
                
            X509v3 Certificate Policies: 
                Policy: 2.23.140.1.3
                Policy: 1.2.616.1.113527.2.5.1.7
                Policy: 1.3.6.1.4.1.38064.1.3.3.2
                  CPS: https://www.ssl.com/repository
                  
            X509v3 Extended Key Usage: 
                Code Signing
            X509v3 CRL Distribution Points: 
            
                Full Name:
                  URI:http://crls.ssl.com/SSLcom-SubCA-EV-CodeSigning-RSA-4096-R3.crl
                  
            X509v3 Subject Key Identifier: 
                EC:6A:64:06:26:A7:7A:69:E8:CC:06:D5:6F:FA:E1:C2:9A:29:79:DE
            X509v3 Key Usage: critical
                Digital Signature
    Signature Algorithm: sha256WithRSAEncryption
         17:d7:a1:26:58:31:14:2b:9f:3b ...

Alternative to CAs


The other model is the trust-on-first-use model, in which developers can choose to provide their own self-generated key. In this scenario, the user would normally have to obtain the public key in some fashion directly from the developer to verify, the first time, that the object is from them. Many code signing systems will store the public key inside the signature. Some software frameworks and operating systems that check the code's signature before executing it allow the user to choose to trust that developer from that point on after the first run. An application developer can provide a similar system by including the public keys with the installer. The key can then be used to ensure that any subsequent objects that need to run, such as upgrades, plugins, or another application, are all verified as coming from that same developer.
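Trust on first use can be sketched as pinning the key's fingerprint on first contact and requiring every later artifact to match the pin. The store layout and names below are hypothetical:

```python
import hashlib

# Local pin store: developer name -> key fingerprint seen on first run.
pins: dict = {}

def check_tofu(developer: str, public_key: bytes) -> bool:
    """Trust-on-first-use: pin the key's fingerprint the first time,
    then require every later artifact to present the same key."""
    fp = hashlib.sha256(public_key).hexdigest()
    if developer not in pins:
        pins[developer] = fp      # first contact: remember the key
        return True
    return pins[developer] == fp  # later contact: must match the pin

assert check_tofu("example-dev", b"key-material-A")      # first run: pinned
assert check_tofu("example-dev", b"key-material-A")      # same key: accepted
assert not check_tofu("example-dev", b"key-material-B")  # changed key: rejected
```

The model's weakness is visible in the sketch: the very first contact is accepted unconditionally, which is why out-of-band verification of the initial key matters.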

Time-stamping


Time-stamping was designed to circumvent the trust warning that will appear in the case of an expired certificate. In effect, time-stamping extends the code trust beyond the validity period of a certificate.[14]

In the event that a certificate has to be revoked due to a compromise, a specific date and time of the compromising event will become part of the revocation record. In this case, time-stamping helps establish whether the code was signed before or after the certificate was compromised.[14]
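The interplay between a trusted timestamp, the certificate's validity window, and a recorded compromise time can be sketched as a simple comparison. The dates below borrow the sample certificate's validity window from later in this article; the logic is a simplification of real revocation checking:

```python
from datetime import datetime, timezone
from typing import Optional

def signature_still_trusted(signed_at: datetime,
                            not_before: datetime,
                            not_after: datetime,
                            compromised_at: Optional[datetime]) -> bool:
    """A timestamped signature remains trusted if it was produced while
    the certificate was valid and, when a compromise is on record,
    strictly before the compromise time in the revocation entry."""
    if not (not_before <= signed_at <= not_after):
        return False
    if compromised_at is not None and signed_at >= compromised_at:
        return False
    return True

utc = timezone.utc
nb = datetime(2019, 8, 30, tzinfo=utc)   # sample cert: Not Before
na = datetime(2022, 11, 12, tzinfo=utc)  # sample cert: Not After

# Signed during validity, verified after expiry: still trusted.
assert signature_still_trusted(datetime(2020, 1, 1, tzinfo=utc), nb, na, None)

# Signed after the recorded compromise date: rejected.
assert not signature_still_trusted(
    datetime(2021, 6, 1, tzinfo=utc), nb, na,
    compromised_at=datetime(2021, 5, 1, tzinfo=utc))
```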

Code signing in Xcode


Developers need to sign their iOS and tvOS apps before running them on any real device and before uploading them to the App Store. This is needed to prove that the developer owns a valid Apple Developer ID. An application needs a valid profile or certificate so that it can run on the devices.[15]

Problems


Like any security measure, code signing can be defeated. Users can be tricked into running unsigned code, or even into running code that refuses to validate, and the system only remains secure as long as the private key remains private.[16][17]

It is also important to note that code signing does not protect the end user from any malicious activity or unintentional software bugs by the software author — it merely ensures that the software has not been modified by anyone other than the author. Sometimes, sandbox systems do not accept certificates, because of a false time-stamp or because of an excess usage of RAM.

Implementations


Microsoft implements a form of code signing (based on Authenticode) provided for Microsoft tested drivers. Since drivers run in the kernel, they can destabilize the system or open the system to security holes. For this reason, Microsoft tests drivers submitted to its WHQL program. After the driver has passed, Microsoft signs that version of the driver as being safe. On 32-bit systems only, installing drivers that are not validated with Microsoft is possible after agreeing to allow the installation at a prompt warning the user that the code is unsigned. For .NET (managed) code, there is an additional mechanism called Strong Name Signing that uses Public/Private keys and SHA-1 hash as opposed to certificates. However, Microsoft discourages reliance on Strong Name Signing as a replacement for Authenticode.[18]

The Code Signing Working Group of the CA/Browser Forum decided that starting June 1, 2023, all code signing certificates (not only the EV ones) should mandate private key storage on physical media, such as in a hardware crypto module conforming to at least FIPS 140-2 Level 2 or Common Criteria EAL 4+.[19] The CAs subsequently issued announcements on compliance with the decision.[20][21][22][23][24][25][26]

Unsigned code in gaming and consumer devices


In the context of consumer devices such as game consoles, the term "unsigned code" is often used to refer to an application which has not been signed with the cryptographic key normally required for software to be accepted and executed. Most console games have to be signed with a secret key held by the console maker or the game will not load on the console (both to enforce vendor lock-in and to combat software piracy). There are several methods to get unsigned code to execute, including software exploits, the use of a modchip, a technique known as the swap trick, or running a softmod.

It may not initially seem obvious why simply copying a signed application onto another DVD does not allow it to boot. On the Xbox, the reason for this is that the Xbox executable file (XBE) contains a media-type flag, which specifies the type of media that the XBE is bootable from. On nearly all Xbox software, this is set such that the executable will only boot from factory-produced discs, so simply copying the executable to burnable media is enough to stop the execution of the software.

However, since the executable is signed, simply changing the value of the flag is not possible as this alters the signature of the executable, causing it to fail validation when checked.
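Why flipping the media-type flag breaks the signature can be illustrated with any hash-protected blob: a one-byte change anywhere in the signed region changes the digest. The byte layout below is invented for illustration, not the real XBE format:

```python
import hashlib

# Invented layout: a one-byte media-type flag embedded in the image.
executable = bytearray(b"\x00XBE...media_flag=\x01...code...")
signed_digest = hashlib.sha256(bytes(executable)).hexdigest()  # fixed at signing

# Patch the flag so the image would "boot from burnable media"...
patched = bytearray(executable)
patched[patched.index(0x01)] = 0x02

# ...and the digest no longer matches what was signed.
assert hashlib.sha256(bytes(patched)).hexdigest() != signed_digest
```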

from Grokipedia
Code signing is a cryptographic process used to digitally sign software executables, scripts, and other code artifacts to verify their authenticity and integrity, ensuring that the code originates from a trusted author and has not been altered or tampered with since signing. The technique employs digital signatures generated with a private key, paired with a digital certificate issued by a trusted certification authority (CA), allowing verifiers such as operating systems or users to confirm the signer's identity and detect any modifications. In practice, code signing plays a critical role in securing the software supply chain by mitigating risks from malicious alterations, unauthorized distribution, and supply chain attacks, as highlighted in security frameworks for firmware, operating systems, mobile applications, and container images. The process typically involves three key roles: the developer, who creates and submits the code; the signer, who applies the digital signature using protected private keys; and the verifier, who validates the signature against the signer's public key and certificate chain. Platforms such as macOS require code signing for app distribution to enforce platform protections, while Microsoft's Authenticode enables similar verification for Windows drivers and executables, often embedding signatures in catalog files to support integrity checks without altering the core binaries. Beyond basic verification, code signing supports advanced features such as timestamping from a Time Stamp Authority (TSA) to prove the exact signing time, enhancing long-term validity even after certificate expiration, and integration with hardware security modules (HSMs) for key protection against theft or compromise. Security considerations include selecting robust cryptographic algorithms, managing trust anchors through root CAs, and conducting regular audits to prevent issues such as rogue certificates or weak key generation, which have been implicated in major incidents.
Overall, widespread adoption of code signing strengthens ecosystem trust, with requirements enforced by major vendors to block unsigned or invalidly signed code from execution.

Fundamentals

Definition and Purpose

Code signing is a security process in which software developers attach a digital signature to executables, binaries, or scripts, employing public-key cryptography to verify the software's origin and ensure it has not been altered or tampered with since signing. This mechanism allows end-users and systems to confirm that the code originates from a legitimate source, thereby distinguishing trusted software from potentially malicious alterations during distribution. The primary purposes of code signing are to guarantee software integrity by detecting any post-signing modifications, to authenticate the developer's identity and establish accountability, and to foster trust in distribution channels by preventing malware from masquerading as legitimate applications. By embedding this cryptographic assurance, code signing mitigates risks associated with unverified code execution, such as the introduction of vulnerabilities or unauthorized changes. Code signing emerged in the mid-1990s alongside the development of digital signature standards such as the Digital Signature Standard (DSS), published by NIST in the early 1990s. It gained widespread adoption in the late 1990s for enterprise software distribution, driven by the need to secure executable content in growing networked environments, with technologies such as Microsoft's Authenticode introduced in 1996. The process typically relies on digital certificates issued by trusted certificate authorities to bind the signature to a verified identity. Among its key benefits, code signing significantly reduces the risk of executing malicious or compromised code by providing verifiable proof of unaltered software. It also enables operating systems and platforms to implement execution policies, such as restricting or blocking the running of unsigned applications to enhance overall system security.

Technical Mechanism

The technical mechanism of code signing relies on asymmetric cryptography to ensure the integrity and authenticity of software executables, scripts, or other artifacts. Developers begin by generating a public-private key pair using established cryptographic libraries, where the private key remains secret and the public key is associated with a digital certificate. The code artifact is then processed through a hashing algorithm to produce a fixed-size digest representing its contents; for example, SHA-256 is commonly used to generate a 256-bit hash value that uniquely identifies the unaltered artifact. This hash is encrypted with the developer's private key to create a digital signature, which serves as proof that the code has not been modified since signing. The signature, along with the associated certificate, is embedded into the artifact's metadata structure, such as the Portable Executable (PE) format for Windows binaries, forming a self-contained signed package.

Verification occurs at runtime or during installation, when the receiving system recomputes the hash of the current code and compares it to the hash protected by the embedded signature. If the hashes match, the system decrypts the signature using the public key to retrieve the original hash and confirm its validity, thereby establishing the code's integrity. The process then validates the public key's certificate chain, tracing back through intermediate certificates to a trusted certificate authority (CA) to ensure the signer's identity is authentic and the certificate has not expired or been revoked. This chain validation relies on standards such as X.509, which defines the structure for public key certificates, including fields for the subject's name, public key, validity period, and issuer identity.

Code signing employs standardized formats to encapsulate signatures and certificates, primarily the Cryptographic Message Syntax (CMS) as specified in RFC 5652, which evolved from PKCS #7 and supports signed data structures with multiple signers, digest algorithms, and optional attributes. In CMS, the SignedData content type includes the encapsulated content info, certificates, and signer infos, where each signer info contains the signature value computed over the digest and signed attributes. Hash algorithms have evolved to address security vulnerabilities; early implementations used MD5 (128-bit) and later SHA-1 (160-bit), but due to collision attacks, modern code signing mandates stronger algorithms such as SHA-256 from the SHA-2 family, or SHA-3, for enhanced resistance to cryptanalytic attacks.

Practical implementation involves tools for generating and applying signatures. OpenSSL, an open-source library, provides command-line utilities such as openssl cms for creating CMS/PKCS #7 signatures on arbitrary data, enabling custom code signing workflows. These tools integrate with build systems such as Maven or Gradle, allowing automated signing during compilation to embed signatures without manual intervention.
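The hash-then-sign mechanism can be illustrated with textbook RSA. The tiny primes below are for illustration only and offer no security; real implementations use 2048-bit or larger keys through vetted cryptographic libraries rather than raw modular arithmetic:

```python
import hashlib

# Toy textbook-RSA parameters -- tiny primes, illustration only.
p, q = 61, 53
n = p * q                            # public modulus
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent

def sign(code: bytes) -> int:
    """Hash the code, then apply the private key to the (reduced) digest."""
    digest = int.from_bytes(hashlib.sha256(code).digest(), "big") % n
    return pow(digest, d, n)

def verify(code: bytes, signature: int) -> bool:
    """Recompute the digest and compare with the signature opened
    under the public key (e, n)."""
    digest = int.from_bytes(hashlib.sha256(code).digest(), "big") % n
    return pow(signature, e, n) == digest

artifact = b"MZ...executable bytes..."
sig = sign(artifact)
assert verify(artifact, sig)                  # intact signature verifies
assert not verify(artifact, (sig + 1) % n)    # corrupted signature fails
```

Note that a modified artifact changes the digest, so its verification fails for the same reason a corrupted signature does; production schemes additionally use padding (e.g. PKCS #1) and full-width digests, which the toy reduction modulo n deliberately omits.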

Security Features

Certificate Authorities and Trusted Identification

Certificate authorities (CAs) serve as trusted third-party entities that verify the identity of software developers or organizations before issuing digital certificates for code signing. Examples include DigiCert and Sectigo, which act as independent validators to ensure that only legitimate entities receive these certificates. The primary role of a CA in this context is to perform due diligence on the applicant's identity, thereby establishing a foundation of trust that allows end-users and systems to authenticate the origin and integrity of signed code without direct knowledge of the signer.

The trust model underpinning code signing certificates relies on a hierarchical chain within the public key infrastructure (PKI). An end-user code signing certificate is digitally signed by an intermediate CA, which is itself signed by higher-level intermediates or ultimately by a root CA. Root CAs are pre-trusted, with their public keys embedded in operating system and application trust stores, such as those in the Windows or Apple ecosystems. This chain enables verifiers to recursively validate each certificate against the issuer's public key, culminating in confirmation against the trusted root, thus preventing forgery or impersonation in the signing process.

The issuance process begins with the developer generating a key pair and submitting a certificate signing request (CSR) along with proof of identity to the CA. For organizations, this typically includes business registration documents, tax IDs, or registered addresses; for individuals, government-issued photo identification such as a passport or driver's license is required. Per the CA/B Forum Baseline Requirements, effective June 1, 2023, the private key must be generated, stored, and used exclusively within a cryptographic module certified to FIPS 140-2 Level 2 or Common Criteria EAL 4+ to protect against compromise. The CA then conducts organization validation (OV), which involves confirming the entity's legal existence, operational address, and operational control through independent sources such as government registries or phone verification. Upon approval, the CA issues the certificate, which embeds the developer's public key, distinguished name, serial number, and a validity period; historically this was up to 39 months, though the CA/Browser Forum has mandated a reduction to a maximum of 460 days for certificates issued after March 1, 2026.

To address compromised or invalid certificates, CAs implement revocation mechanisms that allow real-time or periodic checks of certificate status. Certificate revocation lists (CRLs) are digitally signed files published by the CA at regular intervals, listing the serial numbers of revoked certificates along with revocation reasons and dates. Alternatively, the Online Certificate Status Protocol (OCSP) enables on-demand queries to the CA's server for the status of a specific certificate, providing responses such as "good," "revoked," or "unknown." These tools ensure that systems can detect and reject signatures from invalidated certificates, maintaining the overall security of the code signing ecosystem.
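Whether delivered as a CRL snapshot or an OCSP response, a revocation check ultimately reduces to a status lookup keyed on the certificate's serial number. A sketch using the serial from the sample certificate in this article (the revocation entry itself is invented for illustration):

```python
from datetime import datetime, timezone

# Hypothetical parsed CRL: revoked serial -> (revocation time, reason).
revoked = {
    0x594E2D885A2CB01A5ED64C7BDF35597D:
        (datetime(2021, 3, 1, tzinfo=timezone.utc), "keyCompromise"),
}

def certificate_status(serial: int) -> str:
    """OCSP-style answer derived from a CRL snapshot."""
    return "revoked" if serial in revoked else "good"

assert certificate_status(0x594E2D885A2CB01A5ED64C7BDF35597D) == "revoked"
assert certificate_status(0x1234) == "good"
```

A real OCSP responder can also answer "unknown" for serials it has never issued, and both CRLs and OCSP responses are themselves signed so that the status data cannot be forged in transit.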

Extended Validation Certificates

Extended Validation (EV) certificates for code signing represent a high-assurance standard established by the CA/B Forum, requiring certificate authorities to perform thorough identity vetting of the applicant organization. This vetting verifies legal existence by confirming registration with the relevant incorporating or registration agency in the subject's jurisdiction, physical existence through validation of a presence at a specified address, and operational existence to ensure active operations as of the issuance date. The process, which involves document review, database checks, and potential phone verification, typically spans several days to a week or more, depending on the applicant's responsiveness and the complexity of the organization.

In contrast to Organization Validated (OV) or Domain Validated (DV) certificates, which rely on less stringent checks such as basic domain control or organizational details, EV certificates mandate audited compliance with CA/B Forum guidelines, including ongoing audits of CA processes for reliability. This results in certificates featuring unique identifiers, such as the EV policy object identifier (OID) 2.23.140.1.1, enabling operating systems to recognize and afford elevated trust to EV-signed code. Key fields in these certificates include the subject organization name, a registration serial number for uniqueness, and additional attributes such as jurisdiction of incorporation and address components, all encoded to provide verifiable transparency without including domain names. EV-signed executables in Microsoft Windows environments display the verified organization name as the publisher in User Account Control (UAC) prompts, replacing generic "unknown publisher" warnings with identifiable details, while also receiving immediate positive reputation from SmartScreen to minimize or eliminate download and execution alerts. This visual and behavioral trust enhancement helps users confidently identify legitimate software publishers.

Adoption of EV code signing certificates is common in enterprise software development, where they are often required for distributing kernel-mode drivers or for compliance in regulated industries to demonstrate rigorous identity assurance. Certificate authorities such as Entrust provide these certificates, with annual pricing typically ranging from $300 to $500, reflecting the intensive validation and hardware requirements.

Time-Stamping Protocols

Time-stamping protocols in code signing attach a trusted timestamp to a digital signature, proving that the signature was created at a specific point in time and enabling verification even after the signing certificate expires. This is achieved through a Time-Stamping Authority (TSA), a trusted third party that generates time-stamp tokens using a reliable time source, as defined in the Internet Public Key Infrastructure Time-Stamp Protocol (TSP) outlined in RFC 3161. Per the CA/B Forum Baseline Requirements, effective April 15, 2025, TSA private keys for Root and Subordinate CA certificates (with validity over 72 months) must be protected in a hardware cryptographic module certified to FIPS 140-2 Level 3 or Common Criteria EAL 4+, maintained in a high-security zone. Examples of TSAs include free services such as FreeTSA.org, which provides RFC 3161-compliant timestamps without cost for basic use, and commercial providers such as Sectigo (formerly Comodo), which offers timestamping via http://timestamp.sectigo.com.

The process begins after the code is signed with a private key; the signer submits a hash of the signature (typically using SHA-256 in modern implementations) to the TSA via an HTTP or TCP request formatted according to RFC 3161. The TSA verifies the request, appends the current UTC time from a trusted source (such as NTP-synchronized clocks), signs the hash with its own certificate, and returns a TimeStampToken containing the timestamp information, including a serial number for uniqueness and the hashing algorithm used. This token is then embedded into the signature envelope, often as an unsigned attribute in CMS/PKCS #7 structures, ensuring the timestamp is cryptographically bound to the original signature.

These protocols provide several benefits for code signing security. By establishing the exact creation time of the signature, time-stamping prevents replay attacks, as verifiers can check that the timestamp aligns with the expected temporal context and detect any attempts to reuse outdated signatures. It also supports long-term validity, allowing signatures to be verified after certificate expiration as long as the timestamp falls within the certificate's validity period and the TSA's certificate remains trustworthy, which is crucial for long-term archival of software artifacts. Integration of time-stamping is seamless in common tools; for instance, Microsoft's SignTool.exe automates the process using the /tr option to specify a TSA URL, such as http://timestamp.sectigo.com, and supports SHA-256 hashing for requests without additional configuration for basic services. Many TSAs, including FreeTSA.org and Sectigo, default to SHA-256 for compatibility and security, offering no-cost options for non-commercial or low-volume use while ensuring compliance with RFC 3161 standards.
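The request/response bookkeeping of the RFC 3161 flow can be sketched as follows; the token is shown as a plain dictionary rather than the real signed, DER-encoded structure, and the signature bytes are placeholders:

```python
import hashlib
from datetime import datetime, timezone

def make_ts_request(signature: bytes) -> dict:
    """Client side: only the hash of the signature leaves the machine."""
    return {"hash_alg": "sha256",
            "message_imprint": hashlib.sha256(signature).hexdigest()}

def tsa_respond(request: dict, now: datetime, serial: int) -> dict:
    """TSA side: bind the imprint to the current UTC time.  (A real
    RFC 3161 token is DER-encoded and signed with the TSA's key.)"""
    return {"gen_time": now.isoformat(), "serial": serial, **request}

def token_matches(token: dict, signature: bytes) -> bool:
    """Verifier side: recompute the imprint from the signature."""
    return token["message_imprint"] == hashlib.sha256(signature).hexdigest()

sig = b"\x30\x82...signature bytes..."
token = tsa_respond(make_ts_request(sig),
                    datetime(2020, 1, 1, tzinfo=timezone.utc), serial=42)
assert token_matches(token, sig)
assert not token_matches(token, b"some other signature")
```

Because the TSA only ever sees a digest, the artifact and its signature need not be disclosed to the timestamping service.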

Alternatives to Certificate Authorities

Self-signed certificates represent a basic alternative to traditional certificate authorities (CAs) in code signing, where developers generate their own public-private key pair and certificate using tools such as OpenSSL or PowerShell's New-SelfSignedCertificate cmdlet. These certificates are suitable for internal tools, development, or testing environments, as they allow signing without external validation, but they inherently lack third-party trust since no CA vouches for the issuer's identity. Verification depends on manual distribution of the public key to recipients, who must explicitly trust it by importing it into their local certificate store, such as the Trusted People store on Windows.

Web of trust models, inspired by Pretty Good Privacy (PGP), provide a decentralized approach in which users mutually vouch for each other's public keys through signatures, forming chains of trust without a central authority. In open-source projects, this is implemented via tools such as GnuPG, with keys distributed through keyservers or repositories; for instance, the Linux kernel community uses PGP signatures on tags and tarballs, relying on the web of trust to verify maintainer identities after the 2011 kernel.org compromise. Trust levels are calculated based on signature paths from known trusted keys, enabling collaborative verification in ecosystems such as Linux distributions, where developers sign each other's keys to build collective assurance.

Decentralized options extend this further by leveraging distributed technologies for identity and verification, bypassing CA hierarchies altogether. Projects such as Sigstore enable keyless code signing through OpenID Connect (OIDC) providers for identity proof, issuing short-lived certificates via Fulcio and logging signatures in the tamper-evident Rekor transparency log for public auditability. Blockchain-based methods, such as anchoring code hashes or signatures to a public blockchain for timestamping, provide immutable proof of existence and integrity without centralized issuance, often combined with smart contracts for verification. Hardware security modules (HSMs) support these approaches by securely generating and storing keys in tamper-resistant hardware, facilitating self-signed or decentralized signing while ensuring private keys never leave the device.

These alternatives offer significant trade-offs compared to CA-based systems: they reduce costs and accelerate issuance by eliminating vetting processes, making them ideal for open-source or internal use, as seen in Git's support for GPG-signed commits, where developers verify authenticity via personal keyrings. However, they increase the risk of impersonation due to the absence of independent identity validation, requiring robust key management and user diligence to mitigate potential threats.

Challenges and Limitations

Common Security Problems

One major vulnerability in code signing arises from the theft or compromise of private keys associated with code signing certificates. When attackers gain access to these keys, they can sign malicious code as if it originated from a trusted entity, bypassing verification mechanisms and enabling widespread distribution of malware. A prominent example is the 2011 breach of DigiNotar, a Dutch certificate authority, where intruders compromised the private keys and issued over 500 fraudulent certificates, including code signing ones, affecting millions of users primarily through man-in-the-middle attacks on services like Gmail in Iran. This incident led to the revocation of DigiNotar's root certificates across major trust stores and the company's bankruptcy. Similarly, the 2020 SolarWinds supply chain attack involved Russian state-sponsored actors injecting malware into legitimate software updates, which were then signed using SolarWinds' legitimate code signing certificate after compromising the build process, compromising thousands of organizations including U.S. government agencies. In 2023, attackers stole encrypted code signing certificates from GitHub, including those for GitHub Desktop and Atom, potentially allowing malicious software to be signed as legitimate GitHub releases; GitHub revoked the certificates and advised users to update affected software. Algorithmic weaknesses in hashing functions used for code signing signatures further exacerbate risks. Deprecated algorithms like are susceptible to collision attacks, where attackers generate two different files with identical hashes, allowing substitution of malicious code without invalidating the signature. 
The 2017 SHAttered attack demonstrated the first practical collision for SHA-1, producing two distinct PDFs with the same SHA-1 hash, highlighting its vulnerability for digital signatures including code signing; despite transitions to stronger hashes like SHA-256, legacy SHA-1-signed code remains in use, delaying full mitigation. Timestamping failures can undermine the long-term validity of code signatures by failing to provide reliable proof of signing time relative to certificate expiration or revocation. Outages or connectivity issues with Time-Stamping Authorities (TSAs) prevent acquisition of valid timestamps during signing, rendering signatures time-bound to the certificate's validity period and potentially invalidating them prematurely. For instance, the 2019 expiration of Comodo's TSA certificate (timestamp.comodoca.com) caused widespread errors and outages in timestamped code validation across various environments. Additionally, use of untrusted or compromised TSAs allows attackers to forge timestamps; in one described scenario, an adversary intercepts timestamp requests and supplies a response from a non-trustworthy TSA, leading verifiers to accept invalid signatures. Other systemic issues include signature stripping in repackaged applications, where attackers decompile legitimate signed applications, remove the original signature, inject malicious payloads, and redistribute the altered binaries unsigned or re-signed to evade detection. Over-reliance on centralized trust stores amplifies risks from root CA compromises; the 2015 Symantec incidents involved multiple misissuances of rogue certificates, including an unauthorized certificate for google.com issued without proper validation, prompting employee terminations and widespread distrust of Symantec roots by browsers like Chrome. These events exposed how flaws in CA operations can propagate untrusted certificates into trust stores, enabling fake code signing.
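Collisions are fatal to code signing because a signature covers the file's hash, not the file itself, so any second file with the same hash verifies under the original signature. A simplified Python sketch of this verification logic, using SHA-256 as a stand-in for a full signature scheme:

```python
import hashlib

def signed_digest(data: bytes) -> bytes:
    # Stand-in for the digest a publisher signs; real code signing signs
    # this value with the publisher's private key.
    return hashlib.sha256(data).digest()

def verify(data: bytes, digest_from_signature: bytes) -> bool:
    # A verifier recomputes the hash and compares it with the digest
    # recovered from the signature. If an attacker can produce a second
    # file with the SAME digest (a collision, as SHAttered did for SHA-1),
    # the malicious file passes this check without touching the signature.
    return hashlib.sha256(data).digest() == digest_from_signature

original = b"benign installer"
sig_digest = signed_digest(original)
```

With a collision-resistant hash, any modification changes the digest and verification fails; a broken hash removes that guarantee.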

Mitigation Strategies

Mitigation strategies for code signing vulnerabilities focus on proactive measures to protect private keys, ensure cryptographic robustness, integrate verification into development workflows, and enable rapid detection and response to compromises. These practices help developers and organizations minimize risks such as unauthorized code distribution and supply chain attacks by emphasizing secure handling, standards compliance, and ongoing monitoring. Key management is a cornerstone of code signing security, beginning with the use of Hardware Security Modules (HSMs) for private key storage to prevent unauthorized access and extraction. HSMs provide tamper-resistant environments that isolate keys from software-based threats, ensuring that signing operations occur within protected hardware. Regular key rotation, typically every one to two years or after potential exposure, limits the impact of a compromised key by reducing its lifespan and validity period. Additionally, enabling multi-factor authentication (MFA) for certificate authority (CA) accounts and key access controls adds layers of identity verification, thwarting credential-based attacks. Updating cryptographic algorithms addresses evolving threats to hashing integrity, with a mandate to transition to SHA-256 or stronger variants following the 2017 SHAttered attack on SHA-1, which demonstrated practical forgery risks for code signing. The National Institute of Standards and Technology (NIST) deprecated SHA-1 for digital signatures in 2013 and plans to retire it fully by December 31, 2030, urging immediate adoption of the SHA-2 and SHA-3 families to maintain signature integrity. Microsoft accelerated this transition by deprecating SHA-1 code signing support in 2017, requiring SHA-256 for new certificates to align with browser and OS enforcement. Organizations should monitor NIST guidelines and conduct periodic audits to ensure compliance with these post-2017 standards. Enhancing verification involves embedding code signing policies directly into continuous integration/continuous delivery (CI/CD) pipelines to automate integrity checks during builds and deployments.
Tools like Cosign, developed by the Sigstore project, facilitate container image signing and verification without long-term key management, using short-lived keys and transparency logs for reproducible attestations. This integration ensures that only signed artifacts proceed to production, reducing the window for tampering in automated workflows. For incident response, continuous monitoring of Online Certificate Status Protocol (OCSP) responders and Certificate Revocation Lists (CRLs) is essential to detect and enforce revocations promptly, as OCSP provides real-time status queries while CRLs offer batch updates for offline validation. Supply chain risk audits, guided by the Supply-chain Levels for Software Artifacts (SLSA) framework introduced in 2021, evaluate build provenance and integrity controls to identify weaknesses before deployment. SLSA's tiered levels promote verifiable builds and signed artifacts, enabling organizations to respond to breaches by revoking affected certificates and tracing impacted distributions.
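Several of these mitigations can be expressed as a simple policy gate in a CI/CD pipeline. The sketch below is hypothetical: the revoked-serial set stands in for a real CRL or OCSP lookup, but it illustrates rejecting deprecated digests, revoked certificates, and signatures made after the certificate's validity window (the condition a trusted timestamp lets verifiers check):

```python
from datetime import datetime, timezone

DEPRECATED_DIGESTS = {"md5", "sha1"}   # per NIST deprecation guidance
REVOKED_SERIALS = {"01ab23cd"}         # hypothetical CRL/OCSP snapshot

def release_gate(digest_alg: str, cert_serial: str,
                 signed_at: datetime, cert_not_after: datetime) -> bool:
    # Hypothetical CI/CD policy gate: block the artifact if any check fails.
    if digest_alg.lower() in DEPRECATED_DIGESTS:
        return False                   # deprecated hash algorithm
    if cert_serial in REVOKED_SERIALS:
        return False                   # certificate has been revoked
    return signed_at <= cert_not_after # signed within the validity window

ok = release_gate("sha256", "99ff00aa",
                  datetime(2024, 6, 1, tzinfo=timezone.utc),
                  datetime(2025, 6, 1, tzinfo=timezone.utc))
```

A real pipeline would pull these inputs from the signature itself, the certificate chain, and a live revocation source rather than hard-coded sets.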

Implementations

Apple Ecosystems

In Apple's ecosystems, code signing is a mandatory mechanism for distributing and executing software on the macOS and iOS platforms, ensuring that applications originate from verified developers and remain untampered. It integrates deeply with the App Store distribution model, where all submitted apps must be signed using Apple-issued certificates to pass review and installation checks. On macOS, Gatekeeper enforces signing by verifying Developer ID certificates on downloaded apps, preventing execution of unsigned or tampered code obtained outside the App Store. Similarly, iOS requires signed apps bundled with provisioning profiles to install on devices, tying code to specific developer identities and device capabilities. Certificates for code signing are issued by the Apple Worldwide Developer Relations (WWDR) Certification Authority, an intermediate authority under Apple's root certificate authority that validates developer identities through the Apple Developer Program. Developers generate certificate signing requests via Keychain Access or Xcode, then obtain identities such as development certificates for testing, distribution certificates for releases, or ad-hoc certificates for limited device installations without App Store involvement. These certificates embed the developer's Team ID in the subject organizational unit field, enabling the system to enforce trust chains during validation. For macOS distribution outside the App Store, Developer ID Application or Installer certificates allow direct downloads while complying with Gatekeeper, requiring membership in the Apple Developer Program. Xcode provides built-in code signing during the build process, automatically embedding signatures, while the codesign command-line tool supports manual operations, applying cryptographic hashes and certificates to binaries, bundles, and frameworks. Entitlements, defined in a .entitlements file, grant apps specific permissions such as camera access or sandboxing, and Xcode merges these during signing to match provisioning profiles.
For debugging, Xcode generates .dSYM files containing symbol information tied to the signed build, enabling symbolication of crash reports without exposing source code. Provisioning profiles, which are signed property lists combining certificates, app IDs, and device UDIDs, are essential for iOS and extend to macOS for capabilities like push notifications; they support development (for registered devices), ad-hoc (for limited distribution), and distribution types. Enforcement occurs at multiple levels: on macOS, System Integrity Protection (SIP) restricts modifications to system files and blocks loading of unsigned kernel extensions (kexts), requiring them to be signed with a Developer ID certificate authorized for kext signing and approved by users via System Preferences. Gatekeeper scans downloads for valid signatures and, since macOS Catalina (10.15) in 2019, mandates notarization, a cloud-based Apple process that staples a ticket to signed apps, confirming the absence of malware before Gatekeeper allows execution. On iOS, the system rejects unsigned apps or apps with mismatched provisioning profiles at installation, ensuring only authorized code runs on devices. This mandatory code signing requirement prevents persistent malware by ensuring all code is cryptographically signed with Apple-issued certificates; unsigned or malicious code cannot run persistently outside the app sandbox, and there is no easy path to kernel-level or boot-level persistence without a jailbreak or a zero-day exploit chain.
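In practice, the codesign operations described above are command-line invocations. A hedged sketch, with a hypothetical Developer ID identity and bundle name, of the argument lists a build script might hand to subprocess on a Mac:

```python
# Hypothetical identity and bundle names; on a real machine these argument
# lists would be passed to subprocess.run once the signing identity exists
# in the login keychain.
identity = "Developer ID Application: Example Corp (TEAM123456)"
bundle = "Example.app"

sign_cmd = [
    "codesign", "--sign", identity,
    "--timestamp",            # request a trusted timestamp (needed for notarization)
    "--options", "runtime",   # enable the hardened runtime, also required by notarization
    bundle,
]
verify_cmd = ["codesign", "--verify", "--deep", "--strict", bundle]
```

The verify invocation checks the signature of the bundle and its nested code; a non-zero exit status indicates tampering or a broken chain.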

Microsoft Windows

Code signing in Microsoft Windows primarily relies on the Authenticode framework, introduced in 1996 to enable publishers to digitally sign software components, verifying their origin and integrity. Authenticode supports signing of Portable Executable/Common Object File Format (PE/COFF) files such as executables (.exe) and dynamic-link libraries (.dll), cabinet archives (.cab), and scripts through Subject Interface Packages (SIPs). The framework uses public-key cryptography: a publisher's private key signs a hash of the file, and the corresponding certificate chains to a trusted root authority. Signing is typically performed using the SignTool.exe command-line utility, which includes the /t option to apply time-stamping from a trusted authority, ensuring the signature remains valid even after certificate expiration. Certificates for Authenticode signing are issued by trusted certificate authorities (CAs), primarily in Organization Validation (OV) types following the 2024 deprecation of Extended Validation (EV) code signing certificates. The traditional Software Publisher Certificate (SPC) format encapsulates the public key and is paired with a private key file (often in .pvk format) for signing operations. For cross-operating-system compatibility, such as supporting both Windows and macOS distributions of the same software, dual-signing applies multiple signatures, using distinct certificates tailored to each platform's requirements, allowing verification across environments without altering the binary. While Authenticode signing is optional for most user-mode applications, it is strongly recommended to avoid security warnings; the Windows SmartScreen filter blocks or warns about unsigned downloads from untrusted sources to protect against malware. In contrast, kernel-mode driver signing has been mandatory on 64-bit editions since Windows Vista in 2007, requiring digital signatures for drivers to load and often involving Windows Hardware Quality Labs (WHQL) testing for certification.
SignTool supports batch signing of multiple files via wildcards or lists, streamlining the process for large projects. For scenarios where embedding signatures would modify binaries, catalog signing uses a separate .cat file to hash and sign an entire collection of unmodified files, preserving their original state while enabling verification. Code signing certificates allow the verified publisher name to be displayed in the User Account Control (UAC) prompt since Windows 8 in 2012, helping users distinguish legitimate software from potentially malicious executables requiring administrative privileges.
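A typical SignTool invocation combining signing, the /t time-stamping option, and verification might look as follows; the certificate file, password, and TSA URL are placeholders, and the commands would run on Windows with the Windows SDK installed:

```python
# Placeholder certificate file, password, TSA URL, and target binary;
# on Windows these argument lists would be passed to subprocess.run.
sign_cmd = [
    "signtool", "sign",
    "/fd", "SHA256",                        # file digest algorithm
    "/f", "publisher.pfx", "/p", "secret",  # certificate file and password (demo only)
    "/t", "http://timestamp.example.test",  # hypothetical time-stamping authority
    "installer.exe",
]
verify_cmd = ["signtool", "verify", "/pa", "installer.exe"]  # /pa: default Authenticode policy
```

Because the timestamp is embedded in the signature, verification continues to succeed after the signing certificate itself expires.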

Other Platforms and Tools

In Android, applications are distributed as APK files that must be digitally signed to ensure integrity and authenticity. The platform supports multiple signing schemes, including the v1 scheme based on JAR signing, which verifies individual file signatures, and the v2 scheme introduced in Android 7.0 (API level 24) in 2016, which computes a hash over the full APK for more efficient and secure verification. Since Android 8.0 (API level 26) in 2017, the v2 scheme has been recommended, with the subsequent v3 and v4 schemes adding support for key rotation and incremental verification. The apksigner tool, part of the Android SDK Build Tools since revision 24.0.3, is used to sign and verify APKs, supporting algorithms like RSA (1024 to 16384 bits) and EC (NIST P-256, P-384, P-521). For distribution on the Google Play Store, apps require signing, often managed through Play App Signing, where developers sign uploads with an upload key and Google holds the app signing key used for production releases, ensuring attestation of the app's origin and integrity. On Linux systems, code signing varies by distribution and package format but commonly relies on GPG for package integrity. RPM-based distributions like Fedora and Red Hat Enterprise Linux use GPG keys to sign packages, with tools like rpm-sign enabling developers to generate and apply signatures during builds, verifiable via rpm -K. DEB-based systems such as Debian and Ubuntu employ GPG for repository signatures and package verification, often through tools like dpkg-sig or debsign, ensuring APT checks repository keys before installation. Application bundling formats like Flatpak use GPG signatures on repository metadata to verify remote repositories and app authenticity during installation and updates. Similarly, Snap packages from Canonical incorporate cryptographic signatures in their metadata, leveraging the snapd daemon to validate snaps against the Snap Store's assertions.
For kernel modules, Secure Boot enforcement since Linux kernel 3.7 in 2012 uses the Machine Owner Key (MOK) mechanism via the shim bootloader, allowing users to enroll custom keys and sign modules with tools like sign-file so that third-party drivers can load securely. Other tools extend code signing to specific environments. Java applications packaged as JAR files are signed using the jarsigner utility, which applies signatures to the archive's manifest and contents, requiring a private key from a keystore generated by keytool; verification ensures no tampering since signing. Electron applications, built for cross-platform desktop use, implement custom signing workflows integrated into packaging tools like electron-builder or electron-forge, applying platform-specific certificates (e.g., for macOS or Windows) to executables and installers during the build process. In container ecosystems, Docker's original image signing via Notary (based on The Update Framework) was deprecated in 2020 due to maintenance challenges, with Sigstore's Cosign now recommended for signing OCI-compliant images using short-lived keys and transparency logs, verifiable with cosign verify commands. Cross-platform build systems incorporate automated signing via plugins. Gradle's Signing Plugin digitally signs artifacts like JARs or publications using GPG or PGP keys, integrating with maven-publish or ivy-publish workflows to sign artifacts for repository uploads. Maven similarly uses the maven-gpg-plugin to apply GPG signatures to artifacts, a requirement for publishing to Maven Central so that consumers can verify authenticity against the project's keys.
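The JAR-style (v1) scheme mentioned above signs a manifest of per-entry digests rather than the archive as a whole. A simplified, self-contained sketch of that idea (not the actual JAR manifest format, which also records algorithm names, sections, and a signature over the manifest itself):

```python
import base64
import hashlib

def build_manifest(entries: dict) -> dict:
    # One base64-encoded SHA-256 digest per archive entry, loosely modeled
    # on JAR/APK-v1 manifests. In the real scheme the manifest itself is
    # then signed, so tampering with any entry or the manifest is detected.
    return {name: base64.b64encode(hashlib.sha256(data).digest()).decode()
            for name, data in entries.items()}

def entry_unmodified(manifest: dict, name: str, data: bytes) -> bool:
    # Verify a single entry against its recorded digest.
    digest = base64.b64encode(hashlib.sha256(data).digest()).decode()
    return manifest.get(name) == digest

files = {"classes.dex": b"\x01\x02", "res/layout.xml": b"<layout/>"}
manifest = build_manifest(files)
```

This per-entry structure is why v1 verification must hash every file individually, and why the v2 scheme's single hash over the whole APK is faster to verify.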

Exceptions and Unsigned Code

Use in Gaming and Consumer Devices

In gaming consoles, code signing serves as a critical measure to prevent unauthorized software execution, though practices vary by platform to accommodate development and homebrew needs. Official PlayStation development using devkits requires signing with provided certificates to ensure compatibility and integrity before deployment on the console, while homebrew typically bypasses these requirements through exploits. Similarly, Xbox consoles in developer mode support the installation of homebrew via Universal Windows Platform (UWP) apps, which require digital signing, often using self-signed certificates for non-commercial projects, to activate and run custom content. The Nintendo Switch strictly enforces code signing for its system software and official applications, verifying signatures during boot to block tampering; however, homebrew communities use signature patches (sigpatches) to bypass these checks, enabling custom firmware like Atmosphere while maintaining partial enforcement on core system components. As of November 2025, sigpatches continue to be updated for recent firmware versions such as 20.0 and later. In PC gaming, unsigned modifications remain prevalent due to the open nature of the ecosystem, allowing players to alter game files without formal signing. Tools such as Cheat Engine facilitate this by enabling memory scanning and code injection for cheats or mods, operating without mandatory signatures, as Windows primarily flags but does not block such user-initiated changes in non-store contexts. Consumer devices often relax code signing for performance and legacy compatibility. Smart TV platforms require all applications, including OEM preinstalled ones, to be signed for distribution, but manufacturers streamline the process with internal keys to avoid overhead, permitting faster updates for proprietary apps.
In IoT devices like routers, code signing is frequently omitted to prioritize boot speed and resource efficiency on constrained hardware; custom firmware such as OpenWrt can be flashed without signing verification, supporting open-source modifications across various router models. Examples illustrate these flexible applications. Platforms such as itch.io permit the upload and distribution of unsigned indie games, though developers are encouraged to sign executables to mitigate Windows SmartScreen warnings and enhance trustworthiness for end users. For mobile gaming, apps submitted to stores like Google Play must undergo code signing to verify origin and prevent tampering, but emulators allow developers to test unsigned or debug-signed APKs in controlled environments, bypassing store-level requirements during iteration. As of October 2025, Google requires developer registration for non-ADB sideloading of APKs, further limiting unsigned app installation outside stores. Legacy hardware, such as older PlayStation models, continues to tolerate unsigned code through exploits or dev modes, balancing security with backward compatibility.

Reasons for Bypassing Signing Requirements

Code signing requirements are sometimes bypassed in scenarios where the associated overheads outweigh the perceived benefits, particularly in resource-constrained or controlled settings. Developers may opt to omit signing during early stages of software creation to streamline workflows, as the process involves generating cryptographic hashes and embedding signatures, which can extend build durations significantly. For instance, in some continuous integration environments, code signing has been observed to increase total build times by up to 350% compared to unsigned builds, prompting teams to disable it for iterative development cycles. Similarly, the addition of digital signatures typically enlarges file sizes due to appended metadata and certificates, with increases noted in firmware updates for resource-limited devices, where even modest expansions can strain storage or transmission bandwidth. In real-time systems such as embedded devices, the performance implications of code signing are especially pronounced, as verification processes introduce latency during execution or boot sequences. Cryptographic operations required for signing and validation can impose measurable delays, potentially affecting load times by several percentage points in low-power environments where computational resources are tightly optimized. To mitigate this, developers of such systems may forgo signing altogether, prioritizing minimal overhead over formal attestation, particularly when the code operates in isolated or non-networked contexts. This approach is common in prototypes or custom hardware where rapid iteration is essential and the risk of tampering is low due to physical access controls. Legacy compatibility further incentivizes bypassing signing mandates, as older software binaries and hardware platforms predating widespread adoption of digital signatures, such as those from the pre-2000s era, often lack the necessary infrastructure for enforcement.
For example, systems predating SHA-256 support may reject modern signed code or require deprecated algorithms like SHA-1, leading developers to distribute unsigned versions to ensure seamless operation on outdated infrastructure. Additionally, in regions with restrictive app distribution policies, sideloading unsigned applications circumvents store-based signing requirements, allowing direct installation without compatibility hurdles. The financial and administrative burdens of obtaining valid certificates also drive decisions to skip signing, especially for independent or small-scale developers. Publicly trusted code signing certificates from certificate authorities typically cost between $129 and $864 annually, depending on the provider and validation level, creating a barrier for hobbyists or startups with limited budgets. Open-source projects frequently encounter these challenges, opting for self-signed certificates or entirely unsigned distributions to facilitate easy redistribution and collaboration without incurring fees or managing key lifecycles. This simplifies versioning and community contributions, though it relies on alternative trust mechanisms like source code audits. Finally, in low-threat environments, a deliberate risk assessment may conclude that code signing provides negligible value, justifying its omission to reduce complexity. Internal tools deployed within enterprise networks or air-gapped systems, for instance, face minimal external tampering risks, allowing developers to prioritize functionality over attestation during prototyping or testing phases. Research prototypes similarly benefit from unsigned builds, as the focus remains on experimental validation rather than production-grade assurance, with signing deferred until deployment if needed at all. Apple's guidelines, for example, recommend self-signed identities for development to avoid premature use of production certificates in such controlled settings.
