Capability Maturity Model Integration
Capability Maturity Model Integration (CMMI) is a process-level improvement training and appraisal program. Administered by the CMMI Institute, a subsidiary of ISACA, it was developed at Carnegie Mellon University (CMU). It is required by many U.S. Government contracts, especially in software development. CMU claims CMMI can be used to guide process improvement across a project, division, or an entire organization.
CMMI defines the following five maturity levels (1 to 5) for processes: Initial, Managed, Defined, Quantitatively Managed, and Optimizing. CMMI Version 3.0 was published in 2023;[1] Version 2.0 was published in 2018; Version 1.3 was published in 2010, and is the reference model for the rest of the information in this article. CMMI is registered in the U.S. Patent and Trademark Office by CMU.[2]
Overview
Originally, CMMI addressed three areas of interest:
- Product and service development – CMMI for Development (CMMI-DEV),
- Service establishment and management – CMMI for Services (CMMI-SVC), and
- Product and service acquisition – CMMI for Acquisition (CMMI-ACQ).
In version 2.0 these three areas (that previously had a separate model each) were merged into a single model.
CMMI was developed by a group from industry, government, and the Software Engineering Institute (SEI) at CMU. CMMI models provide guidance for developing or improving processes that meet the business goals of an organization. A CMMI model may also be used as a framework for appraising the process maturity of the organization.[3] By January 2013, the entire CMMI product suite was transferred from the SEI to the CMMI Institute, a newly created organization at Carnegie Mellon.[4]
History
CMMI was developed by the CMMI project, which aimed to improve the usability of maturity models by integrating many different models into one framework. The project consisted of members of industry, government and the Carnegie Mellon Software Engineering Institute (SEI). The main sponsors included the Office of the Secretary of Defense (OSD) and the National Defense Industrial Association.
CMMI is the successor of the capability maturity model (CMM) or Software CMM. The CMM was developed from 1987 until 1997. In 2002, CMMI version 1.1 was released; version 1.2 followed in August 2006, and version 1.3 in November 2010. Some major changes in CMMI V1.3[5] are the support of agile software development,[6] improvements to high maturity practices[7] and alignment of the representation (staged and continuous).[8]
According to the Software Engineering Institute (SEI, 2008), CMMI helps "integrate traditionally separate organizational functions, set process improvement goals and priorities, provide guidance for quality processes, and provide a point of reference for appraising current processes."[9]
Mary Beth Chrissis, Mike Konrad, and Sandy Shrum were the authorship team for the hard copy publication of CMMI for Development Versions 1.2 and 1.3. The Addison-Wesley publication of Version 1.3 was dedicated to the memory of Watts Humphrey. Eileen C. Forrester, Brandon L. Buteau, and Sandy Shrum were the authorship team for the hard copy publication of CMMI for Services Version 1.3. Rawdon "Rusty" Young was the chief architect for the development of CMMI version 2.0. He was previously the CMMI Product Owner and the SCAMPI Quality Lead for the Software Engineering Institute.
In March 2016, the CMMI Institute was acquired by ISACA.
In April 2023, the CMMI V3.0 was released.
Topics
Representation
In version 1.3 CMMI existed in two representations: continuous and staged.[3] The continuous representation is designed to allow the user to focus on the specific processes that are considered important for the organization's immediate business objectives, or those to which the organization assigns a high degree of risks. The staged representation is designed to provide a standard sequence of improvements, and can serve as a basis for comparing the maturity of different projects and organizations. The staged representation also provides for an easy migration from the SW-CMM to CMMI.[3]
In version 2.0 the above representation separation was cancelled and there is now only one cohesive model.[10]
Model framework (v1.3)
Depending on the areas of interest (acquisition, services, development) used, the process areas it contains will vary.[11] Process areas are the areas that will be covered by the organization's processes. The table below lists the seventeen CMMI core process areas that are present for all CMMI areas of interest in version 1.3.
| Abbreviation | Process Area | Category | Maturity level |
|---|---|---|---|
| CAR | Causal Analysis and Resolution | Support | 5 |
| CM | Configuration Management | Support | 2 |
| DAR | Decision Analysis and Resolution | Support | 3 |
| IPM | Integrated Project Management | Project Management | 3 |
| MA | Measurement and Analysis | Support | 2 |
| OPD | Organizational Process Definition | Process Management | 3 |
| OPF | Organizational Process Focus | Process Management | 3 |
| OPM | Organizational Performance Management | Process Management | 5 |
| OPP | Organizational Process Performance | Process Management | 4 |
| OT | Organizational Training | Process Management | 3 |
| PMC | Project Monitoring and Control | Project Management | 2 |
| PP | Project Planning | Project Management | 2 |
| PPQA | Process and Product Quality Assurance | Support | 2 |
| QPM | Quantitative Project Management | Project Management | 4 |
| REQM | Requirements Management | Project Management | 2 |
| RSKM | Risk Management | Project Management | 3 |
| SAM | Supplier Agreement Management | Support | 2 |
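The table above can be captured as a small lookup structure, which makes it easy to ask, for example, which core areas a given maturity level introduces. This is an illustrative sketch only — the `CORE_PROCESS_AREAS` name and the helper function are ours, not part of the CMMI specification; the data simply mirrors the v1.3 table:

```python
# Core CMMI v1.3 process areas keyed by abbreviation,
# mirroring the table above: (name, category, maturity level).
CORE_PROCESS_AREAS = {
    "CAR":  ("Causal Analysis and Resolution", "Support", 5),
    "CM":   ("Configuration Management", "Support", 2),
    "DAR":  ("Decision Analysis and Resolution", "Support", 3),
    "IPM":  ("Integrated Project Management", "Project Management", 3),
    "MA":   ("Measurement and Analysis", "Support", 2),
    "OPD":  ("Organizational Process Definition", "Process Management", 3),
    "OPF":  ("Organizational Process Focus", "Process Management", 3),
    "OPM":  ("Organizational Performance Management", "Process Management", 5),
    "OPP":  ("Organizational Process Performance", "Process Management", 4),
    "OT":   ("Organizational Training", "Process Management", 3),
    "PMC":  ("Project Monitoring and Control", "Project Management", 2),
    "PP":   ("Project Planning", "Project Management", 2),
    "PPQA": ("Process and Product Quality Assurance", "Support", 2),
    "QPM":  ("Quantitative Project Management", "Project Management", 4),
    "REQM": ("Requirements Management", "Project Management", 2),
    "RSKM": ("Risk Management", "Project Management", 3),
    "SAM":  ("Supplier Agreement Management", "Support", 2),
}

def areas_at_level(level):
    """Return the abbreviations of the core process areas at a given maturity level."""
    return sorted(abbr for abbr, (_name, _cat, lvl) in CORE_PROCESS_AREAS.items()
                  if lvl == level)
```

For instance, `areas_at_level(2)` yields the seven level-2 areas (CM, MA, PMC, PP, PPQA, REQM, SAM).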
Maturity levels for services
The process areas below and their maturity levels are listed for the CMMI for services model:
Maturity Level 2 – Managed
- CM – Configuration Management
- MA – Measurement and Analysis
- PPQA – Process and Product Quality Assurance
- REQM – Requirements Management
- SAM – Supplier Agreement Management
- SD – Service Delivery
- WMC – Work Monitoring and Control
- WP – Work Planning
Maturity Level 3 – Defined
- CAM – Capacity and Availability Management
- DAR – Decision Analysis and Resolution
- IRP – Incident Resolution and Prevention
- IWM – Integrated Work Management
- OPD – Organizational Process Definition
- OPF – Organizational Process Focus
- OT – Organizational Training
- RSKM – Risk Management
- SCON – Service Continuity
- SSD – Service System Development
- SST – Service System Transition
- STSM – Strategic Service Management
Maturity Level 4 – Quantitatively Managed
- OPP – Organizational Process Performance
- QWM – Quantitative Work Management
Maturity Level 5 – Optimizing
- CAR – Causal Analysis and Resolution
- OPM – Organizational Performance Management
Models (v1.3)
CMMI best practices are published in documents called models, each of which addresses a different area of interest. Version 1.3 provides models for three areas of interest: development, acquisition, and services.
- CMMI for Development (CMMI-DEV), v1.3 was released in November 2010. It addresses product and service development processes.
- CMMI for Acquisition (CMMI-ACQ), v1.3 was released in November 2010. It addresses supply chain management, acquisition, and outsourcing processes in government and industry.
- CMMI for Services (CMMI-SVC), v1.3 was released in November 2010. It addresses guidance for delivering services within an organization and to external customers.
Model (v2.0)
In version 2.0, DEV, ACQ and SVC were merged into a single model in which each process area potentially has a specific reference to one or more of these three aspects. To keep pace with industry practice, the model also makes explicit reference to agile aspects in some process areas.
Some key differences between v1.3 and v2.0 models are given below:
- "Process Areas" have been replaced with "Practice Areas" (PAs), which are arranged by levels rather than by "Specific Goals".
- Each PA is composed of a "core" section (a generic, terminology-free description) and a "context-specific" section (a description from the perspective of Agile/Scrum, development, services, etc.).
- Since compliance with all practices is now mandatory, the "Expected" section has been removed.
- "Generic Practices" have been placed under a new area called "Governance and Implementation Infrastructure", while "Specific Practices" have been omitted.
- Emphasis is placed on ensuring that PAs are implemented and practised continuously until they become a "habit".
- All maturity levels focus on the keyword "performance".
- Optional PAs from the "Safety" and "Security" purviews (two and five, respectively) have been included.
- People CMM (PCMM) process areas have been merged in.
Appraisal
An organization cannot be certified in CMMI; instead, an organization is appraised. Depending on the type of appraisal, the organization can be awarded a maturity level rating (1–5) or a capability level achievement profile.
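The distinction between the two outcomes can be illustrated with a toy data structure: a maturity level is a single number for the whole organization, while a capability level achievement profile assigns a level to each process area individually. A minimal sketch, assuming hypothetical levels for a handful of areas (v1.3 continuous capability levels run 0–3):

```python
# Hypothetical capability level achievement profile (continuous representation).
# v1.3 capability levels: 0 = Incomplete, 1 = Performed, 2 = Managed, 3 = Defined.
profile = {
    "Project Planning": 3,
    "Requirements Management": 2,
    "Supplier Agreement Management": 1,
    "Configuration Management": 2,
}

def below_target(profile, target):
    """List the process areas whose capability level is still below a target level."""
    return [pa for pa, level in profile.items() if level < target]
```

A process group could use `below_target(profile, 3)` to shortlist areas for the next improvement cycle.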
Many organizations find value in measuring their progress by conducting an appraisal. Appraisals are typically conducted for one or more of the following reasons:
- To determine how well the organization's processes compare to CMMI best practices, and to identify areas where improvement can be made
- To inform external customers and suppliers of how well the organization's processes compare to CMMI best practices
- To meet the contractual requirements of one or more customers
Appraisals of organizations using a CMMI model[12] must conform to the requirements defined in the Appraisal Requirements for CMMI (ARC) document. There are three classes of appraisals, A, B and C, which focus on identifying improvement opportunities and comparing the organization's processes to CMMI best practices. Of these, class A appraisal is the most formal and is the only one that can result in a level rating. Appraisal teams use a CMMI model and ARC-conformant appraisal method to guide their evaluation of the organization and their reporting of conclusions. The appraisal results can then be used (e.g., by a process group) to plan improvements for the organization.
The Standard CMMI Appraisal Method for Process Improvement (SCAMPI) is an appraisal method that meets all of the ARC requirements.[13] Results of a SCAMPI appraisal may be published (if the appraised organization approves) on the CMMI Web site of the SEI: Published SCAMPI Appraisal Results. SCAMPI also supports the conduct of ISO/IEC 15504 (SPICE: Software Process Improvement and Capability Determination) assessments.
This approach promotes that members of the engineering process group (EPG) and process action teams (PATs) be trained in the CMMI, that an informal (SCAMPI C) appraisal be performed, and that process areas be prioritized for improvement. More modern approaches, which involve the deployment of commercially available, CMMI-compliant processes, can significantly reduce the time to achieve compliance. SEI has maintained statistics on the "time to move up" for organizations adopting the earlier Software CMM as well as CMMI.[14] These statistics indicate that, since 1987, the median time to move from Level 1 to Level 2 has been 23 months, and from Level 2 to Level 3 an additional 20 months. Since the release of the CMMI, the median time to move from Level 1 to Level 2 has been 5 months, with median movement to Level 3 taking another 21 months. These statistics are updated and published every six months in a maturity profile.[citation needed]
The Software Engineering Institute's (SEI) Team Software Process (TSP) methodology and the use of CMMI models can be used to raise the maturity level. A new product called the Accelerated Improvement Method (AIM)[15] combines the use of CMMI and the TSP.[16]
Security
To address user security concerns, two unofficial security guides are available. Considering the Case for Security Content in CMMI for Services has one process area, Security Management.[17] Security by Design with CMMI for Development, Version 1.3 has the following process areas:
- OPSD – Organizational Preparedness for Secure Development
- SMP – Secure Management in Projects
- SRTS – Security Requirements and Technical Solution
- SVV – Security Verification and Validation
While they do not affect maturity or capability levels, these process areas can be reported in appraisal results.[18]
Applications
The SEI published a study saying 60 organizations measured increases of performance in the categories of cost, schedule, productivity, quality and customer satisfaction.[19] The median increase in performance varied between 14% (customer satisfaction) and 62% (productivity). However, the CMMI model mostly deals with what processes should be implemented, and not so much with how they can be implemented. These results do not guarantee that applying CMMI will increase performance in every organization. A small company with few resources may be less likely to benefit from CMMI; this view is supported by the process maturity profile (page 10). Of the small organizations (<25 employees), 70.5% are assessed at level 2: Managed, while 52.8% of the organizations with 1,001–2,000 employees are rated at the highest level (5: Optimizing).
Turner & Jain (2002) argue that although it is obvious there are large differences between CMMI and agile software development, both approaches have much in common. They believe neither way is the 'right' way to develop software, but that there are phases in a project where one of the two is better suited. They suggest one should combine the different fragments of the methods into a new hybrid method. Sutherland et al. (2007) assert that a combination of Scrum and CMMI brings more adaptability and predictability than either one alone.[20] David J. Anderson (2005) gives hints on how to interpret CMMI in an agile manner.[21]
CMMI Roadmaps,[22] which are a goal-driven approach to selecting and deploying relevant process areas from the CMMI-DEV model, can provide guidance and focus for effective CMMI adoption. There are several CMMI roadmaps for the continuous representation, each with a specific set of improvement goals. Examples are the CMMI Project Roadmap,[23] CMMI Product and Product Integration Roadmaps[24] and the CMMI Process and Measurements Roadmaps.[25] These roadmaps combine the strengths of both the staged and the continuous representations.
The combination of the project management technique earned value management (EVM) with CMMI has been described.[26] In a similar vein, Extreme Programming (XP), a software engineering method, has been evaluated against CMM/CMMI (Nawrocki et al., 2002). For example, the XP requirements management approach, which relies on oral communication, was evaluated as not compliant with CMMI.
CMMI can be appraised using two different approaches: staged and continuous. The staged approach yields appraisal results as one of five maturity levels. The continuous approach yields one of four capability levels. The differences in these approaches are felt only in the appraisal; the best practices are equivalent, resulting in equivalent process improvement results.
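The staged rule can be sketched as a small function: an organization rates at the highest maturity level for which every process area at that level and all levels below is satisfied. This is a deliberate simplification for illustration, using only the v1.3 core process areas; it is not the SCAMPI appraisal method, and the function name is ours:

```python
# Maturity level associated with each core process area (CMMI v1.3, core areas only).
PA_LEVELS = {
    "CM": 2, "MA": 2, "PMC": 2, "PP": 2, "PPQA": 2, "REQM": 2, "SAM": 2,
    "DAR": 3, "IPM": 3, "OPD": 3, "OPF": 3, "OT": 3, "RSKM": 3,
    "OPP": 4, "QPM": 4,
    "CAR": 5, "OPM": 5,
}

def staged_maturity_level(satisfied):
    """Return the highest maturity level for which all process areas at that
    level and every lower level are satisfied (level 1 requires none)."""
    level = 1
    for target in (2, 3, 4, 5):
        required = {pa for pa, lvl in PA_LEVELS.items() if lvl <= target}
        if required <= set(satisfied):
            level = target
        else:
            break  # a gap at this level blocks all higher levels
    return level
```

For example, an organization satisfying all level-2 and level-3 areas but not OPP rates at maturity level 3.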
References
[edit]- ^ "CMMI Content Changes. Release: V3.0, 6 April 2023". CMMI Institute.
- ^ "Trademark Electronic Search System (TESS)". tmsearch.uspto.gov. Archived from the original on 21 December 2016. Retrieved 21 December 2016.
- ^ a b c d Sally Godfrey (2008). "What is CMMI?". NASA presentation. Accessed 8 December 2008.
- ^ "CMMI Institute - Home".
- ^ "CMMI V1.3: Summing up". Ben Linders. 10 January 2011.
- ^ "CMMI V1.3: Agile". Ben Linders. 20 November 2010.
- ^ "CMMI V1.3 Released: High Maturity Clarified". Ben Linders. 2 November 2010.
- ^ "CMMI V1.3: Deploying the CMMI". Ben Linders. 16 November 2010.
- ^ CMMI Overview. Software Engineering Institute. Accessed 16 February 2011.
- ^ "CMMI Institute - Core Practice Areas, Categories, and Capability Areas". Archived from the original on 16 December 2018. Retrieved 15 December 2018.
- ^ "CMMI V1.3 Process Areas". Ben Linders. 18 September 2023.
- ^ For the latest published CMMI appraisal results see the SEI Web site Archived 6 February 2007 at the Wayback Machine.
- ^ "Standard CMMI Appraisal Method for Process Improvement (SCAMPI) A, Version 1.2: Method Definition Document". CMU/SEI-2006-HB-002. Software Engineering Institute. 2006. Retrieved 23 September 2006.
- ^ "Process Maturity Profile". Retrieved 16 February 2011.
- ^ "SEI Digital Library". resources.sei.cmu.edu. 9 February 2024.
- ^ "TSP Overview". resources.sei.cmu.edu. 13 September 2010.
- ^ Eileen Forrester and Kieran Doyle. Considering the Case for Security Content in CMMI for Services (October 2010)
- ^ Siemens AG Corporate Technology. Security by Design with CMMI for Development, Version 1.3, (May 2013)
- ^ "CMMI Performance Results of CMMI". Retrieved 23 September 2006.
- ^ Sutherland, Jeff; Ruseng Jakobsen, Carsten; Johnson, Kent. "Scrum and CMMI Level 5: The Magic Potion for Code Warriors" (PDF). Object Technology Jeff Sutherland.
- ^ Anderson, D. J. (20 July 2005). "Stretching agile to fit CMMI level 3 - the story of creating MSF for CMMI® process improvement at Microsoft corporation". Agile Development Conference (ADC'05). pp. 193–201. doi:10.1109/ADC.2005.42. ISBN 0-7695-2487-7. S2CID 5675994 – via IEEE Xplore.
- ^ "CMMI Roadmaps". resources.sei.cmu.edu. 31 October 2008.
- ^ "CMMI V1.3: The CMMI Project roadmap". Ben Linders. 7 December 2010.
- ^ "CMMI V1.3: The CMMI Product and Product Integration roadmaps". Ben Linders. 14 December 2010.
- ^ "CMMI V1.3: The CMMI Process and Measurement roadmaps". Ben Linders. 28 December 2010.
- ^ "Using CMMI to Improve Earned Value Management". resources.sei.cmu.edu. 30 September 2002. Retrieved 30 June 2022.
Capability Maturity Model Integration
Overview
The Capability Maturity Model Integration (CMMI) is a proven set of global best practices that drives business performance through building and benchmarking key capabilities.[10] Originally developed by the Software Engineering Institute (SEI) at Carnegie Mellon University for the U.S. Department of Defense,[2] it is now managed by the CMMI Institute, a subsidiary of ISACA.[10] CMMI's primary goals include improving organizational performance, quality, and predictability across product development, service delivery, and acquisition processes.[10] It enables organizations to align operations with business objectives, measure capabilities, and optimize results in diverse domains such as software engineering, systems engineering, services, and supplier management.[10] The framework applies to any industry, offering customized views like Development, Services, Suppliers, People, Data, Safety, Security, and Virtual to address specific needs.[10] CMMI integrates multiple discipline-specific maturity models into a single, flexible framework, providing a unified approach to process improvement without requiring organizations to adopt separate models for different functions.[10] This consolidation facilitates benchmarking against maturity levels that gauge an organization's process sophistication and effectiveness.[10]
Key Principles and Objectives
The Capability Maturity Model Integration (CMMI) is grounded in core principles that promote effective process management within organizations. Process standardization serves as a foundational principle, emphasizing the establishment of consistent, repeatable processes to minimize inconsistencies and enhance predictability across projects and operations. This approach draws from established process management practices to ensure that organizations can reliably deliver products and services. Complementing this is measurement-based improvement, which relies on quantitative data collection and analysis to identify performance gaps, track progress, and inform decision-making for iterative enhancements. By integrating metrics into routine operations, organizations can objectively evaluate process effectiveness and drive targeted refinements. A third key principle is alignment with business objectives, which ensures that process improvements are not isolated activities but are strategically linked to an organization's overarching goals, such as cost reduction or quality enhancement, fostering sustainable value creation. The objectives of CMMI focus on building organizational capability in specific domains by implementing proven best practices that elevate performance. A primary aim is to enhance capability in areas like development, acquisition, and services through structured guidance that helps organizations mature their processes from ad hoc to optimized states. 
This is achieved by reducing process variability, which leads to more predictable outcomes, lower defect rates, and improved resource utilization across initiatives.[11] Furthermore, CMMI supports continuous improvement cycles by encouraging ongoing assessment, feedback loops, and adaptation, enabling organizations to respond dynamically to evolving challenges and opportunities while maintaining alignment with performance targets.[12] CMMI underscores the importance of tailoring practices to fit unique organizational contexts, avoiding a prescriptive one-size-fits-all model that could hinder adoption. This flexibility allows entities to select and adapt relevant elements based on their size, industry, and maturity starting point, promoting practical implementation without compromising core benefits. Performance indicators, such as key performance measures tied to specific goals, play a pivotal role in this framework by providing quantifiable benchmarks that guide maturity progression. Through goal alignment, these indicators ensure that process enhancements directly contribute to business success, such as achieving on-time delivery or customer satisfaction thresholds. For instance, process areas like requirements management illustrate how these principles manifest in practice by linking standardized processes to measurable business outcomes.
History and Development
Origins in the Software CMM
The Software Engineering Institute (SEI), established in 1984 by the U.S. Department of Defense (DoD) at Carnegie Mellon University, developed the original Capability Maturity Model (CMM) for software to tackle the escalating software crisis affecting mission-critical defense systems, characterized by frequent delays, cost overruns, and reliability issues.[13] This initiative was spurred by the 1987 Report of the Defense Science Board Task Force on Military Software, which highlighted systemic deficiencies in DoD software acquisition and development processes, recommending a structured framework for assessing and improving contractor capabilities.[14] The SEI's efforts aimed to provide DoD with a reliable method to evaluate software suppliers and promote disciplined process maturation across the defense industry. The Software CMM was first introduced in a preliminary framework in September 1987 through a technical report outlining a maturity questionnaire for assessing organizational processes.[15] It evolved into a formal model with Version 1.0 released in August 1991, which detailed recommended practices for software engineering and management organized into five maturity levels: Initial (ad hoc processes), Repeatable (basic project management), Defined (standardized processes), Managed (measured and controlled), and Optimizing (continuous improvement).[16] Version 1.1, published in February 1993, refined these elements based on community feedback from workshops and assessments, emphasizing key process areas such as requirements management, software design, and quality assurance to guide incremental process improvement.[17] These levels provided a staged progression for organizations to enhance predictability and quality in software development. 
Despite its impact, the standalone Software CMM revealed limitations when paired with emerging models for other disciplines, such as the Systems Engineering CMM (1994) and the Software Acquisition CMM (1993), resulting in significant redundancy in practices and challenges in coordinating process improvements across integrated project teams.[18] Organizations faced overlapping requirements and inconsistent guidance, complicating efforts to align software development with broader systems engineering and acquisition activities. By the late 1990s, the growing complexity of DoD projects, which increasingly spanned multiple engineering disciplines and required seamless integration of software, hardware, and services, underscored the need for a unified maturity model to eliminate redundancies and provide a cohesive framework for multidisciplinary process enhancement.[18] This transition rationale laid the groundwork for integrating various CMMs into a single, extensible structure, addressing the limitations of siloed approaches amid evolving project demands.
Integration and Evolution
The Capability Maturity Model Integration (CMMI) was launched in 2000 with version 1.0, developed by the Software Engineering Institute (SEI) at Carnegie Mellon University to consolidate and replace multiple predecessor models, including the Software CMM (SW-CMM), Systems Engineering CMM (SE-CMM), and Integrated Product Development CMM (IPD-CMM).[2] This integration aimed to create a unified framework that addressed overlapping practices across disciplines, reducing redundancy and enabling organizations to improve processes in a more cohesive manner. Key milestones in CMMI's early evolution included the release of version 1.1 in 2002, which refined the model based on initial user experiences to facilitate broader adoption and clarify implementation guidance.[4] A significant organizational shift occurred in 2016 when the CMMI Institute, which had assumed stewardship from SEI in 2013, was acquired by ISACA, marking a transition to new management focused on global expansion and commercialization of the model.[19][20] This change culminated in the 2018 release of version 2.0 under ISACA's oversight, emphasizing practical application across diverse sectors.[21] The evolution of CMMI has been driven by feedback gathered through thousands of appraisals worldwide, which highlighted needs for simplification and alignment with modern organizational challenges. 
Industry demands for greater agility, particularly in response to rapid technological changes, have influenced updates to incorporate flexible practices, such as those supporting DevOps methodologies for faster delivery cycles without sacrificing quality.[22] These refinements reflect ongoing input from users and appraisers, ensuring the model remains relevant to contemporary process improvement needs.[23] Overall, CMMI has progressed from discipline-specific models focused on individual engineering domains to a cross-domain approach that integrates development, services, and acquisition processes.[2] Recent versions have shifted emphasis toward measurable performance outcomes, such as improved predictability and customer satisfaction, rather than adherence to prescriptive, rigid procedures, enabling organizations to adapt the framework to agile and outcome-oriented environments.[22]
Major Versions
The Capability Maturity Model Integration (CMMI) has evolved through several major versions, each refining the framework to address emerging organizational needs while maintaining core principles of process improvement. Version 1.2, released in August 2006, introduced enhancements such as the CMMI for Services model and addressed inconsistencies in prior versions to improve usability and alignment across process areas.[24] Version 1.3, released in October 2010 by the Software Engineering Institute (SEI) at Carnegie Mellon University, represented a significant update to the CMMI product suite, incorporating models for development, services, and acquisition.[25] This version finalized both staged and continuous representations, allowing organizations to pursue maturity either through predefined levels or targeted capability improvements. It featured 22 process areas organized into categories such as process management, project management, engineering, and support, providing comprehensive best practices for product lifecycle management, service delivery, and supplier sourcing. 
Version 2.0, introduced in March 2018 by the CMMI Institute, streamlined the model to enhance usability and alignment with contemporary practices like agile development and DevOps.[21] This iteration reduced the content to 20 practice areas, emphasizing outcome-based practices over prescriptive processes to reduce documentation burdens and support faster implementation.[21] It introduced modular "views" for specialized domains, including data management, safety, and security, which could be layered onto core models for development, services, and acquisition without requiring separate appraisals.[21] The focus shifted toward measurable business performance, agility, and scalability, making the model more adaptable to diverse organizational contexts.[21] Version 3.0, released on April 6, 2023, by ISACA (following its acquisition of the CMMI Institute), further integrated digital transformation elements into the core framework.[26] Building on prior versions, it consolidated views into the main model and added three new capability areas (Data, People, and Virtual) to address modern challenges like cybersecurity, workforce resilience, and remote operations.[27] This update enhanced emphasis on measurable business value, risk management, and organizational adaptability, while refining appraisal methods for greater efficiency and relevance in dynamic environments.[22]
| Version | Release Date | Key Structural Changes | Process/Practice Areas | Domains and Focus Areas |
|---|---|---|---|---|
| 1.3 | October 2010 | Finalized staged and continuous representations; comprehensive guidelines for integrated processes. | 22 process areas (e.g., project planning, requirements management, process and product quality assurance). | Development, Services, Acquisition; emphasis on product lifecycle and service delivery best practices.[25] |
| 2.0 | March 2018 | Outcome-based restructuring; modular views added; reduced prescriptive elements for agility. | 20 practice areas (e.g., planning, monitoring and controlling, causal analysis and resolution). | Development, Services, Acquisition; added views for Data, Safety, Security; focus on business outcomes and DevOps integration.[21] |
| 3.0 | April 2023 | Core integration of views; new capability areas; updated for digital and hybrid work contexts. | 20+ practice areas with expanded capability levels; consolidated into unified model architecture. | All prior domains plus Data, People, Virtual; enhanced resilience, cybersecurity, and measurable value.[27][26][22] |
Model Fundamentals
Representations: Staged vs. Continuous
The Capability Maturity Model Integration (CMMI) provides two distinct representations for implementing process improvement: the staged representation and the continuous representation. These approaches allow organizations to tailor their improvement strategies based on maturity goals and business needs, with the staged approach emphasizing a structured, organization-wide progression and the continuous approach offering flexibility for targeted enhancements.[28] In the staged representation, organizations advance through a series of predefined maturity levels (0 through 5), where each level builds upon the previous one by requiring the implementation of all associated process areas. Maturity level 0 represents incomplete, ad hoc processes, while level 5 achieves optimizing processes with continuous improvement. Achievement of a maturity level demands that all process areas within that level, as well as all lower levels, are fully satisfied, ensuring a comprehensive foundation before progression. This representation is particularly suited for broad organizational transformation, providing a clear roadmap that aligns improvement efforts across the enterprise and facilitates benchmarking against industry standards.[8] Conversely, the continuous representation focuses on capability levels (0 through 3) applied individually to each process area, enabling organizations to select and improve specific areas without adhering to a fixed sequence. Capability level 0 indicates incomplete processes, level 1 initial achievement of specific and generic practices, level 2 managed processes, and level 3 defined processes. This approach supports incremental improvements by allowing prioritization based on business objectives, such as enhancing a single discipline like project management or supplier agreement processes. 
It promotes flexibility, making it ideal for organizations seeking discipline-specific advancements or integrating CMMI with other frameworks.[8] The key differences between the representations lie in their scope and flexibility: the staged approach fosters holistic organizational maturity by enforcing a predefined order of process areas, which can streamline communication and resource allocation but may limit customization, while the continuous approach enables targeted, incremental enhancements that align closely with project-specific or departmental needs, though it requires more sophisticated planning to manage disparate capability levels. Organizations typically choose the staged representation for enterprise-wide initiatives, or when new to process improvement, because of its simplicity and proven path; the continuous representation is preferred by more mature organizations, or those focusing on specific projects, to achieve quicker, focused returns on improvement efforts.[29]

Process Areas and Categories
In the Capability Maturity Model Integration (CMMI), process areas (referred to as practice areas in later versions) serve as the core building blocks. They are defined as clusters of related practices that, when performed collectively, satisfy a set of goals considered essential for achieving significant improvement in a specific aspect of process performance. These areas give organizations a structured framework to identify, implement, and institutionalize effective processes tailored to their operational context.

In CMMI V3.0 (released April 6, 2023), there are 31 core practice areas, with additional domain-specific areas depending on the model (e.g., 19 for Development), organized into four categories: Managing (planning, execution, and oversight), Delivering (technical and service delivery), Enabling (supporting infrastructure and resources), and Improving (process definition and enhancement). For example, the Managing category includes Estimating (developing estimates) and Monitor and Control (tracking performance); Delivering includes Technical Solution (designing components); Enabling includes Configuration Management (controlling changes); and Improving includes Causal Analysis and Resolution (CAR), a practice area focused on identifying causes of selected outcomes (such as defects) and taking action to prevent their recurrence or occurrence, associated with maturity level 5 in the staged representation.

The implementation of CAR can range from superficial (addressing only symptoms or immediate issues without identifying root causes), to reactive (identifying and resolving root causes after a problem has occurred through corrective actions), to proactive (identifying potential causes in advance and implementing preventive actions to avoid problems). These levels illustrate the progression in organizational maturity for causal analysis, supporting continuous process improvement.
In earlier versions, such as V1.3, there were 22 process areas in categories such as Process Management, Project Management, Engineering, and Support.[8][30] The V3.0 model introduces new practice areas such as Data Management, Data Quality, and Workforce Empowerment, organized under domains including Data, People, Virtual, Safety, Security, Development, Service, and Supplier Management, enhancing support for modern practices like agile, DevSecOps, and data-driven decision-making.[30]

Within each practice area, the structure consists of specific goals and specific practices that directly achieve the area's objectives, alongside generic goals and generic practices that ensure the processes are institutionalized across the organization. Specific goals represent the expected outcomes, supported by specific practices that describe activities to meet those goals, while generic goals and practices, common to all areas, address aspects like planning, monitoring, and organizational alignment to promote repeatability and sustainability.

The primary purpose of these practice areas and categories is to offer reusable, modular components that organizations can select and adapt to their unique needs, facilitating targeted process improvement without requiring a one-size-fits-all approach. This modular design supports both the staged and continuous representations of the model, allowing flexibility in how areas are prioritized and implemented.

Maturity Levels and Capability Levels
The Capability Maturity Model Integration (CMMI) utilizes maturity levels and capability levels as hierarchical frameworks to evaluate and enhance process maturity within organizations. Maturity levels apply to the staged representation, offering a sequential path for overall organizational improvement by grouping related practices into predefined stages. In contrast, capability levels support the continuous representation, enabling focused assessment and advancement of individual practice areas independently. These levels are defined in CMMI Version 3.0, which emphasizes progressive institutionalization of processes through specific and generic practices.[8]

Maturity Levels (Staged Representation)
Maturity levels range from 0 to 5, with each level building upon the previous one to foster predictable, measurable, and continuously improving processes. Progression requires achieving all specific practices in the designated process areas at that level, along with the generic practices that ensure institutionalization. The following table summarizes the key characteristics of each maturity level:

| Level | Name | Description |
|---|---|---|
| 0 | Incomplete | Ad hoc and unknown; work may or may not get completed.[8] |
| 1 | Initial | Unpredictable and reactive; work often delayed and over budget.[8] |
| 2 | Managed | Managed at project level; planned, performed, measured, and controlled.[8] |
| 3 | Defined | Proactive; organization-wide standards guide projects, programs, and portfolios.[8] |
| 4 | Quantitatively Managed | Measured and controlled; data-driven with predictable, quantitative objectives.[8] |
| 5 | Optimizing | Stable and flexible; focused on continuous improvement and agility.[8] |
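The staged rule described above, that a maturity level is reached only when every process area at that level and at all lower levels is satisfied, can be sketched as follows. The mapping of levels to process areas here is a small, hypothetical subset chosen purely for illustration, not the full model:

```python
# Sketch of the staged-representation rule: a maturity level is achieved
# only when all process areas at that level AND all lower levels are
# satisfied. The level-to-area mapping below is a hypothetical subset.
STAGED_AREAS = {
    2: {"Requirements Management", "Project Planning", "Configuration Management"},
    3: {"Technical Solution", "Organizational Process Definition"},
    4: {"Quantitative Project Management"},
    5: {"Causal Analysis and Resolution"},
}

def maturity_level(satisfied):
    """Return the highest maturity level whose own and all lower-level
    process areas are fully satisfied; level 1 (Initial) otherwise."""
    level = 1  # every organization is at least at the Initial level
    for lvl in sorted(STAGED_AREAS):
        if STAGED_AREAS[lvl] <= set(satisfied):
            level = lvl
        else:
            break  # a gap at this level blocks all higher levels
    return level
```

Note the early `break`: satisfying a level 4 area while a level 3 area is missing does not raise the maturity level, which is exactly the "comprehensive foundation before progression" property of the staged representation.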
Capability Levels (Continuous Representation)
Capability levels, ranging from 0 to 3, assess the maturity of individual practice areas rather than the entire organization, allowing flexible, targeted improvements. Each level requires fulfillment of the specific practices for the practice area, plus the generic practices for institutionalization at that capability level. The following table outlines the capability levels:

| Level | Name | Description |
|---|---|---|
| 0 | Incomplete | Incomplete approach to meeting the intent of the Practice Area. May or may not be meeting the intent of any practice.[8] |
| 1 | Initial | Initial approach to Practice Area intent. Not a complete set of practices; addresses performance issues.[8] |
| 2 | Managed | Subsumes Level 1 practices. Simple, complete set of practices; monitors project performance objectives.[8] |
| 3 | Defined | Builds on Level 2. Uses organizational standards and assets; focuses on project and organizational objectives.[8] |
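Because the continuous representation assigns each practice area its own capability level, improvement is tracked as a per-area profile rather than a single organization-wide number. A minimal sketch of such a profile comparison, with hypothetical practice-area names and target levels:

```python
# Sketch of a continuous-representation capability profile: each practice
# area carries its own level (0-3), and improvement planning compares the
# current profile against a target profile. Names here are hypothetical.
CAPABILITY_NAMES = {0: "Incomplete", 1: "Initial", 2: "Managed", 3: "Defined"}

def profile_gaps(current, target):
    """Return practice areas whose current capability level falls short
    of the target, mapped to (current, target) pairs."""
    return {area: (lvl, target[area])
            for area, lvl in current.items()
            if lvl < target.get(area, 0)}

current = {"Estimating": 2, "Configuration Management": 1,
           "Causal Analysis and Resolution": 0}
target = {"Estimating": 2, "Configuration Management": 3}
```

With these profiles, only Configuration Management falls short of its target, so a continuous-representation improvement effort could concentrate there without touching the other areas, which is the targeted flexibility the representation is designed for.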