Capability Maturity Model

From Wikipedia, the free encyclopedia

The Capability Maturity Model (CMM) is a process capability maturity model that aids in defining and understanding an organization's processes.

The CMM was first described in the book Managing the Software Process by Watts Humphrey (Addison-Wesley Professional, Massachusetts, 1989), and hence was also known as "Humphrey's CMM". Humphrey based it on the earlier work of Phil Crosby. Active development of the model by the SEI (the US Department of Defense-sponsored Software Engineering Institute at Carnegie Mellon University in Pittsburgh) began in 1986.

The CMM was originally intended as a tool for objectively assessing the ability of government contractors' processes to perform a contracted software project. Though it comes from the field of software development, it has been, and still is, applied as a general model for understanding the process capability maturity of organizations in diverse areas: software engineering, systems engineering, project management, software maintenance, risk management, system acquisition, information technology (IT), and personnel management. It has been used extensively for avionics software and for government projects around the world.

The CMM has been superseded by a variant, the CMMI (Capability Maturity Model Integration). The old CMM was renamed the Software Engineering CMM (SE-CMM), and accreditations of organizations based on the SE-CMM expired on 31 December 2007.

Variants of maturity models derived from the CMM have emerged over the years, including, for example, the Systems Security Engineering CMM (SSE-CMM), the People Capability Maturity Model, and the Software Maintenance Maturity Model (S3M).

Maturity models have been internationally standardized as part of ISO 15504.

Maturity model

A maturity model can be described as a structured collection of elements that describe certain aspects of maturity in an organization. A maturity model may provide, for example:

  • a place to start
  • the benefit of a community’s prior experiences
  • a common language and a shared vision
  • a framework for prioritizing actions
  • a way to define what improvement means for your organization.

A maturity model can be used as a benchmark for comparison and as an aid to understanding, for example for comparative assessment of different organizations where there is something in common that can serve as a basis for comparison. In the case of the CMM, the basis for comparison would be the organizations' software development processes.

Structure of CMM

The CMM involves the following aspects:

  • Maturity Levels: a five-level process maturity continuum, culminating in the discipline needed to engage in continuous process improvement and optimization.
  • Key Process Areas: A Key Process Area (KPA) identifies a cluster of related activities that, when performed collectively, achieve a set of goals considered important.
  • Goals: The goals of a key process area summarize the states that must exist for that key process area to have been implemented in an effective and lasting way. The extent to which the goals have been accomplished is an indicator of how much capability the organization has established at that maturity level. The goals signify the scope, boundaries, and intent of each key process area.
  • Common Features: Common features include practices that implement and institutionalize a key process area. There are five types of common features: Commitment to Perform, Ability to Perform, Activities Performed, Measurement and Analysis, and Verifying Implementation.
  • Key Practices: The key practices describe the elements of infrastructure and practice that contribute most effectively to the implementation and institutionalization of the KPAs.
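As an illustration only (the class names below are hypothetical and not part of the CMM), the structural elements above can be sketched as a small data model: maturity levels contain key process areas, each KPA has goals, and its key practices are grouped under the five common-feature categories:

```python
from dataclasses import dataclass, field

@dataclass
class KeyProcessArea:
    name: str
    goals: list[str]  # states that must exist for the KPA to be implemented
    # Key practices, grouped under the five common-feature categories.
    common_features: dict[str, list[str]] = field(default_factory=lambda: {
        "Commitment to Perform": [],
        "Ability to Perform": [],
        "Activities Performed": [],
        "Measurement and Analysis": [],
        "Verifying Implementation": [],
    })

@dataclass
class MaturityLevel:
    number: int  # 1 (Initial) through 5 (Optimizing)
    name: str
    key_process_areas: list[KeyProcessArea] = field(default_factory=list)

# Example: level 2 with one KPA and one key practice under a common feature.
level2 = MaturityLevel(2, "Repeatable")
reqm = KeyProcessArea("Requirements Management",
                      goals=["Requirements are baselined and controlled"])
reqm.common_features["Activities Performed"].append(
    "Changes to requirements are reviewed and incorporated into the project")
level2.key_process_areas.append(reqm)
print(len(reqm.common_features))  # 5
```

The example goal and practice texts are paraphrases for illustration, not quotations from the model.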

Levels of the CMM

(See also chapter 2, page 11, of the March 2002 edition of the CMMI from the SEI.)

There are five levels defined along the continuum of the CMM, and, according to the SEI: "Predictability, effectiveness, and control of an organization's software processes are believed to improve as the organization moves up these five levels. While not rigorous, the empirical evidence to date supports this belief."

The levels are:

Level 1 - Ad hoc (Chaotic)

It is characteristic of processes at this level that they are (typically) undocumented and in a state of dynamic change, tending to be driven in an ad hoc, uncontrolled and reactive manner by users or events. This provides a chaotic or unstable environment for the processes.

Organisational implications:
(a) Because institutional knowledge tends to be scattered in such environments (there being little in the way of structured knowledge management), not all stakeholders or participants in the processes may know or understand all of the components that make up the processes. As a result, process performance in such organizations is likely to be variable (inconsistent) and to depend heavily on the institutional knowledge, competence, or heroic efforts of relatively few people or small groups.

(b) Despite the chaos, such organizations manage to produce products and services. In doing so, however, there is significant risk that they will exceed any estimated budgets or schedules for their projects: it is difficult to estimate what a process will do when you do not fully understand it in the first place, and you therefore cannot control or manage it effectively.

(c) Due to the lack of structure and formality, organizations at this level may over-commit, may abandon processes during a crisis, and may be unable to repeat past successes. There tends to be limited planning, limited executive commitment or buy-in to projects, and limited acceptance of processes.

Level 2 - Repeatable

It is characteristic of this level that some processes are repeatable, possibly with consistent results. The processes may not repeat for all projects in the organization. The organization may use some basic project management to track cost and schedule.

Process discipline is unlikely to be rigorous, but where it exists it may help to ensure that existing practices are retained during times of stress. When these practices are in place, projects are performed and managed according to their documented plans.

Organisational implications:
(a) Project status and the delivery of services are visible to management at defined points - for example, at major milestones and at the completion of major tasks and activities.

(b) Basic project management processes are established to track cost, schedule, and functionality. The minimum process discipline is in place to repeat earlier successes on projects with similar applications and scope. There is still a significant risk of exceeding cost and time estimates.

Level 3 - Defined

It is characteristic of processes at this level that there are sets of defined and documented standard processes established and subject to some degree of improvement over time. These standard processes are in place (i.e., they are the AS-IS processes) and used to establish consistency of process performance across the organization. Projects establish their defined processes by applying the organization’s set of standard processes, tailored, if necessary, within similarly standardized guidelines.

Organisational implications:
(a) The organization’s management establishes and mandates process objectives for the organization’s set of standard processes, and ensures that these objectives are appropriately addressed.

Level 4 - Managed

It is characteristic of processes at this level that, using process metrics, management can effectively control the AS-IS process (e.g., for software development). In particular, management can identify ways to adjust and adapt the process to particular projects without measurable losses of quality or deviations from specifications. Process capability is established at this level.

Organisational implications:
(a) Quantitative quality goals tend to be set for process output - e.g., software or software maintenance.
(b) Using quantitative/statistical techniques, process performance is measured and monitored, and process performance is thus generally predictable and controllable.

Level 5 - Optimizing

It is characteristic of processes at this level that the focus is on continually improving process performance through both incremental and innovative technological changes/improvements.

Organisational implications:
(a) Quantitative process-improvement objectives for the organization are established, continually revised to reflect changing business objectives, and used as criteria in managing process improvement. Thus, process improvements to address common causes of process variation and measurably improve the organization’s processes are identified, evaluated, and deployed.
(b) The effects of deployed process improvements are measured and evaluated against the quantitative process-improvement objectives.
(c) Both the defined processes and the organization’s set of standard processes are targets for measurable improvement activities.
(d) A critical distinction between maturity level 4 and maturity level 5 is the type of process variation addressed.
At maturity level 4, processes are concerned with addressing statistical special causes of process variation and providing statistical predictability of the results, and though processes may produce predictable results, the results may be insufficient to achieve the established objectives.
At maturity level 5, processes are concerned with addressing statistical common causes of process variation and changing the process (for example, shifting the mean of the process performance) to improve process performance. This would be done at the same time as maintaining the likelihood of achieving the established quantitative process-improvement objectives.
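The special-cause/common-cause distinction above can be sketched with Shewhart-style 3-sigma control limits, a common statistical process control technique. This is an illustrative assumption: the CMM does not prescribe any particular technique, and the function names and numbers below are made up.

```python
import statistics

def control_limits(baseline, k=3.0):
    """Compute mean +/- k*sigma limits from an in-control baseline period."""
    mean = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return mean - k * sigma, mean + k * sigma

def special_causes(points, baseline):
    """Points outside the limits signal special-cause variation (the level-4
    concern); points inside reflect common-cause variation, which level-5
    process changes (e.g., shifting the process mean) aim to reduce."""
    lo, hi = control_limits(baseline)
    return [x for x in points if x < lo or x > hi]

# Defect densities from past in-control projects (made-up numbers):
baseline = [4.1, 3.9, 4.3, 4.0, 4.2, 4.1, 3.8, 4.0]
print(special_causes([4.2, 9.8, 3.9], baseline))  # [9.8]
```

Note that the limits are computed from a stable baseline period, then applied to new measurements; a level-5 improvement would aim to move the whole baseline, not just catch outliers.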

Extensions

Some versions of the CMMI from the SEI indicate a "level 0", characterized as "Incomplete". Some authors leave this level out as redundant or unimportant, but Pressman and others take note of it. See page 18 of the August 2002 edition of CMMI from the SEI.[1]

Anthony Finkelstein[2] argued that negative levels are needed to represent environments that are not merely indifferent but actively counterproductive; this was refined by Tom Schorsch[3] as the Capability Immaturity Model.

Key process areas

For more details on this topic, see Process area (CMMI).

The CMMI contains several key process areas indicating the aspects of product development that are to be covered by company processes.

Key Process Areas of the Capability Maturity Model Integration (CMMI)

Abbreviation | Name                                        | Area               | Maturity Level
REQM         | Requirements Management                     | Engineering        | 2
PMC          | Project Monitoring and Control              | Project Management | 2
PP           | Project Planning                            | Project Management | 2
SAM          | Supplier Agreement Management               | Project Management | 2
CM           | Configuration Management                    | Support            | 2
MA           | Measurement and Analysis                    | Support            | 2
PPQA         | Process and Product Quality Assurance       | Support            | 2
PI           | Product Integration                         | Engineering        | 3
RD           | Requirements Development                    | Engineering        | 3
TS           | Technical Solution                          | Engineering        | 3
VAL          | Validation                                  | Engineering        | 3
VER          | Verification                                | Engineering        | 3
OPD          | Organizational Process Definition           | Process Management | 3
OPF          | Organizational Process Focus                | Process Management | 3
OT           | Organizational Training                     | Process Management | 3
IPM          | Integrated Project Management               | Project Management | 3
ISM          | Integrated Supplier Management              | Project Management | 3
IT           | Integrated Teaming                          | Project Management | 3
RSKM         | Risk Management                             | Project Management | 3
DAR          | Decision Analysis and Resolution            | Support            | 3
OEI          | Organizational Environment for Integration  | Support            | 3
OPP          | Organizational Process Performance          | Process Management | 4
QPM          | Quantitative Project Management             | Project Management | 4
OID          | Organizational Innovation and Deployment    | Process Management | 5
CAR          | Causal Analysis and Resolution              | Support            | 5
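For simple queries, the table above can be held as a plain mapping. The data is transcribed from the table; the variable and function names are just illustrative:

```python
# Abbreviation -> (name, area, maturity level), transcribed from the table.
KPAS = {
    "REQM": ("Requirements Management", "Engineering", 2),
    "PMC":  ("Project Monitoring and Control", "Project Management", 2),
    "PP":   ("Project Planning", "Project Management", 2),
    "SAM":  ("Supplier Agreement Management", "Project Management", 2),
    "CM":   ("Configuration Management", "Support", 2),
    "MA":   ("Measurement and Analysis", "Support", 2),
    "PPQA": ("Process and Product Quality Assurance", "Support", 2),
    "PI":   ("Product Integration", "Engineering", 3),
    "RD":   ("Requirements Development", "Engineering", 3),
    "TS":   ("Technical Solution", "Engineering", 3),
    "VAL":  ("Validation", "Engineering", 3),
    "VER":  ("Verification", "Engineering", 3),
    "OPD":  ("Organizational Process Definition", "Process Management", 3),
    "OPF":  ("Organizational Process Focus", "Process Management", 3),
    "OT":   ("Organizational Training", "Process Management", 3),
    "IPM":  ("Integrated Project Management", "Project Management", 3),
    "ISM":  ("Integrated Supplier Management", "Project Management", 3),
    "IT":   ("Integrated Teaming", "Project Management", 3),
    "RSKM": ("Risk Management", "Project Management", 3),
    "DAR":  ("Decision Analysis and Resolution", "Support", 3),
    "OEI":  ("Organizational Environment for Integration", "Support", 3),
    "OPP":  ("Organizational Process Performance", "Process Management", 4),
    "QPM":  ("Quantitative Project Management", "Project Management", 4),
    "OID":  ("Organizational Innovation and Deployment", "Process Management", 5),
    "CAR":  ("Causal Analysis and Resolution", "Support", 5),
}

def kpas_at_level(level):
    """Abbreviations of the process areas introduced at a maturity level."""
    return sorted(a for a, (_, _, lvl) in KPAS.items() if lvl == level)

print(kpas_at_level(4))  # ['OPP', 'QPM']
print({lvl: len(kpas_at_level(lvl)) for lvl in (2, 3, 4, 5)})  # {2: 7, 3: 14, 4: 2, 5: 2}
```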

Software process framework for SEI's Capability Maturity Model

The documented software process framework is intended to guide those wishing to assess an organization's or project's consistency with the CMM. For each maturity level there are five checklist types:

Type            Description
Policy          Describes the policy contents and KPA goals recommended by the CMM.
Standard        Describes the recommended content of select work products described in the CMM.
Process         Describes the process information content recommended by the CMM. The process checklists are further refined into checklists for: roles, entry criteria, inputs, activities, outputs, exit criteria, reviews and audits, work products managed and controlled, measurements, documented procedures, training, and tools.
Procedure       Describes the recommended content of documented procedures described in the CMM.
Level overview  Provides an overview of an entire maturity level. The level overview checklists are further refined into checklists for: KPA purposes (Key Process Areas), KPA goals, policies, standards, process descriptions, procedures, training, tools, reviews and audits, work products managed and controlled, and measurements.

History

The Capability Maturity Model was initially funded through military research: the United States Air Force funded a study at the Carnegie Mellon Software Engineering Institute to create an abstract model the military could use to objectively evaluate software subcontractors. The result was the Capability Maturity Model, published as Managing the Software Process in 1989. The CMM has since been superseded by the Capability Maturity Model Integration (CMMI).

Context

In the 1970s the use of computers became more widespread, flexible and less expensive. Organizations began to adopt computerized information systems, and the demand for software development grew significantly. The processes for software development were in their infancy, with few standard or "best practice" approaches defined.

As a result, the growth was accompanied by growing pains: project failure was common, the field of computer science was still in its infancy, and ambitions for project scale and complexity exceeded the market's capability to deliver. Individuals such as Edward Yourdon, Larry Constantine, Gerald Weinberg, Tom DeMarco, and David Parnas began to publish articles and books with research results in an attempt to professionalize the software development process.

Watts Humphrey's Capability Maturity Model (CMM) was described in the book Managing the Software Process (1989). It was based on work done a decade earlier by Phil Crosby, who published the Quality Management Maturity Grid in his 1979 book Quality is Free.[4] Active development of the model by the SEI began in 1986.

The CMM was originally intended as a tool to evaluate the ability of government contractors to perform a contracted software project. Though it comes from the area of software development, it can be, has been, and continues to be widely applied as a general model of the maturity of processes (e.g., IT Service Management processes) in IS/IT (and other) organizations.

Note that the first application of a staged maturity model to IT was not by CMM/SEI, but rather Richard L. Nolan in 1973.[5]

The model identifies five levels of process maturity for an organization:

  1. Initial (chaotic, ad hoc, heroic): the starting point for use of a new process.
  2. Repeatable (project management, process discipline): the process is used repeatedly.
  3. Defined (institutionalized): the process is defined and confirmed as a standard business process.
  4. Managed (quantified): process management and measurement take place.
  5. Optimizing (process improvement): process management includes deliberate process optimization and improvement.

Within each of these maturity levels are KPAs (Key Process Areas) which characterise that level, and for each KPA there are five definitions identified:

  1. Goals
  2. Commitment
  3. Ability
  4. Measurement
  5. Verification

The KPAs are not necessarily unique to the CMM; they represent the stages that organizations must go through on the way to becoming mature.

Process assessment is best led by an appropriately skilled and competent lead assessor. The organisation's process maturity level is assessed, and a specific plan is then developed to reach the next level. Skipping levels is not allowed.
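The "no skipping levels" rule of the staged model can be sketched as follows. This is a deliberately simplified illustration, not a real appraisal method (actual appraisals are far more involved), and the KPA sets below are hypothetical:

```python
def achieved_level(satisfied, required_by_level):
    """Return the highest maturity level whose required KPAs, and those of
    every lower level, are all satisfied."""
    level = 1  # Level 1 has no KPA requirements: every organization starts here.
    for lvl in sorted(required_by_level):
        if required_by_level[lvl] <= satisfied:  # all KPAs for this level met
            level = lvl
        else:
            break  # a gap here blocks all higher ratings: no skipping levels
    return level

# Hypothetical required-KPA sets per level (heavily abridged):
required = {2: {"REQM", "PP", "PMC"}, 3: {"OPD", "OPF"}}

# Satisfying a level-3 KPA does not help while a level-2 KPA is unmet:
print(achieved_level({"REQM", "PP", "OPF"}, required))         # 1
print(achieved_level({"REQM", "PP", "PMC", "OPD", "OPF"}, required))  # 3
```

The `break` is what encodes the rule: once any level's KPAs are incomplete, no higher level can be rated, which matches the plan-to-the-next-level approach described above.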

N.B.: The CMM was originally intended as a tool to evaluate the ability of government contractors to perform a contracted software project, and it may be well suited to that purpose. When it became a general model for software process improvement, it attracted many critics.

"Shrinkwrap" companies, also called "COTS" (commercial off-the-shelf) firms or software package firms, include Claris, Apple, Symantec, Microsoft, and Lotus, among others. Many such companies rarely, if ever, managed their requirements documents as formally as the CMM requires for level 2, so most of them would probably fall into level 1 of the model.

Origins

In the 1980s, several military projects involving software subcontractors ran over budget and were completed much later than planned, if they were completed at all. In an effort to determine why this was occurring, the United States Air Force funded a study at the SEI. The result of this study was a model for the military to use as an objective evaluation of software subcontractors. In 1989, the Capability Maturity Model was published as Managing the Software Process. The basis for the model is the Quality Management Maturity Grid introduced by Philip Crosby in his 1979 book Quality is Free.

Timeline

  • 1987: SEI-87-TR-24 (the SW-CMM questionnaire) released.
  • 1989: Managing the Software Process published.
  • 1990: SW-CMM v0.2 released (first external issue; see Paulk handout).
  • 1991: SW-CMM v1.0 released.
  • 1993: SW-CMM v1.1 released.
  • 1997: SW-CMM revisions halted in favor of the CMMI.

Current state

Although the CMM model proved useful to many organizations, the use of multiple models has been problematic. Applying multiple models that are not integrated within and across an organization could be costly in terms of training, appraisals, and improvement activities. The CMMI (CMM Integration) project was formed to sort out the problem of using multiple CMMs. (Refer to CMMI for further information.)

Future direction

With the release of the CMMI Version 1.2 Product Suite, the possibility of multiple CMMI models was created. (Refer to CMMI for further information.)

Controversial aspects

The software industry is diverse and volatile. All methodologies for creating software have supporters and critics, and the CMM may be no exception.

Pros

  • Prior to the introduction of Humphrey's CMM, there was no theoretical basis applicable to process maturity for IT-related processes. (In the absence of theory as a basis for action, action is, by definition, irrational.)
  • CMM has been shown to be well-suited for organizations wishing to define their key processes.


Cons

  • The objective of scientifically managing the software process using defined metrics is difficult to achieve before Level 4. Prior to that level, Activity-Based Costing (ABC) is difficult to apply to validate process cost savings, except by empirical means.
  • The CMM does not help to define the structure of an effective software development organization. The CMM contains behaviors or best practices that successful projects have demonstrated. Thus, being CMM compliant would not necessarily guarantee that a project would be successful. However, being compliant could increase a project's chances of being successful.
  • Critical analyses of the CMM have been published in at least two papers. Bach questions the validity of CMM benchmarks for "good" software development processes. Bollinger and McGowan discuss flaws in the CMM approach where it relies on "assembly-line" process models; they suggest that manufacturing is fundamentally different from software development, the former being primarily concerned with replication and the latter with design.

CMM Levels 2 and 3 can have beneficial aspects

  • Creation of a Software Specification, stating what is going to be developed, combined with formal sign-off, an executive sponsor, and an approval mechanism. This is NOT a living document; additions are placed in a deferred or out-of-scope section for later incorporation into the next cycle of software development.
  • A Technical Specification, stating precisely how the thing described in the Software Specification is to be developed. This is a living document.
  • Peer Review of Code (Code Review), with metrics, allowing developers to walk through an implementation and suggest improvements or changes. (Note: this is problematic because the code has already been developed, and a bad design potentially cannot be fixed by "tweaking".) The Code Review gives completed code a formal approval mechanism.
  • Version Control - a very large number of organizations have no formal revision control mechanism or release mechanism in place.
  • The idea that there is a "right way" to build software, that it is a scientific process involving engineering design and that groups of developers are not there to simply work on ad hoc problems.


References

Books
  • Chrissis, Mary Beth; Konrad, Mike; Shrum, Sandy (2003). CMMI: Guidelines for Process Integration and Product Improvement. Addison-Wesley Professional. ISBN 0-321-15496-7.
  • Kulpa, Margaret K.; Kent A. Johnson (2003). Interpreting the CMMI : A Process Improvement Approach. Auerbach Publications. ISBN 0-8493-1654-5. 
Papers
  • Bollinger, Terry; McGowan, Clement (1991). "A Critical Look at Software Capability Evaluations". IEEE Software: 25–41.
Footnotes

  1. ^ CMMI, August 2002 edition (PDF). CMU/SEI-2002-TR-011. SEI (2002). Retrieved 2006-10-28.
  2. ^ Finkelstein, Anthony (2000-04-28). A Software Process Immaturity Model (PDF). University College London. Retrieved 2006-10-28.
  3. ^ Schorsch, Tom (1996). The Capability Im-Maturity Model. Air Force Institute of Technology. Retrieved 2006-10-28.
  4. ^ Crosby, Philip (1979). Quality is Free. McGraw-Hill. ISBN 0451622472.
  5. ^ Nolan, Richard (July 1973). "Managing the computer resource: a stage hypothesis". Communications of the ACM 16 (7): 399–405. Association for Computing Machinery.
