Capability Maturity Model

From Wikipedia, the free encyclopedia

Capability Maturity Model (CMM) broadly refers to a process improvement approach that is based on a process model. CMM also refers specifically to the first such model, developed by the Software Engineering Institute (SEI) in the mid-1980s, as well as the family of process models that followed. A process model is a structured collection of practices that describe the characteristics of effective processes; the practices included are those proven by experience to be effective. [1]

The Capability Maturity Model can be used to assess an organization against a scale of five process maturity levels. Each level ranks the organization according to its standardization of processes in the subject area being assessed. The subject areas can be as diverse as software engineering, systems engineering, project management, risk management, system acquisition, information technology (IT) services and personnel management.

CMM was developed by the SEI at Carnegie Mellon University in Pittsburgh. It has been used extensively for avionics software and government projects in North America, Europe, Asia, Australia, South America, and Africa. [2] Currently, some government departments require software development contractors to achieve and operate at CMM level 3.

Maturity model

The Capability Maturity Model (CMM) is a way to develop and refine an organization's processes. The first CMM was for the purpose of developing and refining software development processes. A maturity model is a structured collection of elements that describe characteristics of effective processes. A maturity model provides:

  • a place to start
  • the benefit of a community’s prior experiences
  • a common language and a shared vision
  • a framework for prioritizing actions
  • a way to define what improvement means for your organization

A maturity model can be used as a benchmark for comparing different organizations on an equivalent basis. It describes the maturity of an organization in terms of the projects it undertakes and the clients it serves.

History

The Capability Maturity Model was initially funded by military research. The United States Air Force funded a study at the Carnegie-Mellon Software Engineering Institute to create a model for the military to use as an objective evaluation of software subcontractors. The result was the Capability Maturity Model, published as Managing the Software Process in 1989. The CMM is no longer supported by the SEI and has been superseded by the more comprehensive Capability Maturity Model Integration (CMMI), of which version 1.2 has now been released.

Context

In the 1970s, technological improvements made computers more widespread, flexible, and inexpensive. Organizations adopted more and more computerized information systems, and the field of software development grew significantly. This led to an increased demand for developers and managers, a demand met largely with less experienced professionals.

Unfortunately, this rapid growth brought growing pains: project failure became commonplace, not only because the field of computer science was still in its infancy, but also because projects grew more ambitious in scale and complexity. In response, individuals such as Edward Yourdon, Larry Constantine, Gerald Weinberg, Tom DeMarco, and David Parnas published articles and books with research results in an attempt to professionalize the software development process.

Watts Humphrey's Capability Maturity Model (CMM) was described in the book Managing the Software Process (1989). The CMM as conceived by Humphrey was based on work done a decade earlier by Philip Crosby, who published the Quality Management Maturity Grid in his 1979 book Quality is Free.[3][4] Active development of the model by the SEI began in 1986.

The CMM was originally intended as a tool to evaluate the ability of government contractors to perform a contracted software project. Though it comes from the area of software development, it can be, has been, and continues to be widely applied as a general model of the maturity of processes (e.g., IT Service Management processes) in IS/IT (and other) organizations.

The model identifies five levels of process maturity for an organization:

  1. Initial (chaotic, ad hoc, heroic): the starting point for use of a new process.
  2. Repeatable (project management, process discipline): the process can be repeated, so earlier successes may be recreated on similar projects.
  3. Defined (institutionalized): the process is defined and confirmed as a standard business process.
  4. Managed (quantified): process management and measurement take place.
  5. Optimizing (process improvement): process management includes deliberate process optimization and improvement.
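
The five levels above form an ordered scale, which can be sketched as a simple enumeration. This is a hypothetical illustration for clarity only; the CMM itself defines no code, and the names here merely mirror the list above.

```python
from enum import IntEnum

class MaturityLevel(IntEnum):
    """The five CMM maturity levels as an ordered scale."""
    INITIAL = 1      # chaotic, ad hoc, heroic
    REPEATABLE = 2   # project management, process discipline
    DEFINED = 3      # institutionalized standard processes
    MANAGED = 4      # quantified measurement and control
    OPTIMIZING = 5   # deliberate process improvement

# Because the scale is ordered, levels can be compared directly:
assert MaturityLevel.DEFINED > MaturityLevel.REPEATABLE
```

Using an ordered type captures the model's key property: maturity is cumulative, and each level presupposes the ones below it.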

Within each of these maturity levels are Key Process Areas (KPAs) that characterize the level, and for each KPA five attributes are identified:

  1. Goals
  2. Commitment
  3. Ability
  4. Measurement
  5. Verification

The KPAs are not necessarily unique to CMM; they represent the stages that organizations must go through on the way to becoming mature.

The assessment is supposed to be led by an authorized lead assessor. One way in which companies are supposed to use the model is first to assess their maturity level and then form a specific plan to reach the next level. Skipping levels is not allowed.
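
The staged assessment logic can be sketched as follows: an organization sits at the highest level whose key process areas, and those of every level below it, are all satisfied. This is a deliberately simplified, hypothetical illustration; real appraisals are far more involved.

```python
def assessed_level(satisfied: dict[int, bool]) -> int:
    """Return the assessed maturity level: the highest n such that the
    KPAs of every level 2..n are all satisfied. Level 1 requires nothing.
    `satisfied` maps a level number to whether all its KPAs are met."""
    level = 1
    for n in range(2, 6):
        if satisfied.get(n, False):
            level = n
        else:
            break  # skipping levels is not allowed
    return level

# An organization meeting level 2 and 3 KPAs but not level 4 is at level 3,
# even if it happens to satisfy some level 5 practices:
print(assessed_level({2: True, 3: True, 4: False, 5: True}))  # 3
```

The `break` encodes the no-skipping rule: satisfying a higher level's KPAs counts for nothing while a lower level remains unmet.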

Note: the CMM was originally intended as a tool to evaluate the ability of government contractors to perform a contracted software project, and it may well be suited to that purpose. When it became a general model for software process improvement, however, it attracted many critics.

"Shrinkwrap" companies, also called commercial-off-the-shelf (COTS) or software package firms, include Claris, Apple, Symantec, Microsoft, and Lotus, among others. Many such companies rarely, if ever, managed their requirements documents as formally as the CMM requires for level 2, so most would probably fall into level 1 of the model.

Origins


Timeline

  • 1987: SEI-87-TR-24 (SW-CMM questionnaire) released.
  • 1989: Managing the Software Process published.
  • 1990: SW-CMM v0.2 released (first external release).
  • 1991: SW-CMM v1.0 released.
  • 1993: SW-CMM v1.1 released.
  • 1997: SW-CMM revisions halted in favor of the CMMI.
  • 2000: CMMI v1.02 released.
  • 2002: CMMI v1.1 released.
  • 2006: CMMI v1.2 released.

Current state

Although these models have proved useful to many organizations, the use of multiple models has been problematic. Further, applying multiple models that are not integrated within and across an organization is costly in terms of training, appraisals, and improvement activities. The CMM Integration project was formed to sort out the problem of using multiple CMMs. The CMMI Product Team's mission was to combine three source models:

  1. The Capability Maturity Model for Software (SW-CMM) v2.0 draft C
  2. The Systems Engineering Capability Model (SECM)
  3. The Integrated Product Development Capability Maturity Model (IPD-CMM) v0.98

Material on supplier sourcing was also incorporated.

CMMI is the designated successor of the three source models. The SEI has released a policy to sunset the Software CMM and previous versions of the CMMI. [5] The same can be said for the SECM and the IPD-CMM; these models were superseded by CMMI.

Future direction

With the release of the CMMI Version 1.2 Product Suite, the existing CMMI has been renamed the CMMI for Development (CMMI-DEV), V1.2.[1] A version of the CMMI for Services is being developed by a Northrop Grumman-led team under the auspices of the SEI, with participation from Boeing, Lockheed Martin, Raytheon, SAIC, SRA, and Systems and Software Consortium (SSCI).[2] A CMMI for Acquisition (CMMI-ACQ) is also under development at the SEI.[3]

Suggestions for improving CMMI are welcomed by the SEI. For information on how to provide feedback, see the CMMI Web site.

In some cases, CMM can be combined with other methodologies. It is commonly used in conjunction with the ISO 9001 standard. JPMorgan Chase & Co. tried combining CMM with the computer programming methodologies of Extreme Programming (XP) and Six Sigma. They found that the three systems reinforced each other well, leading to better development, and did not contradict one another.

Levels of the CMM

(See chapter 2, page 11, of the March 2002 edition of the CMMI from the SEI.)

There are five levels of the CMM. According to the SEI,

"Predictability, effectiveness, and control of an organization's software processes are believed to improve as the organization moves up these five levels. While not rigorous, the empirical evidence to date supports this belief."

Level 1 - Initial

At maturity level 1, processes are usually ad hoc and the organization usually does not provide a stable environment. Success in these organizations depends on the competence and heroics of the people in the organization and not on the use of proven processes. In spite of this ad hoc, chaotic environment, maturity level 1 organizations often produce products and services that work; however, they frequently exceed the budget and schedule of their projects.

Maturity level 1 organizations are characterized by a tendency to overcommit, abandon processes in a time of crisis, and an inability to repeat their past successes.

Level 1 software project success depends on having high-quality people.

Level 2 - Repeatable

At maturity level 2, software development successes are repeatable, though the processes may not be repeated for all projects in the organization. The organization may use some basic project management to track cost and schedule.

Process discipline helps ensure that existing practices are retained during times of stress. When these practices are in place, projects are performed and managed according to their documented plans.

Project status and the delivery of services are visible to management at defined points (for example, at major milestones and at the completion of major tasks).

Basic project management processes are established to track cost, schedule, and functionality. The minimum process discipline is in place to repeat earlier successes on projects with similar applications and scope. There is still a significant risk of exceeding cost and time estimates.

Level 3 - Defined

The organization’s set of standard processes, which is the basis for level 3, is established and improved over time. These standard processes are used to establish consistency across the organization. Projects establish their defined processes by tailoring the organization’s set of standard processes according to tailoring guidelines.

The organization’s management establishes process objectives based on the organization’s set of standard processes and ensures that these objectives are appropriately addressed.

A critical distinction between level 2 and level 3 is the scope of standards, process descriptions, and procedures. At level 2, the standards, process descriptions, and procedures may be quite different in each specific instance of the process (for example, on a particular project). At level 3, the standards, process descriptions, and procedures for a project are tailored from the organization’s set of standard processes to suit a particular project or organizational unit.

Level 4 - Managed

Using precise measurements, management can effectively control the software development effort. In particular, management can identify ways to adjust and adapt the process to particular projects without measurable losses of quality or deviations from specifications. Organizations at this level set quantitative quality goals for both software process and software maintenance.

Subprocesses are selected that significantly contribute to overall process performance. These selected subprocesses are controlled using statistical and other quantitative techniques.

A critical distinction between maturity level 3 and maturity level 4 is the predictability of process performance. At maturity level 4, the performance of processes is controlled using statistical and other quantitative techniques, and is quantitatively predictable. At maturity level 3, processes are only qualitatively predictable.
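
The statistical control described above can be illustrated with a minimal control-chart check: a subprocess measurement is flagged when it falls outside a few standard deviations of its historical baseline. This is a hypothetical sketch with made-up numbers; real level-4 practice uses established statistical process control methods.

```python
from statistics import mean, stdev

def out_of_control(baseline: list[float], observation: float, k: float = 3.0) -> bool:
    """Flag an observation lying outside mean +/- k standard deviations
    of the historical baseline (a simple control-chart rule)."""
    m, s = mean(baseline), stdev(baseline)
    return abs(observation - m) > k * s

# Hypothetical defect densities (defects/KLOC) from past projects:
history = [4.1, 3.8, 4.4, 4.0, 3.9, 4.2]
print(out_of_control(history, 4.3))  # within the control limits -> False
print(out_of_control(history, 9.5))  # well outside the limits -> True
```

A quantitatively managed process is one where such limits exist and an out-of-control signal triggers investigation rather than ad hoc reaction.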

Level 5 - Optimizing

Maturity level 5 focuses on continually improving process performance through both incremental and innovative technological improvements. Quantitative process-improvement objectives for the organization are established, continually revised to reflect changing business objectives, and used as criteria in managing process improvement. The effects of deployed process improvements are measured and evaluated against the quantitative process-improvement objectives. Both the defined processes and the organization’s set of standard processes are targets of measurable improvement activities.

Process improvements to address common causes of process variation and measurably improve the organization’s processes are identified, evaluated, and deployed.

Optimizing processes that are nimble, adaptable and innovative depends on the participation of an empowered workforce aligned with the business values and objectives of the organization. The organization’s ability to rapidly respond to changes and opportunities is enhanced by finding ways to accelerate and share learning.

A critical distinction between maturity level 4 and maturity level 5 is the type of process variation addressed. At maturity level 4, processes are concerned with addressing special causes of process variation and providing statistical predictability of the results. Though processes may produce predictable results, the results may be insufficient to achieve the established objectives. At maturity level 5, processes are concerned with addressing common causes of process variation and changing the process (that is, shifting the mean of the process performance) to improve process performance (while maintaining statistical predictability) to achieve the established quantitative process-improvement objectives.
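
The level-5 idea of shifting the process mean can be expressed numerically: an improvement succeeds when the new baseline mean moves toward the quantitative objective while variation stays controlled. The figures below are hypothetical, purely for illustration.

```python
from statistics import mean

before = [4.1, 3.8, 4.4, 4.0, 3.9, 4.2]   # defect density before improvement
after  = [2.9, 3.1, 2.8, 3.0, 3.2, 2.9]   # after deploying a process change
objective = 3.5                            # quantitative improvement objective

# Shifting the mean: the difference between the old and new baselines.
improvement = mean(before) - mean(after)
print(f"mean shifted by {improvement:.2f} defects/KLOC")
print(mean(after) <= objective)  # objective achieved -> True
```

Measuring deployed improvements against such objectives, rather than against intuition, is what the passage above calls managing process improvement quantitatively.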

Extensions

Recent versions of CMMI from SEI indicate a "level 0", characterized as "Incomplete". Many observers leave this level out as redundant or unimportant, but Pressman and others make note of it. See page 18 of the August 2002 edition of CMMI from SEI. [6]

Anthony Finkelstein [7] extrapolated that negative levels are necessary to represent environments that are not merely indifferent but actively counterproductive; this idea was refined by Tom Schorsch [8] as the Capability Immaturity Model.

Process areas

For more details on this topic, see Process area (CMMI).

The CMMI contains several key process areas indicating the aspects of product development that are to be covered by company processes.

Key Process Areas of the Capability Maturity Model Integration (CMMI)

Abbreviation  Name                                        Area                Maturity Level
CAR           Causal Analysis and Resolution              Support             5
CM            Configuration Management                    Support             2
DAR           Decision Analysis and Resolution            Support             3
IPM           Integrated Project Management               Project Management  3
ISM           Integrated Supplier Management              Project Management  3
IT            Integrated Teaming                          Project Management  3
MA            Measurement and Analysis                    Support             2
OEI           Organizational Environment for Integration  Support             3
OID           Organizational Innovation and Deployment    Process Management  5
OPD           Organizational Process Definition           Process Management  3
OPF           Organizational Process Focus                Process Management  3
OPP           Organizational Process Performance          Process Management  4
OT            Organizational Training                     Process Management  3
PI            Product Integration                         Engineering         3
PMC           Project Monitoring and Control              Project Management  2
PP            Project Planning                            Project Management  2
PPQA          Process and Product Quality Assurance       Support             2
QPM           Quantitative Project Management             Project Management  4
RD            Requirements Development                    Engineering         3
REQM          Requirements Management                     Engineering         2
RSKM          Risk Management                             Project Management  3
SAM           Supplier Agreement Management               Project Management  2
TS            Technical Solution                          Engineering         3
VAL           Validation                                  Engineering         3
VER           Verification                                Engineering         3
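
The table above amounts to a lookup from maturity level to its key process areas; grouping it programmatically makes the staged structure explicit. The rows are taken from the table (a subset, for brevity); the code itself is merely illustrative.

```python
from collections import defaultdict

# (abbreviation, name, area, maturity level) rows from the CMMI KPA table;
# the level-3 KPAs are omitted here for brevity.
KPAS = [
    ("CM", "Configuration Management", "Support", 2),
    ("MA", "Measurement and Analysis", "Support", 2),
    ("PMC", "Project Monitoring and Control", "Project Management", 2),
    ("PP", "Project Planning", "Project Management", 2),
    ("PPQA", "Process and Product Quality Assurance", "Support", 2),
    ("REQM", "Requirements Management", "Engineering", 2),
    ("SAM", "Supplier Agreement Management", "Project Management", 2),
    ("OPP", "Organizational Process Performance", "Process Management", 4),
    ("QPM", "Quantitative Project Management", "Project Management", 4),
    ("CAR", "Causal Analysis and Resolution", "Support", 5),
    ("OID", "Organizational Innovation and Deployment", "Process Management", 5),
]

def by_level(kpas):
    """Group KPA abbreviations by the maturity level they belong to."""
    groups = defaultdict(list)
    for abbr, _name, _area, level in kpas:
        groups[level].append(abbr)
    return dict(groups)

print(by_level(KPAS)[2])  # the seven level-2 KPAs
```

A staged appraisal works against exactly this kind of grouping: to claim a level, an organization must satisfy every KPA listed for that level and all levels below it.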

Controversial aspects

The software industry is diverse and volatile. All methodologies for creating software have supporters and critics, and the CMM is no exception.

Praise

  • The CMM was developed to give Defense organizations a yardstick to assess and describe the capability of software contractors to provide software on time, within budget, and to acceptable standards. It has arguably been successful in this role, even reputedly causing some software sales people to clamour for their organizations' software engineers/developers to "implement CMM."
  • The CMM is intended to enable an assessment of an organization's maturity for software development. It is an important tool for outsourcing and exporting software development work. Economic development agencies in India, Ireland, Egypt, Syria, and elsewhere have praised the CMM for enabling them to be able to compete for US outsourcing contracts on an even footing.
  • The CMM provides a good framework for organizational improvement. It allows companies to prioritize their process improvement initiatives.

Criticism

  • CMM has not achieved universal adoption. It is hard to tell exactly how widespread it is, as the SEI only publishes the names and achieved compliance levels of companies that have requested to be listed[4]. The most current Maturity Profile for CMMI is available online[5].
  • CMM is well suited to bureaucratic organizations such as government agencies, large corporations, and regulated monopolies. If the organizations deploying CMM are large enough, they may employ a team of CMM auditors reporting their results directly to the executive level (a practice encouraged by the SEI). The use of auditors and executive reports may lead the entire IT organization to focus on perfectly completed forms rather than on application development, client needs, or the marketplace. If a project is driven by a due date, the CMM's intensive reliance on process and forms may become a hindrance to meeting that date in cases where time to market with some kind of product matters more than high quality and functionality.
  • Suggestions of scientifically managing the software process with metrics only appear at the fourth level and beyond. There is little validation of the process's cost savings to business other than a vague reference to empirical evidence. One would expect a large body of evidence showing that the business overhead demanded by the CMM reduces IT headcount, business cost, and time to market without sacrificing client needs.
  • No external body actually certifies a software development center as being CMM compliant. It is supposed to be an honest self-assessment ([6] and [7]). Some organizations misrepresent the scope of their CMM compliance to suggest that it applies to their entire organization rather than a specific project or business unit.
  • The CMM does not describe how to create an effective software development organization. The CMM contains behaviors, or best practices, that successful projects have demonstrated. Being CMM compliant is not a guarantee that a project will be successful; however, compliance can increase a project's chances of success.
  • The CMM can seem overly bureaucratic, promoting process over substance, for example by emphasizing predictability over the service provided to end users. More commercially successful methodologies (for example, the Rational Unified Process) have focused not on the capability of the organization to produce software satisfying some other organization or a collectively produced specification, but on the capability of organizations to satisfy specific end-user "use cases" as per the Object Management Group's UML (Unified Modeling Language) approach[8].

The most beneficial elements of CMM Level 2 and 3

  • Creation of a Software Specification stating what is going to be developed, combined with formal sign-off, an executive sponsor, and an approval mechanism. This is not a living document; additions are placed in a deferred or out-of-scope section for later incorporation into the next cycle of software development.
  • A Technical Specification stating precisely how the software described in the Software Specification is to be developed. This is a living document.
  • Peer review of code (code review) with metrics, allowing developers to walk through an implementation and suggest improvements or changes. Note: this is problematic in that the code has already been developed and a bad design cannot be fixed by "tweaking"; the code review gives completed code a formal approval mechanism.
  • Version control: a very large number of organizations have no formal revision control or release mechanism in place.
  • The idea that there is a "right way" to build software: that it is a scientific process involving engineering design, and that groups of developers are not there simply to work on the problem du jour.

Companies appraised against the CMMI

Large numbers of IT companies across the world are working their way up the CMMI levels. In June 1999, Wipro of India became the first software services company in the world to attain SEI CMM maturity level 5, the highest maturity level. Every year many IT companies enter the CMMI regime or improve their CMMI levels. As of 2006, about 75% of CMMI level 5 software centers were in India, most of them located in Bangalore.[citation needed]

For a complete list view the published SCAMPI results.

One should be skeptical of a company claiming to have attained a given level of CMM at an "enterprise level" (and the higher the level claimed, the more skeptical one should be). This is usually a marketing technique: the level may indeed apply to some project done by the company at some time, but it is most unlikely to have been achieved by the whole enterprise.[citation needed]

References

Books
  • Chrissis, Mary Beth; Konrad, Mike; Shrum, Sandy (2003). CMMI: Guidelines for Process Integration and Product Improvement. Addison-Wesley Professional. ISBN 0-321-15496-7.
  • Kulpa, Margaret K.; Johnson, Kent A. (2003). Interpreting the CMMI: A Process Improvement Approach. Auerbach Publications. ISBN 0-8493-1654-5.

Footnotes

  1. ^ Capability Maturity Model® Integration (CMMI®) Version 1.2 Overview (PDF). SEI (2006). Retrieved on 2006-10-28.
  2. ^ What is CMMI? - Worldwide Adoption. SEI (2006). Retrieved on 2006-10-28.
  3. ^ Crosby, Philip (1979). Quality is Free. McGraw-Hill. ASIN B000K2M9MU.
  4. ^ Crosby, Philip (1980). Quality is Free (paperback). McGraw-Hill. ISBN 0-451-62585-4.
  5. ^ Sunsetting Version 1.1 of the CMMI® Product Suite. SEI (2006). Retrieved on 2006-10-28.
  6. ^ August 2002 edition of CMMI (PDF). CMU/SEI-2002-TR-011. SEI (2002). Retrieved on 2006-10-28.
  7. ^ Finkelstein, Anthony (2000-04-28). A Software Process Immaturity Model (PDF). University College London. Retrieved on 2006-10-28.
  8. ^ Schorsch, Tom (1996). The Capability Im-Maturity Model. The Air Force Institute of Technology. Retrieved on 2006-10-28.
