Project maturity model

A detailed assessment instrument

This article is copyrighted material and has been reproduced with the permission of Project Management Institute, Inc. Unauthorized reproduction of this material is strictly prohibited.


The past year has seen tremendous interest in the integration of PMI's A Guide to the Project Management Body of Knowledge (PMBOK®) with the Software Engineering Institute's (SEI) Capability Maturity Model® (CMM®). Most items published on this topic look no further than the 5 maturity levels of the CMM. Just as the PMBOK has extensive detail (knowledge areas, processes) below the 5 project management process groups, the CMM contains an in-depth set of maturity criteria organized into 19 key process areas, each of which is further segmented into 5 common features supported by a number of key practices.

Over the past 5 years, the author has created a Project Maturity Assessment Instrument based on the layout of the SEI CMM. While staying true to the CMM’s conceptual framework for the 5 maturity levels, he has replaced the software-specific key process areas with 20 project management-oriented process areas drawn from the PMBOK and from trends in enterprise project management. Likewise, the CMM key practices have been reworked to be applicable to project management best practices.

Most project maturity models concentrate strictly on organizational effectiveness. The author's model differs by presenting process areas and key practices that can be used to assess individual projects as well as organizations.

The instrument covers the 20 process areas with a 100+ page assessment checklist supported by a spreadsheet for summarizing the findings in both tabular and graphic form. Recognizing that steps to project and organizational maturity are not all-or-nothing, the instrument includes mechanisms for awarding partial credit for items initiated but not yet fully implemented, analogous to an earned-value approach to assessing project progress.

Because of space limitations, the entire instrument cannot be included with this paper. It is available at no charge from the author, who can be contacted by mail at 997 McGilchrist SE, Salem, Oregon 97302, USA. The author will spend much of his session time presenting the detailed instrument rather than the conceptual model described below.

SEI’s Capability Maturity Model

In work sponsored by the U.S. Department of Defense, the Software Engineering Institute at Carnegie Mellon University developed a process maturity framework for software development processes in 1986, with assistance from the Mitre Corporation. The framework was derived from Philip Crosby's quality management maturity grid. The initial release of the CMM was reviewed and used by the software community between 1991 and 1992, and the first formal publication of the model came in 1995 (Paulk et al., 1995).

The purpose of the CMM is to provide guidance to organizations that want to gain control of software development and maintenance processes, evolve toward a culture of software engineering, and identify the most critical issues for software quality and process improvement. The author believes that similar goals face many organizations that want to improve their processes for managing projects.

Framework of the CMM

SEI recognized that process maturity, or the lack thereof, could generally be recognized by the formality of process documentation, consistent adherence to the standardized processes, and the existence of continuous review and improvement of those processes and their resultant products. Their framework for measuring the progress of an organization from the immature stage to the mature stage utilizes the following concepts:

  • Maturity levels - indicating general process capability
  • Key process areas - indicating the achievement of specific goals
  • Common features - addressing implementation or institutionalization issues
  • Key practices - describing required infrastructure or activities to address the issues and meet the goals

SEI defined 5 maturity levels, and it is these that are most commonly cited (for instance, Pennypacker & Grant, 2003) and reproduced in the literature:

  • Initial - ad hoc, few defined processes
  • Repeatable - disciplined process, basic project management
  • Defined - standard, consistent management and engineering processes are used throughout the organization
  • Managed - software process and product quality are quantitatively measured and controlled
  • Optimizing - continuous process improvement

The application of these maturity levels to the discipline of project management is further described in Exhibit 1, which also identifies the SEI CMM key process areas and the author's Project Maturity Model (PMM) key process areas.

Exhibit 1. CMM and PMM Key Process Areas

Common features in all of the key process areas include:

  • Goals
  • Commitment to Perform
  • Ability to Perform
  • Activities (Tasks) Performed
  • Measurement and Analysis
  • Verifying Implementation

Key Practices

The key practices make up the bulk of the CMM, and take hundreds of pages to describe. But certain themes are common to all the key practices:

  • Written policies and documentation
  • Adequate resources and funding
  • Adequate training and orientation
  • Adherence to policies and processes
  • Measurements for status and effectiveness
  • Periodic review by senior management
  • Periodic and event-driven project manager review
  • QA or independent reviews

Exhibit 2 presents an example of the CMM hierarchical framework, showing the style used for the key practices. Generally, the CMM key practices specify what, not how.

Exhibit 2. CMM Key Practices Example

The Project Maturity Model (PMM)

The CMM and the Project Management Body of Knowledge (PMBOK) (Project Management Institute, 2000) share a number of process-related concepts, but also contain unique features as shown in Exhibit 3. The Project Maturity Model integrates all of these concepts in the general context of generic project management. The source of terminology used for overlapping practices is shown in parentheses.

Exhibit 3. PMBOK and CMM Processes

The framework of the CMM serves equally well as the framework for a general-purpose PMM. The author adapted CMM Level 2 and Level 3 key process areas and key practices for general use on a single project or on a portfolio of projects by removing the specific software development context in the CMM. The key process areas were augmented with knowledge areas from the PMBOK. Level 4 and Level 5 key process areas were reconstructed to cover the organization's holistic approach to project management and solution delivery. These levels are primarily concerned with organizational or program-wide processes rather than individual project processes. They derive from best practices related to enterprise project management. The resultant PMM key process areas are compared to the SEI CMM key process areas in Exhibit 1.

In the PMM, the common feature approach of the CMM has been retained, but the key practices have been extensively revised to

  • remove software-industry specificity
  • make the practices applicable to a wide range of projects
  • bring them in line with PMBOK intent and content
  • encompass advanced practices related to project management

In areas where the PMBOK contains well-established models (e.g., the earned-value approach to project control), the author has incorporated them, deviating from the CMM practice of specifying only general key practices. To this end, the author has also incorporated elements of the Industry Earned Value Management System Criteria model (Fleming & Koppelman, 2000, pp. 157-188).

The PMM Assessment Instrument

To implement the PMM, the author has created a detailed assessment instrument. The instrument can be used as the basis for either a structured interview or a content review of written plans, procedures, and artifacts supplied by the project, program, or company. The instrument, or sections of it, should be provided to the project/program/company team in advance so that they can prepare their responses and locate the required written evidence. Space limits prevent including the entire instrument in this paper, but its content and layout are illustrated in Appendices 1 and 2. The complete instrument will be available on CD in limited quantities at the Congress presentation. It is also available by writing to the author at the address given in the Introduction.

Word Processor Version

The instrument was first implemented in 1999 as a word processing document (see Appendix 1). The left-most column of the evaluation instrument lists a set of key practices that the assessor must evaluate to form an opinion about the effectiveness of the implementation of one of the PMM key process areas.

In the Findings/Evidence/Documentation column, the assessor describes his or her findings and the source(s) of information used to support the evaluation of each key practice. If a particular practice is not applicable to the project, the assessor checks Not Applicable in the Evaluation Rating/Points column and describes why the practice doesn't apply.

The original SEI CMM process evaluated any particular key practice only as pass/fail. Maturity level certifications were granted on a preponderance of evidence, or the absence of significant deficiencies, at that maturity level. However, in most social and organizational research, a more granular evaluation scheme allows more flexibility for categorizing the extent of implementation.

In the Evaluation Rating/Points column for each diagnostic, the assessor checks the appropriate box based on the findings of his or her investigation and the written documentation provided by the project team, program manager, or company management. The possible ratings are:

  • Outstanding - the key practice is implemented to a degree not commonly found. Synonyms are best-of-breed, superior, exemplary, world-class.
  • Acceptable - the key practice is complete, thoroughly planned, appropriately documented, and consistently practiced. Synonyms are good, prepared, ready, compliant, done, complete, in practice, implemented.
  • Insufficient - the key practice is sporadically performed, or there is little process documentation or few artifacts. Synonyms are sometimes, informal, unwritten, partial, in process, scheduled.
  • Poor - the key practice has never been or is not being meaningfully planned or performed. Synonyms are none, nonexistent, mediocre, inappropriate, inadequate, impractical, ineffective, irrelevant, worthless.

A summary report card presents the high-level results of the assessment. To account for items that may not be applicable to a particular project, results are normalized to allow comparison among projects. The author found it necessary to assign a point value to each evaluation rating in order to compute a quantitative summary maturity index at the key process area level.

For each process area, the assessor counts the number of each type of rating awarded (ignoring "Not Applicable"). Summing these counts gives the total number of questions that were evaluated. The normalized score is then computed as:

  normalized score = [(# of Outstanding × 100) + (# of Acceptable × 80) + (# of Insufficient × 50) + (# of Poor × 0)] ÷ (total number of applicable questions)

Acceptably managed projects have a total project score of 80 or better for the maturity Level 2 and Level 3 key process areas.
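The rating-count calculation above can be sketched as a short function. This is an illustrative sketch, not part of the author's instrument; the function name and the dictionary layout are assumptions, while the point values (100/80/50/0) come from the paper.

```python
def normalized_score(counts):
    """Compute the normalized process-area score from rating counts.

    `counts` maps each rating name to the number of key practices that
    received it; "Not Applicable" items are simply left out of the counts.
    Returns None if nothing in the process area was evaluated.
    """
    # Point values per rating, as defined in the paper.
    points = {"Outstanding": 100, "Acceptable": 80, "Insufficient": 50, "Poor": 0}
    total = sum(counts.get(r, 0) for r in points)  # applicable questions only
    if total == 0:
        return None
    earned = sum(points[r] * counts.get(r, 0) for r in points)
    return earned / total

# Example: 2 Outstanding, 5 Acceptable, 2 Insufficient, 1 Poor
print(normalized_score({"Outstanding": 2, "Acceptable": 5,
                        "Insufficient": 2, "Poor": 1}))
# (200 + 400 + 100 + 0) / 10 = 70.0
```

A process area scoring 70 by this formula would fall short of the 80-point threshold for acceptably managed projects.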

The Spreadsheet Version

After experimenting with the initial version of the instrument, the author developed an even more granular approach to the evaluation rating, assigning a possible range of points to each qualitative evaluation rating. Although this introduced more subjectivity in the choice of a specific number of points for each key practice, it also allowed:

  • improved ability to show incremental process improvement
  • easier computation of an index value for the common feature, key process area, and maturity level
  • algorithmic implementation of a weighting factor for the common features and their associated key practices when computing a total score for a key process area
  • use of spreadsheets for easy recording and charting of assessment results
  • easy representation of longitudinal studies of process maturity growth (or backsliding) at all levels of detail
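The granular point scale still maps back to the four qualitative ratings via the point ranges listed in Appendix 2. A minimal sketch of that mapping (the function name is illustrative; the band boundaries are taken from the Appendix 2 legend):

```python
def rating_for(points):
    """Map a 0-100 point score to its qualitative rating, using the
    point bands from the Appendix 2 legend. None (a blank cell) means
    the key practice was Not Applicable."""
    if points is None:
        return "Not Applicable"
    if points <= 35:
        return "Poor"
    if points <= 65:
        return "Insufficient"
    if points <= 85:
        return "Acceptable"
    return "Outstanding"

print(rating_for(50))   # Insufficient
print(rating_for(80))   # Acceptable
print(rating_for(100))  # Outstanding
```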

Appendix 2 contains an excerpt from the spreadsheet workbook created by the author to implement this more quantitative approach to the presentation of assessment findings. However, for all its quantitative strengths, the spreadsheet does not provide the capability to document findings to the extent that the word processing version does. Therefore, the word processing version was customized to include a place to assign points as well as to check an evaluation rating box. The points can then be easily transferred to the spreadsheet version for quantitative analysis and visual displays.

Future of the PMM Assessment Instrument

The content of the instrument is still evolving, as is its implementation technology. In the short run, the author would like to obtain comments on the comprehensiveness of the key practices and to transfer the technology to a database management system that will combine the textual and visual features of the two versions of the current instrument. In the long run, the author hopes that the current instrument will contribute to a PMI-wide effort to produce a generic project maturity model and an associated organizational certification process, similar to that implemented by SEI.

References

Fleming, Q.W. & Koppelman, J.M. (2000) Earned Value Project Management. Newtown Square, PA: Project Management Institute.

Paulk, M.C.; Weber, C.V.; Curtis, B. & Chrissis, M.B. (1995) The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley Publishing Company.

Pennypacker, J.S. & Grant, K.P. (2003) Project Management Maturity: An Industry Benchmark. Project Management Journal, 34(1), 7.

Project Management Institute (2000) A Guide to the Project Management Body of Knowledge (PMBOK® Guide) 2000 Edition v1.2 CD-ROM. Newtown Square, PA: Project Management Institute.

Appendix 1. PMM Evaluation Form - Word Processing Version Excerpt

P&E - Project Planning and Estimation

Goals

P&E-G-01. Solution estimates are documented for use in planning and tracking the project.
P&E-G-02. Project activities and commitments are planned and documented.
P&E-G-03. Affected groups and individuals agree to their commitments related to the project.

Commitment to Perform

P&E-C-01. A project manager and a technical lead are designated to be responsible for negotiating commitments and developing the project management plan and the solution delivery plan, respectively.
  Findings/Evidence/Documentation: Project manager is doubling as technical lead, but is unfamiliar with the specific technology. Project team has not been assembled, so no technical help is available to the PM.
  Evaluation Rating: Insufficient. Points: 50

P&E-C-02. The project follows a written company or client policy for planning a project.
  Findings/Evidence/Documentation: Company and client policies are very different. Company is currently deciding whether to just follow client policy or negotiate a relaxation of client policy. Evaluate 3 months from now.
  Evaluation Rating: Not Applicable

Ability to Perform

P&E-A-01. A documented and approved statement of work exists for the project.
  Findings/Evidence/Documentation: The client imposed a detailed format for the statement of work. PM has incorporated a signed copy of the SOW into the project management plan and the solution delivery plan.
  Evaluation Rating: Outstanding. Points: 100

P&E-A-02. ...

P&E-A-03. Adequate resources and funding are provided for planning and controlling the project.
  Findings/Evidence/Documentation: A part-time role for a project controller has been created to assist the PM. There is specific budget earmarked in the plan for this function.
  Evaluation Rating: Acceptable. Points: 80

Activities (Tasks) Performed

P&E-T-01. ...

P&E-T-02. A solution life cycle with predefined stages of manageable size is identified or defined.
  Findings/Evidence/Documentation: PM has adopted a textbook version of a commercial methodology, and has structured a WBS along the lines suggested by the methodology.
  Evaluation Rating: Acceptable. Points: 80

P&E-T-03. A solution delivery plan in keeping with the selected life cycle approach is developed according to a documented procedure.
  Findings/Evidence/Documentation: The solution delivery plan has not yet been developed to implement the SOW.
  Evaluation Rating: Poor. Points: 0

Appendix 2. Sample PMM Evaluation Form - Spreadsheet Version Excerpts

  Assessment Date: 1/31/99
  P&E Total Score: 36

Commitment to Perform (weighting factor = 1): 50
  P&E-C-01 A project manager and a technical lead are designated to be responsible for negotiating commitments and developing the project management plan and the solution delivery plan, respectively. [50]
  P&E-C-02 The project follows a written company or client policy for planning a project. [Not Applicable]
Ability to Perform (weighting factor = 1): 70
  P&E-A-01 A documented and approved statement of work exists for the project. [100]
  P&E-A-02 Responsibilities for developing the solution development plan and the project management plan are assigned. [30]
  P&E-A-03 Adequate resources and funding are provided for planning and controlling the project. [80]
Activities (Tasks) Performed (weighting factor = 4): 43
  P&E-T-01 Solution life cycle planning is initiated in the early stages of, and in parallel with, the overall project management planning. [50]
  P&E-T-02 A solution life cycle with predefined stages of manageable size is identified or defined. [80]
  P&E-T-03 A solution delivery plan in keeping with the selected life cycle approach is developed according to a documented procedure. [0]
Measurement and Analysis (weighting factor = 2): 10
  P&E-M-01 Measurements are made and used to determine the status of the project planning activities. [10]
Verifying Implementation (weighting factor = 2): 25
  P&E-V-01 The activities and products of project management planning are reviewed with company and client management on a regular basis. [25]

Evaluation rating ranges:
  blank: Not Applicable (not evaluated)
  0-35: Poor (nonexistent, inappropriate, ineffective, inadequate, irrelevant)
  36-65: Insufficient (informal, unwritten, sporadic, partial, sometimes, in process)
  66-85: Acceptable (implemented, compliant, in practice, done, prepared, complete)
  86-100: Outstanding (superior, exemplary, world-class, best-of-breed)

A score of 80 points indicates a fully and effectively implemented practice.
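The weighted roll-up in this excerpt can be reproduced with a short script. This is a sketch of the spreadsheet arithmetic, not the instrument itself; whether the spreadsheet rounds the common-feature averages before or after weighting is not stated, and this sketch weights the unrounded averages, which reproduces the excerpt's total of 36.

```python
# Weighting factors and key-practice scores from the Appendix 2 excerpt.
# A score of None represents a blank (Not Applicable) cell, which is
# excluded from the common-feature average.
features = {
    "Commitment to Perform":        (1, [50, None]),
    "Ability to Perform":           (1, [100, 30, 80]),
    "Activities (Tasks) Performed": (4, [50, 80, 0]),
    "Measurement and Analysis":     (2, [10]),
    "Verifying Implementation":     (2, [25]),
}

weighted_sum = 0
weight_total = 0
for name, (weight, scores) in features.items():
    applicable = [s for s in scores if s is not None]
    avg = sum(applicable) / len(applicable)      # common-feature score
    print(f"{name}: {round(avg)}")
    weighted_sum += weight * avg
    weight_total += weight

total = round(weighted_sum / weight_total)       # key process area score
print("P&E Total Score:", total)                 # prints 36
```

Note how the weighting factor of 4 on Activities (Tasks) Performed pulls the total down toward that feature's weak 43, even though two features scored 50 or better.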

Proceedings of PMI® Global Congress 2003 – North America
Baltimore, Maryland, USA ● 20-23 September 2003
