Developing and applying a project management capability maturity model

application of a process-based maturity model

Milt Jones, MBA, PMCC Inc.

Introduction

Given the renewed focus on project management as the delivery system for corporate capital and maintenance expenditures, it is natural for organizations to seek a method of ascertaining the maturity of their current project management processes. The authors have developed a Project Management Capability Maturity Model (PMCMM) and applied it within a major energy company. This process has gained the endorsement of the company's management and “buy-in” by the rank and file employees. Even more to the point, when used in conjunction with the requisite training and reinforcement, the PMCMM has helped to produce significant positive results within only a few months after initial application. This paper will (1) outline the development of this measurement tool, which the authors have recently employed to help organizations assess their current status, (2) indicate how “go-forward” plans were developed, and (3) suggest the potential application of this PMCMM to other projects.

Initial Tasking

The initial tasking for the consulting team was to:

1. Learn the organization and processes of the client

2. Use that knowledge to develop an objective evaluation tool (PMCMM)

3. Use that tool to objectively evaluate the client's then-current project management processes and infrastructure against measurable criteria

4. Provide a written report and review with the client

5. Recommend changes to both infrastructure and process “to get the Project Management Office (PMO) to the next level,” a goal that was initially left largely undefined.

The chosen methodology was, in simple terms, to use a supporting questionnaire to learn about the “inner workings and hidden mechanisms” of the client's PMO process and supporting infrastructure while concurrently developing the basic tool (PMCMM) for a more detailed and objective analysis. We judged that this engagement could make best use of an anonymous personnel interview process consisting of proven, industry-standard questions graded on a five-point scale ranging from “strongly agree” to “strongly disagree.” In PMCC's recent experience on several similar projects, this time-proven methodology, with the responses statistically correlated, was likely to provide the most accurate and useful information regarding both the process and the organizational “norms,” with the least risk of significant data error.
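To make the grading and correlation step concrete, here is a minimal sketch in Python of how five-point responses might be averaged per question and checked for internal consistency. The question labels, response data, and the use of Pearson correlation are illustrative assumptions, not the client's actual survey or PMCC's exact statistical procedure:

```python
# Minimal sketch of the five-point survey analysis described above.
# Question labels and responses are hypothetical, not the client's data.
# Requires Python 3.10+ for statistics.correlation (Pearson's r).
from statistics import correlation, mean

# Responses on a 5-point scale: 1 = "strongly disagree" ... 5 = "strongly agree".
responses = {
    "Q1_process_buy_in":  [4, 3, 2, 4, 3, 2, 3],
    "Q2_buy_in_restated": [4, 2, 2, 5, 3, 2, 3],  # same topic, different wording
    "Q3_formal_schedule": [2, 2, 1, 3, 2, 1, 2],
}

# Mean score per question shows where the organization sits on each item.
for label, scores in responses.items():
    print(f"{label}: mean = {mean(scores):.2f}")

# Correlating a question with its restated twin gives a rough internal-
# consistency check: a high coefficient suggests respondents answered
# consistently rather than at random.
r = correlation(responses["Q1_process_buy_in"], responses["Q2_buy_in_restated"])
print(f"consistency of Q1 vs. Q2: r = {r:.2f}")
```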

Learning About the Project Management Organization and Support Structure

Our consulting firm, PMCC, used the “tick” process (burrow under the client's skin) to learn as much as it could about the organizational procedures and norms of the energy company's PMO and its supporting infrastructure, as well as about the existing process. In fairness, it should be stated that a well-written project management process, put in place by a previous consulting team, already existed. However, it became quickly and starkly apparent from the very first of the 50 interviews and questionnaire results that the process had never been properly implemented. (It has been the authors' consistent observation that poor implementation is almost always the “Achilles heel” of good planning.)

Nor had the project management process been comprehensively rolled out to all of its stakeholders, so there was an obvious communications and training problem. As the interview and questionnaire data were further collected, analyzed, and collated, it became apparent that the client's organization and processes could be easily “mapped” against the PMBOK® Guide's Knowledge Areas and that the various and specific capability levels of the PMO could probably be classified into quantifiable attainment levels. Thus, this became a classic candidate for application of a Project Management “capability maturity model.”

The operative question then became, “Is there a known existing project management capability maturity model that could be effectively adapted for this client?” and if so, “Which one?”

As background, probably the most famous of the various Capability Maturity Models is the one developed by the Software Engineering Institute (SEI) at Carnegie Mellon University. For those interested in information technology (IT) applications, additional information on this model can be obtained from the SEI website at http://www.sei.cmu.edu/ or from the referenced book (SEI, 1994). But we were working with project teams in heavy industry, not software, so the SEI model did not seem to fit well without extensive adaptation and rework.

There has also been some excellent Capability Maturity Model work done by others and published under the auspices of the Project Management Institute (PMI®). One of the best sources on the process of developing such models is the work of Kwak and Ibbs (2000 and other years), most specifically their referenced Project Management Journal article. A comprehensive source of generic reference materials, including an extensive research list, appears in the Proceedings of the Project Management Institute's Annual Seminars & Symposium (Rosenstock, Johnston, & Anderson, 2000).

Other authors (Goulet, 2000) have also treated the software Capability Maturity Model and the Program Management Capability Maturity Model (Moore, 2000) in articles associated with the PMI. Again, however, these didn't seem to pertain to our client's projects with the requisite level of specificity.

A recently published book (Kerzner, 2001) might have saved us a great deal of time, had it been published a year or so earlier. Here are some comments regarding the Kerzner methodology and some significant differences from the methodology we employed:

• While Kerzner uses different terminology to define the various “levels” of maturity (“Common Language,” “Common Processes,” “Singular Methodology,” “Benchmarking,” and then “Continuous Improvement” at the top), the overall methodology is very nearly the same as that employed by PMCC. It should be noted, however, that while our Level 3 is a fully functional project organization (as is Kerzner's), our Level 4 is a “program organization,” whereas his Level 4 is used to “benchmark” a “project organization.” While we could probably have used the lower three levels of the Kerzner model almost without change, we would still have had to keep in mind that we were essentially establishing a PMO and putting together “go-forward” plans aimed at Level 4 “Program Management,” rather than at Kerzner's Level 4 “Benchmarking” of individual projects. Under our five-level scheme, we would probably have recommended that the “benchmarking” function be part of Level 5 (Kerzner's “Continuous Improvement,” our “Managing Excellence”).

• Kerzner does provide a very useful “check list” of questions for each maturity level and postulates that, while there can be some overlap between certain phases in the organization's progression towards project management maturity, “… the order in which the phases are completed cannot change.”

• Another interesting point in this book is the postulated difference in the difficulty of accomplishing “maturity” at each level. For instance, accomplishing “Common Language” (Level 1) and “Common Processes” (Level 2) is perceived as being of “medium” difficulty, while accomplishing “Singular Methodology” (Level 3) is perceived as involving a “high” degree of difficulty.

“The bottom line” was that, at the time it was needed, no “off-the-shelf” or readily adaptable Capability Maturity Model for program management could be found that satisfactorily portrayed this client's organizational structure and process in a way that could be objectively measured and rated against quantifiable and repeatable criteria. We therefore decided to construct our own model.

Development of the Tool (PMCMM)

Our company uses the PMBOK® Guide as a primary reference for instructing and consulting in all areas of project management. It therefore seemed logical to structure our maturity model along the lines of what we teach: define the rating categories, then shape the individual (quantifiable and measurable) rating attributes, taken from the PMBOK® Guide, into those same categories, or “buckets.”

First, we decided on the “Ratings” categories. A number of excellent rating categories (or “attainment levels”) already existed; one system that we particularly liked was described in the previously cited work by Kwak and Ibbs (2000 and other years). That system, however, did not quite fit (or precisely describe) the wide range of conditions, or “levels of comparative excellence,” that we were finding within the client's PMO and its supporting infrastructure functionalities (e.g., Planning, Project Services and Support, Design, Field Construction, etc.). There was also the question of “managing projects” as opposed to “managing programs.” We therefore defined the “worst case” Level 1 as “Crisis Management,” Level 2 as “Reactive Management,” the expected and hoped-for median, Level 3, as actual “Project Management,” Level 4 as “Program Management,” and the very best attainment category, Level 5, as “Managing Excellence.”
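These five attainment categories lend themselves to a simple lookup from a numeric rating to a level name. Below is a minimal sketch in Python; the function name and the nearest-level rounding rule are our illustrative assumptions, not part of the model itself:

```python
# The five PMCMM attainment levels as defined above, encoded as a
# simple lookup table (illustrative sketch only).
PMCMM_LEVELS = {
    1: "Crisis Management",
    2: "Reactive Management",
    3: "Project Management",
    4: "Program Management",
    5: "Managing Excellence",
}

def level_name(score: float) -> str:
    """Map a 1.0-5.0 rating to the nearest attainment level."""
    return PMCMM_LEVELS[min(5, max(1, round(score)))]

print(level_name(2.4))  # -> "Reactive Management"
```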

The next step was to categorize the general characteristics of each level of our PMCMM, using the PMBOK® Guide's Areas of Knowledge. As just a few examples, at Level 1:

• Good “crisis managers” may pass for project managers.

• The quality of the finished project is entirely dependent on the character and experience of the “project manager.”

• No formal procedures or plans exist for the project.

• There is no “buy-in” process for the project team members.

• Project estimating is done at a high level and is often broken down by accounting codes instead of by project deliverables.

• Most project schedules are “pretty pictures” with no baseline contrasted against current status.

• Project controls for ongoing projects are weak and mainly consist of assessing percent complete for the project at very high-level breakdowns.

• Only invoiced “actual” costs to date are reported, and this is mistaken for the exercise of satisfactory Cost Control.

• There are no project quality standards for the project management process itself.

• Overall project performance is highly erratic.

Contrasted against this potentially very ugly (Level 1) picture of continual crisis management stand the criteria by which an organization or process might be rated as performing at Level 5. This level, or category, could be deemed “Managing Excellence” to the degree that the project management programs have many attributes that might be exportable as “best in class.” Some of these “world class” attributes are:

• Focus on continuous improvement, training, coaching, and mentoring of the project management staff.

Exhibit 1. The five-column PMCMM matrix (image not reproduced)

• Particular attention is given to identifying and addressing project management problems that cross organizational boundaries and require management action.

• Emphasis is given to the review of new and upgraded project management software that will improve the speed and efficiency of project data collection and analysis.

• The goal of “best-in-class” project management methodology is at the forefront of all stakeholder consciousness.

• Management actively champions the project management process.

• Project estimates incorporate appropriate stakeholder input, risk assessments, and lessons learned from the PMBOK® Guide elements for scope, schedule, cost, human resources, quality, procurement, risk, communications, and project integration.

Level 2 (“Reactive Management”), Level 3 (“Project Management”), and Level 4 (“Program Management”) were then correspondingly structured to complete the five-column matrix. The overall result is shown in Exhibit 1.

Evaluating the Project Management Organization and Processes

Once the measurable and repeatable criteria were established, the next task was to compose a rating scale with which to measure the performance of the PMO (and that of its supporting infrastructure), as described above. A five-point scale was assigned to each attribute, with “Crisis Management” rating a grade of 1 and “Managing Excellence” rating a grade of 5. Obviously, a higher overall score indicates a better rating.

Not all potential PMBOK® Guide elements (or attributes) were weighted equally, in that some questions were asked more than once, albeit in a different format. Here are just a few examples of the PMCMM rating questions (there were forty questions in all, with about equal numbers of questions from every PMBOK® Guide Knowledge Area):

• Is “buy-in” to the process consistently obtained from all stakeholders, as well as from project team members and others who will perform the work?

• Is project estimating done at a high level and is it broken down by accounting codes instead of by project deliverables (Work Breakdown Structures—WBSs)?

• Are “standardized” WBSs in general use?

• Are estimating and scheduling done using the same WBS as is used for execution of the work?

• Are costs and schedules integrated through the WBS and proactively tracked, controlled, and forecast?

• Are most project schedule updates simply “pretty pictures,” consisting mainly of assessing percent complete for the project at very high-level breakdowns with no baseline contrasted against current status?

• Is project status data that is provided to the team timely enough to allow for proactive analysis?

• Is project status used mainly for historical/reporting purposes?

• Is Earned Value (EV) used to ascertain the progress of the work via a set of predefined metrics?

• Is a formal change management system implemented and is it effective in use?

• Does significant “scope creep” occur?

• Are project management processes formalized, and do they encompass methods and practices for planning and controlling multiple projects?

• Are the project management processes well defined, quantitatively measured, understood, and executed?

All of these questions were sorted into PMBOK® Guide Knowledge Areas (Integration, Cost, Time, Communications, etc.), graded (scale of 1 to 5, depending on observed level attainment), collated, and presented to the client. Note that this process presents a very valuable “legacy” benefit to the client: Now that the grading process is established with objectively measurable and repeatable attributes and criteria, the client itself can potentially perform the evaluation process (and measure its own rate of progress toward project management maturity) the next time that a “follow-on” evaluation seems advisable or indicated.
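To illustrate the collation step just described, here is a minimal sketch that sorts graded questions into Knowledge Areas and averages each area's grades into a rating. The areas, questions, and grades shown are invented examples, not the client's actual results:

```python
# Sketch of the collation step described above: graded questions are
# sorted into PMBOK Guide Knowledge Areas and averaged into a rating
# per area. Areas and grades are hypothetical examples.
from collections import defaultdict
from statistics import mean

# (knowledge_area, grade on the 1-5 PMCMM scale)
graded_questions = [
    ("Cost",           2),
    ("Cost",           3),
    ("Time",           2),
    ("Communications", 4),
    ("Integration",    3),
    ("Time",           1),
]

by_area: dict[str, list[int]] = defaultdict(list)
for area, grade in graded_questions:
    by_area[area].append(grade)

# Per-area means give the client a repeatable maturity profile.
for area, grades in sorted(by_area.items()):
    print(f"{area}: {mean(grades):.1f}")  # e.g. "Cost: 2.5"
```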

Client Interface

The client (PMO) was kept totally involved and engaged, first by presentation and approval of a proposed consultation scope of work, and later by a constantly updated WBS and a schedule (yes, there was considerable “scope creep”) with Earned Value criteria so that the client always knew where it had been, where it was going, and (more to the point!) how much the consulting project was going to cost.
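For readers unfamiliar with the Earned Value criteria just mentioned, the sketch below works through the standard EVM arithmetic (earned value, cost and schedule performance indices, and estimate at completion). The figures are invented for illustration; they are not the actual engagement's numbers:

```python
# Standard Earned Value arithmetic, of the kind used to keep the client
# informed of consulting-project cost and progress. All figures are
# made up for illustration.
BAC = 100_000.0          # budget at completion ($)
percent_complete = 0.40  # physical progress to date
AC = 45_000.0            # actual cost to date ($)
PV = 50_000.0            # planned value per the baseline schedule ($)

EV = percent_complete * BAC   # earned value: 40,000
CPI = EV / AC                 # cost performance index: ~0.89 (over budget)
SPI = EV / PV                 # schedule performance index: 0.80 (behind)
EAC = BAC / CPI               # estimate at completion: 112,500

print(f"EV={EV:,.0f}  CPI={CPI:.2f}  SPI={SPI:.2f}  EAC={EAC:,.0f}")
```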

The initial report of the “preliminary” interview/survey was reviewed with the client, and it was explained where differences existed between “what was designed” (the client's designed process) and “what existed” (what the client's PMO and supporting personnel actually thought and were doing). These differences were further explained in terms of improvement goals that were achievable at various levels. Since PMCC had also performed a statistical analysis, we were able to present the degree of correlation achieved for each individual survey question and for the survey process overall.

The client comments (or “push back”) were then analyzed, responded to, and (where applicable) integrated into the “final” or formal report.

The PMCMM grading was handled a bit differently. For instance, assume the Cost Control functionality was rated at 2.0 (the “Reactive Management” level) overall. To achieve a 4.0 rating in that category, it was explained, certain infrastructure, controls, and personnel training would have to be put in place and implemented. If, on the other hand, the client desired a Level 5 (“Managing Excellence”) rating in Cost Control, then something more would be required, and the dollar cost in infrastructure and manpower resources could logically be expected to be correspondingly higher, though not necessarily in direct proportion.

We would also like to comment briefly on the significant cultural, logistical, and organizational issues involved in going from a Level 3 “Project Management Organization” to a Level 4 “Program Management Organization.” Some clients will grade themselves at Level 4 (Program Management) without the requisite system tools and methodologies in place. The challenge for large organizations, with possibly hundreds of ongoing capital projects and turnarounds/outages, is to practice not merely project management but program management, and doing so requires, as one example, the application of a common resource pool (see the sketch following this paragraph). This is much easier said than done, and only in the last year or so have we begun to see software tools that may do in practice what they promise in theory. The costs, time, and organizational stresses must be addressed in any cost/benefit analysis done to substantiate moving from Level 3 Project Management to Level 4 Program Management. For large organizations where program management is essential, those costs must be accounted for in laying out the plan to become a Level 4 organization.
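To make the “common resource pool” idea concrete, here is a highly simplified sketch of a single shared registry that every project in a program books against, so that cross-project conflicts surface early. All resource names, capacities, and project names are hypothetical, and real enterprise tools are far more elaborate:

```python
# Highly simplified sketch of a "common resource pool": one shared
# registry of resources that every project in the program books
# against, so cross-project overcommitments surface immediately.
# Names and capacities are hypothetical.
from collections import defaultdict

POOL_CAPACITY = {"welder": 6, "scheduler": 2, "crane": 1}  # units per week

bookings: dict[tuple[str, int], int] = defaultdict(int)  # (resource, week) -> units

def request(project: str, resource: str, week: int, units: int) -> bool:
    """Book units of a pooled resource for one week, refusing overcommitment."""
    if bookings[(resource, week)] + units > POOL_CAPACITY[resource]:
        print(f"{project}: {resource} overcommitted in week {week}")
        return False
    bookings[(resource, week)] += units
    return True

request("Unit-4 turnaround", "crane", 12, 1)  # granted
request("Boiler retrofit", "crane", 12, 1)    # refused: conflict surfaces early
```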

Influence the Organization: Improve the Process

Once the assessments had been done and a plan identified, the next step was to produce results. Since “implementation” had, in general, been lacking, a series of four-day training classes in Project Management Principles was scheduled and provided to all project managers. Each project manager was also encouraged in writing by his or her supervisor to take the PMI® certification exam.

A condensed (two-day) “Project Management Support” class was then scheduled and rolled out to all personnel who interfaced daily with the PMO (those functionalities included, for example, Planning, Project Services and Support, Design, and Field Construction). From all these classes, direct feedback (class concerns) was collated, carried back, and presented to the client's management.

One of the most striking benefits of this overall process was the change in corporate culture. The project managers had been cast by the designed process in the “Strong Matrix” role (PMBOK® Guide, 1996, Figure 2-11, p. 22); this “process definition” notwithstanding, they were actually only significantly involved in the PMBOK® Guide equivalents of the “feasibility” and “planning and design” phases. Although the project managers were defined by the process in effect as “accountable” (generally defined as both “being responsible” and “possessing requisite authority”) for costs, there was no infrastructure to provide cost reporting and forecasting, nor did they possess or exercise any significant control over project expenditures. Many more examples of the observed “delta” between “designed” and “actual” processes could be cited, but these suffice to convey the degree of change in attitude and corporate culture that was required and is even now progressing.

In general, the overall results of the more specific recommendations have been positive, particularly in the area of infrastructure. The client has revamped its overloaded scheduling system, upgrading to a more robust enterprise scheduling tool and work management system that provides the requisite level of response. An automated cost tracking, reporting, and forecasting system has been developed and is being implemented. Material expediting and tracking, once a function belonging almost solely to the overworked field construction engineers and the project managers, has been taken over by a new “material tracking and expediting” functionality matrixed over to the PMO from the Project Services and Support resource manager.

Conclusions

The major advantages of using a Project Management Capability Maturity Model are its objective measurement criteria and its high degree of repeatability. In addition, an effective model, carefully applied, can gain quick and sustainable credibility with either an external client or an internal management structure, especially if it is carefully and intelligently tailored to suit the existing project management application.

Assuming that no pertinent Project Management Capability Maturity Model can be found off the shelf in the readily available literature, a highly effective model for application to almost any industrial or technical process and/or organization can be readily designed and implemented, given the excellent tools and reference materials now available. Even better, application of the model and objective presentation of quantifiable and verifiable measurement data to either a receptive management or a grateful client can bring results that would satisfy even the most jaundiced of observers.

References

Goulet, Douglas R. 2000. Measuring the Success of Project Management and the Capability Maturity Model Using the Parking Lot Metric. Proceedings of the Project Management Institute Annual Seminars & Symposium. Newtown Square, PA: Project Management Institute.

Kerzner, Harold. 2001. Strategic Planning for Project Management Using a Project Management Maturity Model. New York: John Wiley & Sons.

Kwak, Young H., and William C. Ibbs. 2000. Calculating Project Management's Return on Investment. Project Management Journal (June): 38–47.

Moore, Thomas J. 2000. An Evolving Program Management Maturity Model: Integrating Program and Project Management. Proceedings of the Project Management Institute Annual Seminars & Symposium. Newtown Square, PA: Project Management Institute.

Project Management Institute. 1996. A Guide to the Project Management Body of Knowledge. Newtown Square, PA: Project Management Institute.

Rosenstock, Christian, Robert S. Johnston, and Larry M. Anderson. 2000. Maturity Model Implementation and Use: A Case Study. Proceedings of the Project Management Institute Annual Seminars & Symposium. Newtown Square, PA: Project Management Institute.

Software Engineering Institute. 1994. The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley Longman.

This material has been reproduced with the permission of the copyright owner. Unauthorized reproduction of this material is strictly prohibited. For permission to reproduce this material, please contact PMI or any listed author.

Proceedings of the Project Management Institute Annual Seminars & Symposium
November 1–10, 2001 • Nashville, Tenn., USA
