Project Management Institute

Opening the book on the open maturity model

by Shay Lubianiker, PMP


LIKE PEOPLE, ORGANIZATIONS of all types spend their lives in two kinds of activity: projects (time-limited, nonrecurring endeavors) and ongoing operations. Business organizations survive and make their profits by virtue of two elements: the technology and/or know-how that they possess and use to create their products and services, and the quality and ability of their management. Both elements are essential in today’s competitive environment.

Many management methodologies have been proposed, all of which have attempted to improve the second element—management quality. This article proposes a holistic, feasible approach called the Open Maturity Model.

In software project management, the term Maturity Model was coined in the late 1980s by the United States government and the Software Engineering Institute (SEI) at Carnegie Mellon University. The government sought a tool that would enable it to predict success in software development and to compare a software vendor’s management quality with that of other vendors (a procedure commonly known as benchmarking). It was not sufficient that a proposal be satisfactory; the government wished to verify that the supplier’s normal business conduct was structured so that it could complete the project successfully.

In 1991 the SEI published the Capability Maturity Model (CMM), and it became a requirement in contracts between software developers and the U.S. government. This model has provided the basis for hundreds of certifications in the United States and around the world.

The success of this approach in improving software project management quality, together with the increasing global trend of managing organizations by projects, has led to the development and introduction of several generic maturity models in the world of project management.

Problems With Existing Models

The approach adopted by CMM and other available models, however, is closed—that is, the models provide structured objective criteria that must be met at certain levels of maturity based on achieving specific goals.

The closed approach presents several problems:

Inflexibility. Organizations differ from one another, and the definition of project management in one organization is not necessarily the same in another. A basic tenet of quality management is that practices may need to change from time to time according to specific organizational goals and objectives and to meet specific customer requirements.

The Time Axis. The closed approach leaves no room for the organization’s specific definitions of project management processes, which change continually throughout the organization’s existence. Over the last decade, the pace of such changes has been one of the most outstanding characteristics of management constraints and methods (for example, changes in information technology, outsourcing, globalization). The assimilation of any new management system, project management methodology, policy, procedure, new standards employed by the organization, and any approaches such as the Theory of Constraints are all components of the organization’s project management process.

How to Improve (1). Most models address the “what” (which process is deficient) rather than the “how” (how to improve what is inadequate). The question “How?” requires the organization to dig deeper and to use other methods and tools after the maturity models have been applied.

How to Improve (2). Most existing maturity models separate “maturity” into five levels: Initial, Repeatable, Defined, Managed, Optimized. Assessments carried out over several years have found that 77 percent of organizations are no higher than Level 2, Repeatable (see SEI’s website at http://www.sei.cmu.edu/activities/sema/profile.html), and that fewer than 6 percent are above Level 3, Defined. We may assume that the situation has since improved and that more organizations are now making it above Level 2, but any organization wishing to improve will find it difficult to measure its progress over time on such a crude measuring stick.

The Model: Project Management Assessment 2000

The model presented below is a holistic methodology and software tool for improving management processes in a project management environment that offers a solution to the problems described above. The solution is based on an “open” model that enables the organization to assimilate generic knowledge, tools, procedures and standards common to the world of project management and at the same time to define specific elements suited to each organization's unique needs. This allows the organization to continuously review and improve its management and work practices.


Exhibit 1. A hierarchy description of the model shows the PMBOK® Guide approach to measuring organizational maturity in project management.

The first task is to choose an effective, accepted method for dividing the world of management into sections that will be easy to handle. The natural choice is that proposed by the PMBOK® Guide.

The PMBOK® Guide is divided into nine knowledge areas that are further divided into 37 processes. To improve the quality of project management in each of these 37 processes, four enablers have been defined that are common to all of them. It is these four enablers that must be examined in detail and improved.

• Know-how. The knowledge that characterizes each of the 37 PMBOK® Guide processes must be mastered in order to apply that process correctly.

• Tools and Techniques. Each PMBOK® Guide process has methods and tools that the organization must assimilate and master, and the individuals who implement the process must be equipped with them.

• Practices, Standards, and Procedures. These instruct and guide those engaged in the work as to what to do and how to implement each PMBOK® Guide process.

• Organizational Infrastructure. This includes matters of authority and responsibility; the adaptation of the organizational structure; and the physical infrastructure of equipment, means of communication, software tools, and so forth.

Each of the four enablers relates to all 37 processes, making 148 enablers in all. The PMA 2000 model assumes that these 148 enablers are composed of two types of elements (see the sketch following this list):

• Generic Elements. A generic element can be know-how, a work process, a standard, a tool or technique, or infrastructure that is necessary for any organization in which projects are managed. For example, the PERT/CPM technique for schedule planning is a generic element, and so is breaking the project down hierarchically using the WBS method; both should be adopted in any project management environment.

• Specific Elements. These differ from one organization to another and from industry to industry. For example, when defining the scope of a defense project, a specific element will be MIL-STD-881A, which describes how to build a WBS. A further example: an organization may specify a particular software tool to be used routinely for risk assessment or for schedule development and control.
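To make the hierarchy concrete, the following is a minimal sketch (in Python, for illustration only) of how an organization might represent areas, processes, enablers, and generic versus specific elements. The class and field names are hypothetical and are not taken from the PMA 2000 software.

```python
# Illustrative sketch only: a minimal data model for the hierarchy described in
# the article (areas -> processes -> enablers -> elements). All class and field
# names are hypothetical, not drawn from the PMA 2000 software itself.
from dataclasses import dataclass, field
from enum import Enum


class Enabler(Enum):
    KNOW_HOW = "Know-how"
    TOOLS_AND_TECHNIQUES = "Tools and Techniques"
    PRACTICES = "Practices, Standards, and Procedures"
    INFRASTRUCTURE = "Organizational Infrastructure"


@dataclass
class Element:
    """A single generic or organization-specific element to be assessed."""
    description: str
    enabler: Enabler
    generic: bool  # True for generic elements, False for specific ones


@dataclass
class Process:
    """One of the 37 PMBOK Guide processes within a knowledge area."""
    name: str
    area: str
    elements: list[Element] = field(default_factory=list)


# Example: the Cost Estimating process with one generic and one specific element.
cost_estimating = Process(
    name="Cost Estimating",
    area="Project Cost Management",
    elements=[
        Element("Cost estimates can be summarized by WBS",
                Enabler.PRACTICES, generic=True),
        Element("Estimates are built with the organization's standard tool",
                Enabler.TOOLS_AND_TECHNIQUES, generic=False),
    ],
)

# Each of the four enablers applies to each of the 37 processes: 4 x 37 = 148.
print(4 * 37)  # 148
```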


Exhibit 2. A 10-step process implements project management maturity survey, improvements, and reassessments.

Level of Detail

PMA 2000 currently contains a bank of more than 450 questions about generic elements distributed among the various areas, processes, and enablers. The software is used to review the organization according to these elements in order to determine the areas in which management practices can be improved and which specific procedures need to be modified. The questions have been constructed carefully so as to provide a detailed reply that will enable the reviewer to discover whether a problem exists, its scale, and how to deal with it. For example, one of the 450 elements examined looks like this:

Area: Project Cost Management

Process: Cost Estimating

Enabler: Practices

Element: Does the structure of the cost estimates enable the summaries to be prepared according to:

• Activities
• WBS
• Organizational structure
• Responsible individuals
• Types of expenses
• None of the above.

If the replies to all items in the element are positive (excluding the last one), the element does not require improvement. If some of the items have not been implemented, there is room for immediate improvement in the performance of this element.
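The scoring rule just described is simple enough to express in a few lines. The sketch below assumes, purely for illustration, that each element’s replies are stored as a mapping from item text to a yes/no answer; the function name and data layout are not part of PMA 2000.

```python
# A minimal sketch of the scoring rule described above, assuming each element's
# replies are recorded as a mapping of item -> True/False. Names are hypothetical.
def element_needs_improvement(replies: dict) -> bool:
    """Return True if any substantive item (excluding 'None of the above')
    was answered negatively, i.e., there is room for immediate improvement."""
    substantive = {item: ok for item, ok in replies.items()
                   if item != "None of the above"}
    return not all(substantive.values())


replies = {
    "Activities": True,
    "WBS": True,
    "Organizational structure": False,  # summaries cannot yet be prepared this way
    "Responsible individuals": True,
    "Types of expenses": True,
    "None of the above": False,
}
print(element_needs_improvement(replies))  # True -> improvement is warranted
```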

Automation of the Review Process

Special software enables the review to be carried out rapidly across a computer network, in a focused fashion, and allows real-time tracking of replies. Different questionnaires are prepared from the question bank and matched to respondents. Not all participants are questioned about all the elements, and different questionnaires may be given to different participants. For example, there is no point in asking purchasing personnel to answer questions on configuration management. Once the questions have been answered, the replies are analyzed. Data may be presented in graphic and tabular form for each possible cross-section: at the organization level (person, project, department, division, and so forth), at the area level, or at the process level, by type of problem (enabler) and by the extent to which a specific element is used correctly.
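As an illustration of how questionnaires can be matched to respondents, the sketch below filters a question bank by a respondent’s role. The roles, the role-to-process mapping, and the question texts are assumptions made for this example and do not come from the PMA 2000 question bank.

```python
# Hedged sketch of questionnaire tailoring: filter the question bank so each
# respondent sees only the elements relevant to their role. The role names and
# the role-to-process mapping below are illustrative assumptions.
QUESTION_BANK = [
    {"process": "Cost Estimating", "text": "Can estimates be summarized by WBS?"},
    {"process": "Procurement Planning", "text": "Is a make-or-buy analysis performed?"},
    {"process": "Scope Definition", "text": "Is a WBS prepared for every project?"},
]

ROLE_TO_PROCESSES = {
    "project manager": {"Cost Estimating", "Scope Definition", "Procurement Planning"},
    "purchasing": {"Procurement Planning"},  # e.g., no configuration management questions
}


def build_questionnaire(role: str) -> list:
    """Return only the questions whose process is relevant to the given role."""
    relevant = ROLE_TO_PROCESSES.get(role, set())
    return [q for q in QUESTION_BANK if q["process"] in relevant]


for q in build_questionnaire("purchasing"):
    print(q["text"])
```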

The automation of this process, and the richness of detail that it provides, allows the review to be carried out with considerably less effort than a face-to-face review undertaken by an army of surveyors or assessors. The organization’s experts examine the outputs. They may recommend that a surveyor perform a face-to-face review at specific points where discrepancies are found in replies to identical questions, or for elements where the reliability of the replies requires examination in a personal interview.

Implementing PMA 2000—Work Process

The methodology is implemented in a 10-stage process, as shown in Exhibit 2.

Stage 1 is a commitment by the organization’s management to the process. Without this commitment, it will be difficult to implement the review’s findings, leading to a waste of resources and unnecessary disquiet within the organization.

Stage 2 is preparing the organization for the assessment and for implementing change. This requires a sponsor among senior management, a project team to conduct the review, and a project manager. The team should include a variety of experts from the organization itself (and occasionally from a consulting company). These experts determine, each in their own area, the project management standard. This standard takes into account all the relevant generic elements (some will not be relevant) coupled with elements unique to the organization. The assimilation of both generic and specific elements improves project management practices within a certain process in a certain area.

Stage 3 is calibration. At this stage any generic elements that are irrelevant to the organization will be left out, and the organization’s specific elements will be added. The team of experts will define the specific elements. Elements will be defined to cover all necessary procedures, methods and tools, standards, knowledge, and required infrastructure. This activity creates a baseline for the project management practice within the organization. At this stage, the various processes and elements are weighted differently, according to expert opinion.
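One plausible way to use the experts’ weights, sketched below, is a weighted average of element results for each process. This aggregation scheme is an assumption made for illustration; the article does not specify how PMA 2000 combines the weights.

```python
# Illustrative only: one way to combine the experts' weights into a single
# compliance score per process. The weighting scheme here (a weighted average
# of yes/no element results) is an assumption; PMA 2000 may aggregate differently.
def weighted_score(results):
    """results: list of (weight, implemented) pairs for one process."""
    total_weight = sum(w for w, _ in results)
    if total_weight == 0:
        return 0.0
    return sum(w for w, ok in results if ok) / total_weight


cost_estimating_results = [
    (3.0, True),   # heavily weighted element, implemented
    (1.0, False),  # lightly weighted element, not yet implemented
    (2.0, True),
]
print(f"{weighted_score(cost_estimating_results):.0%}")  # 83%
```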

Stage 4 is review preparation, when it is decided who will participate in the assessment. Our experience shows that 20–30 percent of the organization participates. These people represent those who play a part in planning the project and managing the implementation processes. With the help of the software, questionnaires are prepared for the participants, tailored to each, and include the elements that a particular person must master at the required level (not everyone is required to master all elements).

Stage 5 is the actual review in which the participants reply to computerized questionnaires. Before beginning this stage, all those involved must meet and prepare.

Stage 6 is a preliminary analysis and verification of the results of the review. The system highlights discrepancies between replies, revealing where a more in-depth examination is needed. In some cases the additional examination will be face to face, by means of an interviewer who discusses the issue with the examinee; in other cases the expert opinion of an authorized person within the organization will suffice. This stage ends with the preparation of a report for management, listing the problems ranked in order of importance together with proposals for solving them. Problems are sorted into areas, processes, and enablers, and prepared for presentation.
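The discrepancy check that triggers a deeper, face-to-face examination can be illustrated with a small sketch: any question that received conflicting replies from different respondents is flagged for follow-up. The data layout, question texts, and respondent names below are hypothetical.

```python
# Minimal sketch of the discrepancy check described above: flag any question
# that received conflicting replies from different respondents, so it can be
# followed up in a face-to-face interview. Data layout is an assumption.
from collections import defaultdict

replies = [
    ("Is a risk register maintained for every project?", "Dana", True),
    ("Is a risk register maintained for every project?", "Avi", False),
    ("Is earned value reported monthly?", "Dana", True),
    ("Is earned value reported monthly?", "Avi", True),
]

answers_by_question = defaultdict(set)
for question, respondent, answer in replies:
    answers_by_question[question].add(answer)

discrepancies = [q for q, answers in answers_by_question.items() if len(answers) > 1]
print(discrepancies)  # questions that need an in-depth (interview) examination
```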

Stage 7 is a management review and discussion during which the findings are presented and alternative solutions are proposed. At this meeting (or series of meetings) the order of priorities proposed by the project team for solving problems will be discussed, as well as alternatives, necessary resources, and timetables for handling the problems. This meeting will decide who is to be responsible for implementing the necessary improvements.

Stages 8 and 9 cover the decisions concerning the plans for implementing the improvements. The system records the improvement plans according to the person responsible, distributes the plans to the various people responsible, monitors progress in the implementation, and reports when the process has been completed.
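A minimal sketch of the bookkeeping implied by stages 8 and 9 might look like the following, with each improvement plan recorded against a responsible person and progress tracked to completion. The fields and sample plans are illustrative, not taken from the system.

```python
# Hypothetical sketch of the improvement-plan bookkeeping in stages 8-9:
# each plan is recorded against a responsible person, progress is updated,
# and completion can be reported. Field names are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class ImprovementPlan:
    element: str
    responsible: str
    done: bool = False


plans = [
    ImprovementPlan("Cost summaries by organizational structure", "Finance lead"),
    ImprovementPlan("Standard risk-assessment tool rolled out", "PMO"),
]

plans[0].done = True  # the first improvement has been implemented

open_items = [p for p in plans if not p.done]
print(f"{len(plans) - len(open_items)} of {len(plans)} plans completed")
for p in open_items:
    print(f"Still open: {p.element} ({p.responsible})")
```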

Stage 10 is a reassessment of the elements after the improvements have been implemented. This will show where gaps have already been closed, the pace at which they are being closed, where new gaps have opened, and where recurrent or new problems must be evaluated.

Continuous Improvement of Project Management Practices

Continuous improvement of project management practices is made possible by removing from follow-up reviews all the elements that have already been assimilated and implemented, and by adding new elements of project management practices to be assimilated within the organization (new procedures, project management information systems, work techniques, or standards). Preparing a review of this type and repeating the process presented above allows the organization to constantly push its project management practices forward and to adapt itself to necessary changes on an ongoing basis.
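The follow-up review described above amounts to a simple set operation: elements already assimilated drop out of the next review, and newly adopted practices are added. The sketch below illustrates this under the assumption that elements are tracked by name.

```python
# Sketch of the follow-up review described above: elements already assimilated
# drop out of the next review, while newly adopted practices are added in.
# The set-based representation and element names are assumptions for illustration.
current_review = {"WBS-based cost summaries", "Risk register", "Change control board"}
assimilated = {"Risk register"}                     # verified as implemented
new_elements = {"New scheduling-tool procedure"}    # e.g., a newly adopted standard

next_review = (current_review - assimilated) | new_elements
print(sorted(next_review))
```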

The PMA 2000 methodology allows an organization to significantly improve its project management practices by involving the entire organization in a change effort that is detailed and planned down to a predefined level for each process. Repeated reviews allow new elements to be added to the system in an ongoing fashion. This approach views the maturity model as an open, flexible tool that can be modified in line with needs, rather than employed merely as a benchmarking tool. It thus allows the organization to adopt a TQM approach and to continuously push its project management practices forward. ■


Shay Lubianiker, PMP, president of Leshem-Nituv Engineers (www.leshem.co.il), is a senior consultant, trainer, and lecturer in project management, and founder and vice president (1997–1999) of the PMI Israel Chapter.

This material has been reproduced with the permission of the copyright owner. Unauthorized reproduction of this material is strictly prohibited. For permission to reproduce this material, please contact PMI.

PM Network March 2000
