Improving project system performance through benchmarking

Introduction

Benchmarking is two things: setting goals by using objective, external standards and learning from others (Boxwell, 1994). Goal setting theory states that goals affect performance by focusing attention on goal-relevant factors and regulating effort. In addition, setting “hard” goals that are accepted increases persistence (Locke & Latham, 1985). Benchmarking provides the external standards needed to set achievable goals that are both challenging and accepted by those responsible. However, benchmarking is more than just setting goals. Setting best-in-class goals is useless unless you also learn how the best companies achieve their performance (Boxwell, 1994). Thus, benchmarking also involves learning from others and changing existing processes and methods accordingly.

So why benchmark? As Boxwell states, “Because it makes so much sense, that's why” (Boxwell, 1994, p. 38). Benchmarking is deceptively simple: identify the best, study and learn from them, and implement improvements that will work in your own situation based on that learning (Boxwell, 1994). Benchmarking is established as an important tool in manufacturing, and it is not surprising that its use has expanded to other industries (Emhjellen, 1998). So what about benchmarking the project management process?

Benchmarking Project Management

Benchmarking is just as applicable to project management as it is to other endeavors. Project management has always applied goal setting theory. Almost every project establishes cost and schedule targets. In addition, projects set other goals such as quality, customer satisfaction, safety, and operational performance. Given project management's use of goals, identifying goals as part of benchmarking should be easy to adopt. The second element of benchmarking, learning from others, is also applicable to projects. Project management training courses and certification programs indicate that improvements can be gained by learning from others and that practitioners are open to learning by means other than direct experience.

However, in practice, benchmarking project management is difficult to implement due to the following:

  • Deciding what to measure is not always clear. Different projects have different drivers and priorities. There is no single measure of project success that can be universally applied as a basis for benchmarking.
  • Obtaining reliable competitor data is also difficult. Project performance data are normally confidential and most organizations are reluctant to share those data with other organizations in a benchmarking partnership.
  • Identifying the right learnings can also be challenging. Projects are complex enterprises that often involve multiple stakeholders with conflicting objectives. In this environment, it is difficult to determine the critical learnings from one project that should be implemented in a new project.
  • Maintaining a benchmarking process requires discipline and resources. Organizations that are focused on executing projects normally do not have expertise in benchmarking. In addition, when the workload increases and resources are limited, these organizations will often abandon benchmarking and focus on short-term demands.

Despite these barriers, project management benchmarking is underway. A search of the PMI web site returns 86 articles related to benchmarking. Practitioners, consultants, and academicians recognize the power of benchmarking and its potential to improve project performance.

Approaches to Project Management Benchmarking

A literature review found that references to project management benchmarking fell into three categories: maturity models, focused studies, and web-based benchmarking.

Maturity Models

The most frequently cited approach to project management benchmarking is through project management maturity models. Project management maturity can be described as the organization's receptivity to project management (Suares, 1998). Several different maturity models have been introduced to the project management community (Pennypacker & Grant, 2003). In fact, the Project Management Institute (PMI®) has published one such model (PMI, 2003). These models attempt to measure an organization's level of project management maturity through a rating system based on the extent that different practices, processes, and skills are in place.

Different consultants and organizations use these models as bases for benchmarking the maturity of, or receptivity to, project management. However, this approach is primarily focused on practices or processes and does not provide the reliable outcome performance metrics necessary to gauge the effect of implementing the learnings. Two published articles noted that the field of project maturity models is relatively young and lacks the empirical support needed to determine which competencies contribute most to project success (Jugdev & Thomas, 2002; Skulmoski, 2001).

Focused Studies

Another approach to benchmarking project management is through a one-time study of a group of projects. One example of this approach focused on project management practices in electric utilities and was presented at the PMI 1995 Seminars & Symposium (Brunner, McLeod, & LaLiberte, 1995). In this study, 151 projects were evaluated. The researchers identified insights into project management and suggested strategies to improve project performance. However, this was a single study, completed in 1995, and no follow-up work has taken place.

Web-Based Benchmarking

Another approach is to benchmark through a web-based interface managed by a third party. The third party acts as a facilitator of the benchmarking process and maintains a database used to establish norms. An example of this approach is the ongoing benchmarking effort of the Construction Industry Institute (CII) in the United States. The CII is a privately funded research institution at The University of Texas at Austin that has developed a web-based benchmarking system for the construction industry (Lee, Thomas, & Tucker, 2005). Member companies contribute data on individual engineering and construction projects through a web portal. As of September 2005, the database included 1,420 projects from different sectors of the construction industry, including building, heavy industrial, infrastructure, and light industrial (CII, 2005).

However, the scope of the CII benchmarking does not include detailed, judgmental analysis of individual projects or company performance. CII benchmarking via the web provides only computer-generated reports on performance and practice use, norms for comparison purposes, and reports of industry analyses (Lee, Thomas, & Tucker, 2005).

Benchmarking Consortium

Another approach to project management benchmarking is a cooperative effort in the form of a benchmarking consortium. A consortium is a cooperative agreement among groups or institutions; a benchmarking consortium, then, is such an agreement formed specifically to benchmark. The consortium approach adapts the fundamentals of benchmarking and addresses the special needs of benchmarking among organizations that are in direct competition (DeVito & Morrison, 2000).

A consortium typically needs an independent third party to administer the process. The most important role of the third party is to maintain benchmark data and guard the confidentiality of member information. The third party must also provide the technical expertise and conduct appropriate development work to expand the capabilities and relevance of the program. Finally, the third party must be a reliable partner that remains focused on benchmarking, irrespective of the business conditions and resource constraints of the member organizations. The following section of this paper presents one example of a benchmarking consortium. It provides details regarding the benchmarking process followed, the databases and analysis tools used, and the benefits to member organizations.

Industry Benchmarking Consortium

Since 1990, Independent Project Analysis, Inc. (IPA) has served as the independent third party for the Industry Benchmarking Consortium (IBC). IPA is a project management research and consulting firm based in Ashburn, Virginia, in the United States. The IBC is a benchmarking consortium focused on capital projects in the process industries. Capital projects in the process industries involve the construction of physical plant facilities and materials processing equipment, either to produce a new product for expected profit or to maintain or develop operating capabilities (Scott-Young & Samson, 2004).

Membership in the IBC is limited to owner organizations that are responsible for the planning, design, construction, and startup of industrial capital projects. The IBC has approximately 40 member companies from a wide range of industries, including oil exploration and production, petroleum refining, chemicals, mining and minerals, pharmaceuticals, consumer products, and forest products. All members agree to adhere to a formal benchmarking code of conduct, which specifies requirements and expected behaviors. IBC member companies include Alcan, Alcoa, BHP Billiton, Borealis, BP, Degussa, DuPont, Dow, ExxonMobil, Merck, Sanofi Pasteur, Shell, Statoil, Wacker, and several other owners of capital-intensive manufacturing facilities.

Methodology

IPA's methodology for benchmarking capital projects involves building carefully normalized project databases. Data are collected in face-to-face interviews using a structured questionnaire, which gathers information about project objectives, scope, technology, costs, schedule, and project management practices. Over 2,000 different data elements are collected as well as copies of key documents such as cost estimates and schedules. Typically, these interviews are conducted at two points in the project life cycle: (1) at the time of project authorization and (2) at the end of the project after mechanical completion and startup (Griffith, 2005). The data collected in these interviews are then translated into relational databases, which are the primary tools used for measuring performance and identifying Best Practices.
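
To make the structure of such a database concrete, the following Python sketch shows what one normalized project record might contain. The field names and figures are hypothetical stand-ins for a small fraction of the 2,000-plus data elements actually collected; they are not IPA's actual schema. The two cost fields reflect the paper's two-interview design (authorization and closeout), and the derived ratio illustrates a predictability metric of the kind discussed later.

```python
# A toy sketch of a normalized project record; field names are
# hypothetical, not IPA's actual schema.
from dataclasses import dataclass

@dataclass
class ProjectRecord:
    project_id: str
    industry_sector: str            # e.g., refining, chemicals
    project_type: str               # e.g., revamp, greenfield
    authorized_cost_usd: float      # captured at the authorization interview
    actual_cost_usd: float          # captured at the closeout interview
    authorized_duration_months: float
    actual_duration_months: float
    fel_index: float                # front-end loading rating at authorization

    @property
    def cost_growth(self) -> float:
        """Cost predictability: actual cost versus the authorized estimate."""
        return self.actual_cost_usd / self.authorized_cost_usd

record = ProjectRecord("P-001", "refining", "revamp",
                       45e6, 49.5e6, 24.0, 26.0, 5.5)
print(f"cost growth = {record.cost_growth:.2f}")  # 1.10
```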

Using these databases, IPA develops statistical models, builds comparison groups, and conducts research. The statistical models determine industry-average performance for several different outcome metrics and are used to gauge the absolute performance of a specific project against industry benchmarks. In addition, IPA compiles comparison groups of projects with similar characteristics, which are used to determine industry averages for specific performance metrics and to validate the statistical models. The databases are also used for basic research: the IBC has sponsored over 100 different research studies, and the results of these studies are fed back into the benchmarking process.
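
IPA does not publish its model forms, so the following is only a minimal sketch, assuming a simple log-log regression of project cost on capacity, of how a statistical model can produce an industry-average benchmark for a specific project. All data, variable names, and the choice of regressors are invented for illustration.

```python
# Illustrative only: fit an assumed log-log cost model to hypothetical
# historical projects, then benchmark one project against its prediction.
import numpy as np

# Hypothetical history: (capacity in units/day, actual cost in USD millions).
capacity = np.array([100, 250, 400, 800, 1500, 3000], dtype=float)
cost = np.array([12.0, 24.0, 35.0, 60.0, 98.0, 170.0])

# Fit log(cost) = b0 + b1 * log(capacity) by ordinary least squares.
X = np.column_stack([np.ones_like(capacity), np.log(capacity)])
coef, *_ = np.linalg.lstsq(X, np.log(cost), rcond=None)

def industry_average_cost(cap: float) -> float:
    """Predicted industry-average cost for a project of this capacity."""
    return float(np.exp(coef[0] + coef[1] * np.log(cap)))

# Benchmark a specific project: an index of 1.0 means industry average;
# above 1.0 means more expensive than comparable industry projects.
actual, predicted = 50.0, industry_average_cost(600)
print(f"cost index = {actual / predicted:.2f}")
```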

With these analysis tools and research, member companies benchmark their individual projects and project systems against specific comparison groups. Member organizations systematically add projects to the database and each project is evaluated. Outcome performance metrics and the application of Best Practices are benchmarked. Consortium members receive feedback with a detailed explanation of the results, conclusions, and recommendations. In addition, a company may have all of its projects compiled into a single project system analysis, which looks for overall system trends and learnings.

This approach provides member companies with an analytically robust platform for benchmarking on a continuous basis. The analysis tools provide reliable external standards for setting competitive, but achievable, goals on both an individual project basis and on a system basis. In addition, the research studies and best practice metrics provide companies with the critical learnings that make it possible to achieve the goals they have set.

Database Characteristics

The projects in IPA's databases cover a wide range of industries, types, sizes, technologies, and characteristics. The consortium benefits from project databases focused on upstream exploration and production, information technology, buildings and civil works, small projects, and extremely large projects. There is also a benchmarking database for plant shutdown and turnaround projects. IPA's primary database, the Downstream Process Plants Database, is made up of projects in the downstream process industries, such as refining and chemicals, and has the following characteristics:

  • More than 8,000 different projects in total
  • Projects from over 275 different organizations including large international owners and smaller companies located in only one country or region
  • Average estimated cost of $39 million (USD) with a range from under $100,000 to over $4 billion
  • Projects located in 78 different countries from all over the world
  • All project types including revamps (40%), colocated (15%), expansion (14%), add-on (20%), greenfield (7%), and other (4%)

Database Normalization

The database and analysis tools are carefully normalized in order to achieve an “apples to apples” basis for comparisons. All cost data are converted to a single currency and de-escalated to a common year. Appropriate location adjustments based on local labor rates and productivity factors are made as part of every project evaluation. Costs and schedules are adjusted for any external factors such as strikes or extreme weather. In addition, the statistical models are designed to control for project characteristics such as size, technology, and scope. When applied to a specific project, the statistical models adjust for the unique characteristics of the project and produce an industry average benchmark for comparable projects along the range of the model distribution.
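
A minimal sketch of this kind of cost normalization appears below. The exchange rates, escalation indices, and location factors are hypothetical placeholders, not IPA's actual adjustment data; the point is only the order of adjustments: convert to a single currency, de-escalate to a common year, then apply a location factor.

```python
# Hypothetical adjustment tables; real benchmarking uses carefully
# researched rates, indices, and productivity-based location factors.
EXCHANGE_TO_USD = {"USD": 1.0, "EUR": 1.25, "GBP": 1.80}   # assumed rates
ESCALATION_INDEX = {2003: 0.92, 2004: 0.96, 2005: 1.00}    # common year 2005
LOCATION_FACTOR = {"US Gulf Coast": 1.00, "Germany": 1.10, "India": 0.65}

def normalize_cost(amount, currency, year, location, base_year=2005):
    """Express a project cost in constant base-year USD at a reference location."""
    usd = amount * EXCHANGE_TO_USD[currency]
    constant_usd = usd * ESCALATION_INDEX[base_year] / ESCALATION_INDEX[year]
    return constant_usd / LOCATION_FACTOR[location]

# Example: EUR 80 million spent in 2003 on a German project, restated
# in 2005 USD at the reference location.
print(f"{normalize_cost(80e6, 'EUR', 2003, 'Germany'):,.0f} USD")
```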

Database Confidentiality

In order for the consortium to function, confidentiality is paramount; members will not participate in the process unless their information is protected. The databases contain sensitive company data, and a critical role of the third-party facilitator is to protect that information. Confidentiality is strictly maintained through different layers of protection. All participants in the benchmarking consortium agree to a strict confidentiality agreement, and IPA employees also sign a comprehensive confidentiality agreement. Databases and related files are maintained on a secure network with tight control over access and use.

Project Evaluations

IBC member companies receive detailed feedback on project benchmarking evaluations. The feedback reports detail a number of different outcome metrics because there is no single universal measure of project success. Competitive project systems must consider all project performance metrics and the balance between priorities. In addition, the feedback reports provide detailed evaluations of the results, which assist in understanding root causes and identifying appropriate learnings. The detailed feedback provides the industry-average performance and best-in-class performance for comparable projects, which can serve as stretch goals. The reports also give benchmarks for applicable subgroups, such as a specific industry sector. In addition, the reports include metrics on the application of Best Practices as a basis for learning how the best organizations achieve superior results. Performance metrics include both absolute and predictability performance. The performance benchmarks typically reported are presented in Exhibit 1.

In addition to the standard metrics outlined in Exhibit 1, evaluations provide detailed analysis of the findings, which help to explain why individual projects achieved the measured results. For example, in addition to determining a cost benchmark through the application of the appropriate statistical model, a detailed analysis is typically done on the various cost ratios (total cost/equipment costs, engineering costs/total cost, labor costs/equipment and bulk materials costs). These ratios are compared with the ratios of comparable projects from the database, which often indicate the cost category for a particular project that differs from the average. In terms of schedule, a detailed analysis of the benchmark results can evaluate individual project phases and overlaps between phases to better explain superior or inferior performance.
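
The sketch below illustrates the ratio analysis just described, assuming invented project costs and comparison-group averages; the specific ratios follow the paper, but the numbers are fabricated for the example.

```python
# Compare a project's cost ratios with hypothetical comparison-group
# averages to flag the cost category that differs most from the norm.
project = {"total": 50.0, "equipment": 14.0, "engineering": 9.0,
           "labor": 16.0, "bulk_materials": 8.0}  # USD millions, invented

ratios = {
    "total/equipment": project["total"] / project["equipment"],
    "engineering/total": project["engineering"] / project["total"],
    "labor/(equipment+bulks)": project["labor"]
        / (project["equipment"] + project["bulk_materials"]),
}

# Hypothetical averages for a comparison group of similar projects.
comparison_group = {"total/equipment": 3.2, "engineering/total": 0.14,
                    "labor/(equipment+bulks)": 0.60}

for name, value in ratios.items():
    delta = (value - comparison_group[name]) / comparison_group[name]
    print(f"{name}: {value:.2f} vs {comparison_group[name]:.2f} ({delta:+.0%})")
```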

Exhibit 1 – Project Benchmark Metrics

IBC Conference

Every year the consortium holds a conference for member companies that is intended to be a free exchange of ideas and learnings. Because many of the companies involved are direct competitors, the conference is governed by a strict code of conduct designed to avoid any discussions or actions that might lead to, or imply an interest in, restraint of trade, market or customer allocation schemes, dealing arrangements, bid rigging, bribery, or misappropriation. Each member company has agreed to share performance metrics and practices in a cooperative effort to improve their project systems. In addition, many companies volunteer to deliver presentations focused on Best Practices, case studies, or lessons learned. Many consortium-sponsored research studies are also presented during this conference.

However, the most attention is given to the presentation of company metrics, when the outcome and input performance metrics for each company are presented to the entire audience. Company logos are used as markers on presentation slides to show the relative performance of each member for a given metric. The result is friendly competition, goal setting, and critical learning.

Exhibit 2 is a sample graph from this conference with company identities masked. The graph presents each company's average absolute cost performance as an index with the industry average anchored at 1.0; a result above 1.0 indicates cost effectiveness worse than the industry average, and a result below 1.0 indicates cost effectiveness better than the industry average. The sample is also divided equally into five groups of companies, or quintiles. The horizontal axis shows the average Front-End Loading Index (a measure of the extent of critical definition work completed prior to authorization). Companies that systematically complete better levels of definition prior to authorization also tend to achieve more competitive cost performance. Data like these drive home the importance of Best Practices and show which companies are most successful at applying them.

Exhibit 2 – Relationship between Front-End Loading Index and Cost Effectiveness
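
As a small sketch of how the index ranking and quintile grouping behind a graph like Exhibit 2 could be computed, consider the following; the per-company indices are fabricated for illustration and do not reflect any IBC member's results.

```python
# Rank companies by average cost index (industry average = 1.0) and
# split the sample into five equal quintiles, as in Exhibit 2.
import statistics

cost_index = {"A": 0.88, "B": 0.94, "C": 0.97, "D": 1.00, "E": 1.03,
              "F": 1.05, "G": 1.08, "H": 1.12, "I": 1.18, "J": 1.25}

# Most cost-effective (lowest index) first.
ranked = sorted(cost_index, key=cost_index.get)
size = len(ranked) // 5
quintiles = [ranked[i * size:(i + 1) * size] for i in range(5)]

for q, members in enumerate(quintiles, start=1):
    avg = statistics.mean(cost_index[m] for m in members)
    print(f"Quintile {q}: {members} (mean index {avg:.2f})")
```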

Benefits

The benefit to members is the ability to continuously benchmark the effectiveness of their project delivery systems. They can measure their project system against the industry average and selected comparison groups. Members are also able to benchmark themselves against some of the world's leading industrial owners. Using these comparisons, companies can identify areas of weakness that should be addressed and recognize areas in which they already excel and need only maintain performance. In addition, members have the opportunity to learn about the Best Practices that drive superior performance. Companies take this information and develop plans for implementing the learnings, with the goal of improving capital effectiveness through better execution of their capital projects.

Conclusions

When applied correctly, a consortium is an effective approach to benchmarking project management. This approach can overcome the barriers noted in the introduction of this article and address the shortcomings identified in the other approaches cited. Specifically, the consortium approach provides the following advantages:

  • A benchmarking consortium is able to provide a wide range of metrics that can be used to measure effectiveness. The wide range of metrics is useful in evaluating the balance between conflicting objectives and priorities inherent in complex projects. This collaborative approach makes detailed data collection, analyses, and feedback possible, which generates a wider range of benchmarks. The detailed and tailored feedback process enables the company to compare specific project and system priorities with the benchmarking results.
  • A benchmarking consortium allows for comparisons based on confidential and sensitive information from other organizations, including direct competitors. Organizations are more likely to share detailed data if they are confident that confidentiality will be maintained, which is a critical role of the third party facilitator in a benchmarking consortium. Face-to-face interviews, clearly specified confidentiality agreements, and sound data management all make it possible to learn from what is normally unavailable proprietary information.
  • A benchmarking consortium is better able to identify the critical drivers of success because of the amount and richness of the data that can be obtained through the face-to-face interviews. The data provided by members make it possible to conduct comprehensive analyses and research, which can quantitatively demonstrate the contribution of specific practices. Because the consortium is an ongoing entity, data can be compiled and analyzed over a number of years. With more detailed and complete data, it is possible to discover more drivers of superior project results.
  • A benchmarking consortium provides the ongoing continuity and expertise that are often missing in individual member organizations. Third-party facilitators provide the expertise and maintain the databases and analysis tools. Individual companies are free to deal with business cycles and resource limitations without falling into the trap of abandoning benchmarking because of short-term demands.

References

Boxwell, R. J., Jr. (1994). Benchmarking for Competitive Advantage. New York, NY, USA: McGraw-Hill Professional Publishing.

Brunner, W., McLeod, D., & LaLiberte, K. J. (1995). Benchmarking Provides Insights on How to Improve Project Management Performance. Proceedings of the PMI 1995 Seminars & Symposium, New Orleans, LA, USA, 16-18 October 1995, pp. 736-741. Newtown Square, PA, USA: Project Management Institute.

Construction Industry Institute (CII). (2005, September). CII Product Implementation Workshop: Orientation. CII Product Implementation Workshop, Austin, TX, USA, 14-15 September 2005.

DeVito, D., & Morrison, S. (2000). Benchmarking: A Tool for Sharing and Cooperation. The Journal for Quality & Participation, Fall 2000, 56-61.

Emhjellen, K. (1998). Improving Project Management through Benchmarking. Proceedings of the PMI 1998 Seminars & Symposium, Long Beach, CA, USA, 9-15 October 1998, pp. 1123-1126. Newtown Square, PA, USA: Project Management Institute.

Griffith, A. F. (2005). Scheduling Practices and Project Success. 2005 AACE International Transactions, New Orleans, LA, USA, 26-29 June 2005, Paper PS.05. Morgantown, WV, USA: Association for the Advancement of Cost Engineering International.

Jugdev, K., & Thomas, J. (2002). Project Management Maturity Models: The Silver Bullets of Competitive Advantage? Project Management Journal, 33(4), 4-14.

Lee, S., Thomas, S. R., & Tucker, R. L. (2005). Web-Based Benchmarking System for the Construction Industry. Journal of Construction Engineering and Management, 131(7), 790-798.

Locke, E. A., & Latham, G. P. (1985). Goal Setting for Individuals, Groups, and Organizations. Chicago, IL, USA: Science Research Associates.

Pennypacker, J. S., & Grant, K. P. (2003). Project Management Maturity: An Industry Benchmark. Project Management Journal, 34(1), 4-11.

Project Management Institute (PMI). (2003). Organizational Project Management Maturity Model (OPM3): Knowledge Foundation. Newtown Square, PA, USA: Project Management Institute.

Scott-Young, C., & Samson, D. (2004). Project Success and Project Team Human Resource Management: Evidence from Capital Projects in the Process Industries. Proceedings of the PMI Research Conference, London, England, Paper RC04SCOTT. Newtown Square, PA, USA: Project Management Institute.

Skulmoski, G. (2001). Project Maturity and Competence Interface. Cost Engineering, 43(6), 11-18.

Suares, I. (1998). A Real World Look at Achieving Project Management Maturity. Proceedings of the PMI 1998 Seminars & Symposium, Long Beach, CA, USA, 9-15 October 1998, pp. 362-366. Newtown Square, PA, USA: Project Management Institute.

This material has been reproduced with the permission of the copyright owner. Unauthorized reproduction of this material is strictly prohibited. For permission to reproduce this material, please contact PMI or any listed author.

©2006 Andrew Griffith, PhD, PE, PMP
Originally published as part of 2006 PMI Global Congress Proceedings – Madrid, Spain
