It’s report card time! An approach to measuring and understanding project management behaviors
Corporate America is placing an increasing emphasis on behaviors. Historically, success has been emphasized and rewarded, while the behaviors used to achieve those successes were often downplayed. Today, a shift in thought is occurring. Corporate America observes and measures behaviors, not just as an enabler to success, but also as an additional indicator of success.
Motorola is no exception to this changing environment. Behaviors are stressed as a key part of job performance and integrated into employee and organizational performance assessments. Adapting to this change has been a challenge for Motorola's project organizations, where success has traditionally been measured by meeting commitments to scope, schedule, and budget.
The traditional project metrics may not reflect the true picture of project management effectiveness. Effective project management requires the participation and support not only of the project manager and project team, but also of the project sponsors and, where appropriate, the program management office. Because of this, there exists a need to measure the behaviors of the organization from a project management perspective and translate those measures into meaningful results to be used as a means to continuous improvement. Additionally, the results need to be easily translated into meaningful data points for individual performance assessment.
This paper will focus on how Motorola Semiconductor Products Sector's Service & Logistics organization has addressed measuring and understanding the key behaviors required for success in project management.
The Service & Logistics Program Management Office (PMO) decided that the most effective method to measure behaviors was to create a series of report cards. The report cards would be used to measure the behaviors of the three key leadership roles in Motorola's project management culture:
• Project managers
• The program management office (PMO)
• Project sponsors
Measuring and understanding the behaviors of the people who perform these functions would identify areas for individual and organizational improvement.
While determining whose behaviors to measure was part of the puzzle, actually developing the report cards became the biggest challenge. A quick benchmark of Motorola project organizations indicated that there were no existing examples or lessons learned to leverage. Fortunately, a good amount of objective data was available on project manager behaviors. So, it was decided, the project manager report card would be developed first.
The Project Manager Report Card
Prioritizing the project manager report card first made a lot of sense because of the project manager's large impact on the day-to-day running and overall success of a project. The first step taken to develop the project manager report card was to identify:
• Project manager behaviors that provided a significant opportunity for improvement
• Data that could be objectively measured.
Data from an internal project management information system indicated that project performance goals were not being met. PMO analysis determined that the behaviors around risk and issue management, scope management, and project planning required improvement. But measuring the behaviors associated with these project management fundamentals was not enough. Other project manager behaviors needed improvement, even if their contribution to the overall success of a project was not as apparent:
• Was the project approved?
• Was a financial analysis prepared and approved?
• Does communication with the Project Sponsor happen on a regular basis?
• Are project status reports prepared weekly?
Exhibit 1. Project Manager Report Card
After identifying behaviors to track, a measurement approach had to be developed. The internal project management information system used to identify areas for improvement could also serve as a source for determining whether most of the selected behaviors were occurring. This source was objective, a key element needed to build a credible report card. Where the internal system did not provide objective data, other sources were identified. If no other data source was readily available, then project management processes were modified to produce the necessary data.
Once the areas for measurement and their data sources were identified, a method to present the data was needed. Categorizing the behaviors helped to simplify and organize the report card. The logical categories were Documentation and Communication.
A scoring system was needed. Instead of inventing a new system, the company's performance assessment system scale was used. This gave the report card a familiar scale, one that could easily be used to provide input on an individual project manager's performance. The scale used in our performance assessment system is:
• 1—Does not meet expectations
• 2—Some improvement needed
• 3—Meets expectations
• 4—Exceeds some expectations
• 5—Consistently exceeds expectations.
Trying to tie the behaviors to this scale initially proved to be difficult. Some of the behaviors were objective and were easy to measure, such as “is a project status report prepared weekly.” But others had subjective elements, such as the quality or thoroughness of a deliverable. Of great concern was how to accurately measure the subjective elements. For example, how should “is the project initiation document detailed and complete including deliverables and benefits/cost” be measured? The conclusion was that because there were variances in types of data being measured the scoring process had to be adjusted to accommodate those variances. The agreed upon scoring process was:
• Objective data that results in a response of either yes or no is scored as 3 (meets expectations) or 1 (does not meet expectations).
• Objective data with a subjective element is scored using the 1 (does not meet expectations) to 5 (consistently exceeds expectations) scale. The PMO reviews each deliverable using best-in-class examples and guidelines to determine the score.
• Slippage is scored according to the percentage (+/-) by which the forecast/actual end date varies from the baseline end date. The Service & Logistics organization established a slippage goal of 12.5% or less. Projects that slip more than 12.5% are scored as 1, projects that slip between 0.1% and 12.5% are scored as 3, and projects that are on time or early (without a change in scope or cost) are scored as 5.
• Not applicable (N/A) is assigned to deliverables that cannot be scored. This happens only when a deliverable is not applicable due to project life-cycle phase.
Because not all behaviors are scored using the entire scale of 1–5, an aggregate highest possible score of 3.9 was determined. Exhibit 1 represents a typical summarized project manager report card.
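The scoring rules above can be sketched in code. This is an illustrative sketch only, assuming the rules as stated in the text (yes/no behaviors map to 3 or 1, slippage is scored against the 12.5% goal, and N/A items are excluded from the aggregate); the function names and data shapes are hypothetical, not Motorola's actual implementation.

```python
# Hypothetical sketch of the report card scoring rules described above.
# The 12.5% slippage goal and the 1/3/5 score bands come from the text;
# everything else (function names, data shapes) is illustrative.

SLIPPAGE_GOAL = 12.5  # percent

def score_yes_no(met: bool) -> int:
    """Objective yes/no behaviors: 3 (meets) or 1 (does not meet)."""
    return 3 if met else 1

def score_slippage(baseline_days: float, actual_days: float) -> int:
    """Score the forecast/actual end date against the baseline end date."""
    slip_pct = (actual_days - baseline_days) / baseline_days * 100
    if slip_pct <= 0:
        return 5   # on time or early (assuming no change in scope or cost)
    if slip_pct <= SLIPPAGE_GOAL:
        return 3
    return 1

def aggregate(scores: list) -> float:
    """Average all scored behaviors, excluding N/A (None) entries."""
    scored = [s for s in scores if s is not None]
    return round(sum(scored) / len(scored), 2)

# Example card: a yes/no item, a PMO-reviewed subjective score,
# a slippage score, and one deliverable that is N/A for this phase.
card = [
    score_yes_no(True),        # weekly status report prepared -> 3
    4,                         # subjective deliverable, scored 1-5 by the PMO
    score_slippage(100, 110),  # 10% slip, within the 12.5% goal -> 3
    None,                      # N/A due to project life-cycle phase
]
print(aggregate(card))  # -> 3.33
```

Because yes/no items can never exceed 3, the aggregate ceiling sits below 5, which is consistent with the 3.9 highest possible score noted above.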
The project manager report card is produced and presented to the Service & Logistics management team each quarter. This provides frequent information on project management institutionalization and makes the information available for quarterly performance assessment checkpoints. The PMO takes responsibility for collecting and reporting all project manager report card data. Three experienced project managers staff the Service & Logistics PMO. Although the report cards are presented quarterly, data is collected weekly. One member of the PMO is assigned primary responsibility to collect all data and report it in the report card format. Prior to publication, the PMO meets as a team to review all report card scores. This ensures consistency of scoring and provides an opportunity to consider data points and experiences that might have been missed in the initial data collection. Notations and comments supporting a score are made when appropriate, and always when a behavior has an extremely low or high score.
Exhibit 2. Report Card Results Over Time
Once the quarterly project manager report card is produced, it is presented to the Service & Logistics senior management team. Individual project manager data is provided to their managers. Further distribution is at the discretion of the Service & Logistics management team. Summary scores, rolled up to the organization level, are available to the entire organization. The PMO analyzes the data for systemic problems in project management processes. Where appropriate, continuous improvement efforts are initiated. Exhibit 2 provides an example of this summary data.
The initial reaction from senior management and project managers was apprehension; there was wariness over the objective of producing this report card. Meetings were held with all project managers before the first report card was published. Today, the PMO makes itself available to discuss concerns about any and all report card scores. If there is a mistake in a score, corrections are made immediately and an updated report card is published. But it is the objective nature of the data points that has had the greatest impact on reducing concerns over the report card results. The project manager report card is in its fifth cycle, and acceptance of the process and results has matured. There are still concerns from time to time, but as project manager scores improve there is a corresponding positive trend in overall project performance metrics. The increased visibility of behaviors and the resulting improvements are reflected in the traditional project management metrics of on-time delivery within scope and budget requirements.
The Program Management Office Report Card
The PMO report card was developed almost out of a sense of fairness. How could the PMO expect project managers to accept the project manager report card if there was no such measurement of PMO behaviors? After all, the Service & Logistics project management philosophy required a solid teaming between the project manager and the PMO.
The effort to develop the project manager report card provided a standard approach to use when developing any report card:
• Determine what data is needed, concentrating on objective data
• Determine how to gather the data
• Develop a scoring system to apply to the data
• Tie the results to the company's employee performance assessment approach.
This approach was used to develop the PMO report card. But, a dilemma was encountered in determining what data to collect. The PMO could not find objective data points on PMO behavior. The internal project management information system used as a source for so much of the project manager data provided no data on PMO performance. And there were no other good options.
Exhibit 3. Example of Sponsor Survey
The PMO decided that to compensate for the lack of objective data, a survey would be developed. The questions on the survey focus on providing the PMO with feedback that can be used for continuous improvement. Additionally, the data is used in the quarterly performance assessment checkpoints. The survey contains a total of eight questions in the categories of:
• Continuous Improvement
A web-based, anonymous survey format was chosen for the PMO report card, which provided easy access across a global organization. Participants in the survey include current and recent project managers, project sponsors, organization staff, and clients.
This survey is released once per quarter. The survey asks participants to rate the PMO against the categories noted above and also provides room for additional comments, up to 1,000 characters per question. The guarantee of anonymity is crucial to receiving candid feedback. The results are tallied and comments are captured in a survey response document that is published on the Service & Logistics internal website. The survey has been a powerful tool. The feedback received allows the PMO to implement improvements to its processes and procedures in a timely manner.
The Sponsor Report Card
Within Service & Logistics, the sponsors make up the third leg of the project management triad. While they typically do not get involved in the day-to-day operations of the various projects, sponsors have a crucial role in the overall project management process. Service & Logistics sponsors are as critical to the success of the various projects as the project managers and PMO. The question then became one of how to best measure sponsor performance.
The best measure of effectiveness usually comes from objective data. But, little objective data existed to measure sponsor performance. As a result of this lack of objective data, it became apparent that subjective data would have to be obtained. Collection of data would have to be in such a manner that it would be:
• Easily obtained
• A measure of the behaviors expected of a sponsor
• Open to follow-on comments from respondents to clarify key points
Research was undertaken within Motorola and the Phoenix Chapter of the Project Management Institute to determine if other organizations had a method of measuring sponsor performance. None existed. The common response was “How would you measure such a thing?” Given the lack of benchmarking data, it was decided to pursue a course similar to what was already in place within Service & Logistics and develop a survey based upon sponsor responsibilities.
The first step was to make the survey web-based. This would allow for easy access and provide anonymity for the respondents. This would also allow for a tailored survey that could be answered in a short period of time and further provide for free-formed comments in support of the responses.
The next step was to decide what to measure. Some objective data did exist, but it was limited primarily to the number of projects each sponsor was sponsoring and the average slippage of the projects in their respective portfolios. Sponsor attendance at scheduled project reviews provided a third data point. Given the sponsor's importance throughout the project life cycle, and compared with the wealth of data the internal project management information system provides on project manager effectiveness, three data points were considered insufficient.
To overcome this problem, the PMO turned to the roles and responsibilities documents published for members of the project team. Such a document was created for sponsors when Motorola was beginning to implement standard project processes. From this document the various activities and responsibilities of a sponsor were evaluated. This documentation showed that there are three categories of sponsor activities that provide a foundation for measurement:
• Initiation—how involved is the sponsor in defining the project?
• Responsiveness—how involved is the sponsor in project execution?
• Partnership—does the sponsor act as a mentor and provide support in managing customer needs?
A series of five to seven questions was developed for each category, each question based on the sponsor's responsibilities in that category. To further validate the appropriateness of the questions, reviews were held with other Motorola PMOs. Numerical scores were assigned to each response and aligned to the individual performance assessment scale. A score of 3.21 would be considered perfect.
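Tallying the anonymous survey responses by category can be sketched as follows. The three category names come from the text; the response values, question counts, and data layout are hypothetical examples, not the actual survey data.

```python
# Illustrative sketch of tallying sponsor survey responses by category.
# Category names (Initiation, Responsiveness, Partnership) follow the
# text; the responses themselves are made-up examples.

from collections import defaultdict
from statistics import mean

# Each response: (category, numeric score on the performance scale)
responses = [
    ("Initiation", 3), ("Initiation", 4),
    ("Responsiveness", 3), ("Responsiveness", 2),
    ("Partnership", 5), ("Partnership", 3),
]

# Group scores by category so each can be averaged separately.
by_category = defaultdict(list)
for category, score in responses:
    by_category[category].append(score)

for category, scores in by_category.items():
    print(f"{category}: {mean(scores):.2f}")
```

Per-category averages like these make it easy to spot whether a sponsor is, say, strong at initiation but weak on responsiveness, and they roll up naturally into the single numeric score used for performance assessment.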
Project managers, the PMO, and where appropriate, clients, are invited to participate in the survey. The respondents are instructed to fill out a report card for each sponsor they are working with. If they are working with a sponsor on more than one project, the respondents are asked to complete the survey for each of the sponsor's projects.
The sponsor report card has been in place for two quarters. The current data hints at some trends, though it is too early to determine whether any are definite. Scores may depend on the number of projects in a sponsor's portfolio: a sponsor with many projects may not give each one as much attention as a sponsor with few. As with any implementation of a new tool or process, a period of adjustment is required before users become comfortable with the approach.
Feedback from some of the participants has been generally favorable, with concerns focusing on the anonymity of feedback. This has been addressed through the use of an online survey with anonymous postings. A partial sample of the survey questions is provided in Exhibit 3.
Lessons Learned
The following lessons learned will help an organization successfully implement behavior-oriented report cards.
• Base report cards on a set of desired behaviors. Considerable documentation exists outlining the roles and responsibilities of the project team. Identify those behaviors that are appropriate for your organization and baseline accordingly.
• Seek out and use objective data. A report card recipient better accepts results when the scores are based upon objective data. Improvement actions based upon objective data have a higher likelihood of making a significant impact.
• Modify processes to produce objective data where no objective data exists. For example, the project manager report card provides information on project manager and sponsor communication. No data point existed for this communication. To compensate for this, two standing questions were added to the monthly project review meetings:
• Who is your sponsor?
• When was the last time you spoke to him or her regarding this project?
These simple questions provided a wealth of data and understanding on project manager and sponsor communication. More importantly, publicly asking these questions reversed a negative trend in communications behavior.
• Web-enable surveys. It is easier to get user input this way rather than attaching a survey form to an email message. Web-enabled surveys also allow for easy use of drop-down selection boxes.
• Provide ample room for comments to survey questions. Understanding the rationale behind an answer helps to identify root causes and possible improvement activities.
• Limit the number of questions on a survey. The sponsor survey has 19 questions with space for comments. While not an overwhelming number of questions, future versions of the report card will contain fewer questions focusing on a wider set of behaviors.
• Translate results into a number. Assigning numerical scores to the responses helps identify behavior patterns. It also makes it possible to track and trend the long-term results.
• Fear of reprisals can dampen the response to a survey. In Service & Logistics, sponsors are often the direct managers of the project leaders. This reporting relationship makes project managers hesitant to answer some of the questions in the sponsor survey. The PMO has taken actions to ensure anonymity of response, including ensuring that several people (e.g., PMO, project manager, key clients) provide responses for each sponsor.
Conclusions
The Project Manager, PMO, and Sponsor report cards have each been in place for at least three quarters. Their implementation and early results are very encouraging. To date, the following conclusions have been made:
• An improved understanding of project management behaviors is leading to focused improvement efforts.
• A positive trend in project manager behaviors is leading to a greater number of projects completed on time, on budget, and to scope.
• The PMO is positively viewed as a partner whom the project leaders and sponsors can turn to for guidance and assistance.
• A flat trend in sponsor performance suggests a need for more education and an increased understanding of the importance of the sponsor's role in, and impact on, a project.
• Continuous improvement must be considered an integral part of the overall project management process, with all members of the organization participating.
• As behaviors improve and processes mature, it will be necessary for the report cards to evolve to the next level of measurement.
The Service & Logistics PMO believes that other project-oriented organizations will gain similar benefits by implementing behavior-based report cards.
Proceedings of the Project Management Institute Annual Seminars & Symposium
October 3–10, 2002 • San Antonio, Texas, USA