Project Management Institute

Are your projects supporting your vision statement?

Prove it!

Abstract

Every organization talks about a strategic vision and supporting focus areas. Sooner or later that vision translates into projects and initiatives that need to be implemented. Most organizations assess the success of these projects based on time, cost, scope, and client satisfaction, and invest heavily in developing the skills of project managers and teams to increase the chances of success, which is no doubt a best practice. However, few organizations have the discipline, time, or approach to assess project success based on its initial ROI and/or intended business outcome(s). Organizations must respond to recurring questions from senior management, namely: What is the return on investment (ROI)? We invested millions; where are the results? The questions seem simple to ask, yet answering them can be unexpectedly complex.

The focus of this presentation is on how to start changing the landscape, and hence expectations, about ROI: moving away from the current state of a one-number answer toward a collaborative, multi-factor view of value based on discussions with stakeholders, establishing a new paradigm for thinking about ROI. This paves the way for various metrics, quantitative and qualitative, to come into play. They become the foundation of a different approach called value of investment (VOI).

Mr. Haddad will draw on how ESI International measures the impact of its project management training programs with clients around the world as an approach, and an example, of how to put the case together for measuring the impact of a project, program, or initiative on the organization using VOI, not ROI. This approach includes the business case made before the project starts and the value-driven, often qualitative, measures that should be in the initial business case but frequently are not. It is important to note that these financial business cases are made long before the project is completed!

A VOI figure does not come easily from one source; rather, it is a combination of various data points that collectively, and hopefully, demonstrate a positive impact to even the most skeptical audience. Metrics are often company-specific but share many common themes. The chosen metrics depend heavily on where an organization is on the journey.

Finally, and to help operationalize this thinking, I will present a client example of how a 180/360-degree online Project Assessment Tool (PAT) was used, focusing on three areas: (1) business outcomes supporting the vision, (2) adherence to processes, and (3) the competency of the project manager or leader.

Current State of Affairs: What Are Some Gaps in the Project Performance Picture?

No matter what industry or sector one works in, a strategy is set in place. Sooner or later, this strategy will turn into a project, program, or initiative that needs to be implemented. An organization's capability to implement can often be the difference between a successful and an unsuccessful strategy. Over the past 30 years, I have worked with many organizations to help increase their project management capability in areas such as process/methodology, tools/templates, and developing core competencies in their people. All of that work maintained a focus on adding value to the organization. I tried to show that increasing the project delivery success rate by only a few percentage points could mean a huge benefit to the bottom line and to many other qualitative areas.

The industry has been focused on measuring the success of projects using the triple constraint, along with customer satisfaction. I refer to these as hard metrics. Getting better at these metrics is a must and a best practice with vast impact on the organization. However, few organizations have focused on the business outcome or value of a project based on its intended use and impact on the organization, at least in a measurable, repeatable, and accurate way.

The challenges include time, protecting the names of the innocent, and perhaps even complacency. In all fairness, it is not easy. Consider the questions practitioners commonly raise:

  • The hard metrics we measure on my projects tell a happy story, yet stakeholders grumble that we haven't met their expectations. How can I gather metrics that will help me effect change at a point where it will influence the outcome?
  • Are there stakeholder perspectives that we're missing in our performance picture? I can't tell who is saying what about our projects.
  • How can I turn stakeholder feedback into an internal marketing tool for my project or project management office (PMO)?
  • Can a more rigorous project evaluation approach, and the associated reporting packages, be used to engage the broader business stakeholders in projects and to demonstrate success?
  • Stakeholder feedback is part of the project file, but I can't really use it at a portfolio level. Can we make these data actionable for the organization?
  • Do I have a sense of baseline project and project manager performance before I begin investments in performance improvement? How can I track improvements against the baseline?

These are questions we wonder about, and they revolve around value: value to the organization, which is often in the eye of the beholder. The value that the hard metrics bring has been the subject of many recent studies and discussions. The intended benefit of a project, and the value it brings, is talked about less but is a crucial area to focus on.

In ESI's recent Global State of the PMO: An Analysis for 2013 study (ESI International, 2013), value was front and center and was highlighted as one of the key metrics a PMO can use to improve its impact on the organization, especially if the PMO ventures into the value of intended use; in other words, was it realized or not? For example, has the new Order to Cash system achieved its intended outcome of reducing billing time by 50% and customer service time on the phone by 65%? These example metrics were in the original business case but are rarely talked about after the project is done. The best-case scenario is usually focused on the hard metrics and celebrating that success, if you are lucky, because most IT projects have a failure rate of 30% to 60%, depending on which study you read.

A Suggested Proven Approach

Before we get into the solution to the above-mentioned challenges, I thought it would be important to leverage an approach that we have used for years to show the value, or impact, of training. I then propose that we take this approach and apply it to how we measure the value and benefit of the project itself. This approach is accurate enough, especially when the sample size is large, that we can make decisions from the output.

We all make decisions with imperfect data; yet many folks want a "Holy Grail" number for the impact of training. An approach that is accurate enough and based on sound principles answers the question while balancing qualitative and quantitative data points. Instead of looking for the one-number answer, it helps you build your case using various qualitative and quantitative data points that answer the question of value of investment (VOI), not just the one ROI number.

Most training is measured by course-end SMILE sheets (sometimes online), which assess a level 1 reaction: how were the instructor, the material, and the environment? Some organizations go to level 2 and measure knowledge transfer after training. Most stop here and are left with anecdotal information and hearsay as the main sources of evidence for the application of knowledge (level 3) and business impact (level 4). This is confirmed by many studies we have conducted; one in particular, on learning transfer (ESI International, 2013), found that global respondents' main source of proof for a positive answer to "Is this training having an impact?" is anecdotal hallway conversation!

In collaboration with one of our strategic partners, KnowledgeAdvisors, ESI customized the learning analytics tool to assess the impact of project-related training courses. We started by leveraging the same data collection point, namely course-end feedback, but instead of using SMILE sheets, we used SMART sheets, which gather much more information on levels 1, 2, 3, and 4, in addition to job impact and open-ended questions.

Before training starts, a manager sits down with the project manager to communicate how important the course is to his or her development and to the company. For the investment to be worthwhile, the project manager needs to apply what is learned. An expectation is set that candid feedback will be solicited online after the class, along with predictive estimates of the level of application and the level of impact on the job and on business results. Furthermore, the expectation is set that, 60 to 120 days after the class, the manager will come back to the project manager to assess what actually happened.

In addition, the manager will be asked to provide input on the behaviors observed in the related subject matter. All of this is captured online via the Metrics that Matter™ tool, which is a Software-as-a-Service technology solution.

The output from these data can tell a great story about the impact of training. It estimates the total impact, isolates the portion attributable to training alone, and then adjusts for bias, since these are self-reported data. A picture starts to emerge that can be colored further with the qualitative comments, especially if you can segment the data by company demographics such as business unit or geography.
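
To illustrate the isolation-and-adjustment step, the sketch below applies a commonly used self-report adjustment: the estimated impact is scaled by the share the respondent attributes to training and by his or her confidence in the estimate. This is a minimal, hypothetical example; the function name and inputs are illustrative and not the actual Metrics that Matter™ calculation.

    # Hypothetical sketch of a common self-report adjustment; the function
    # name and inputs are illustrative, not the actual Metrics that Matter(TM)
    # calculation.

    def adjusted_impact(estimated_impact_pct: float,
                        pct_attributed_to_training: float,
                        confidence_pct: float) -> float:
        """Return a conservative, bias-adjusted impact estimate (in percent)."""
        return (estimated_impact_pct
                * (pct_attributed_to_training / 100.0)
                * (confidence_pct / 100.0))

    # Example: a respondent reports a 20% performance improvement, attributes
    # 50% of it to the training, and is 70% confident in that estimate.
    print(adjusted_impact(20, 50, 70))  # -> 7.0, the adjusted improvement in percent

Aggregating such adjusted estimates across a large sample is what makes the resulting figure defensible enough to act on.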

Here are some examples (Exhibits 1 through 4):

Exhibit 1 – Make Decisions with Impact Data

Exhibit 2 – Job Impact

Exhibit 3 – Business Results

Exhibit 4 – Other Metrics

Application of Learning Measurement to Project Value Determination

In order to assist project organizations in having value discussions with sponsors, executives, and other stakeholders in the business, ESI turned to its partner KnowledgeAdvisors, which had demonstrated a practical, tactical, and credible approach to measuring the value of learning investments to the business. It was widely acknowledged that an approach that was complex, resource intensive, or not repeatable, or that did not produce actionable results, would not be adopted by project organizations. A technology-enabled approach was deemed critical, and one that could leverage existing, proven, and robust data collection, analytics, and reporting functionality would be even better.

In collaboration with industry project leaders, subject matter experts, and the KnowledgeAdvisors team, several areas were determined to be important in demonstrating project value and the achievement of outcomes to the business. Since most organizations were already tracking hard metrics, such as performance against cost and performance against schedule, in other tools, the team agreed that the overall value of a project was best told through what stakeholders reported about the project. Stakeholders would include customers, the project manager, the project team, and those in an executive sponsor role. A sample of the instrument the team developed follows (Exhibit 5):

Exhibit 5 – Sample Instrument

The team was unanimous in thinking that stakeholder insight into the achievement of outcomes and the value delivered at milestones and at the end of a project was invaluable in improving the overall performance of projects for the business. Two other areas of stakeholder insight that would contribute to improved project performance were also identified.

First, insight was gathered about stakeholders' experiences with the project management process throughout the life cycle. Second, insight would come from stakeholders' experience with the team itself; specifically, what did they observe of the project team's behaviors and competencies as it executed the project? As a result, two additional instruments were developed, completing a three-part story of a project's effectiveness: Outcomes, Process, and Competency.

Specifically, the process instrument evaluates the following areas:

  • Procurement – Were project tradeoffs (time vs. cost vs. scope) handled in accordance with stated priorities? Were milestone reviews (stage gate) held on a regular basis to ensure the project was meeting requirements?
  • Time – Was a project plan developed and updated regularly, and were project tasks completed according to the schedule?
  • Cost – Was a cost estimate developed at the project's outset and updated throughout execution? Were the budgeted project costs controlled and accounted for effectively?
  • Scope – Was the project scope detailed in the Work Breakdown Structure well defined? Were the work requirements clearly identified throughout the project? Was the project scope controlled and managed effectively?
  • Risk – Were key risks and response strategies developed at the outset, updated regularly, and managed effectively throughout the project life cycle?
  • Communications – Was a communications plan developed and used throughout the project? Did the project manager keep all stakeholders apprised of project status and performance on a regular basis?
  • Conflict – Were conflicts during the project resolved successfully?
  • Team Management – Were team members' roles and responsibilities clearly defined? Were regular progress meetings held to review the project status and deliverables?
  • Organizational Change Management – Did the project effectively integrate management of change techniques to help drive acceptance and adoption?

The competency instrument gathered insight on the experience stakeholders had with the project manager and the team in the following areas (Exhibit 6):

Exhibit 6 – Competencies of the Project Team

The three instruments evaluating Outcomes, Process, and Competency would become The Project Assessment Tool.

The tool was developed in a partnership between KnowledgeAdvisors and ESI International. The Project Assessment Tool is powered by Metrics that Matter™, a talent and development reporting and analytics system. Data are collected from project stakeholders in 360-degree fashion through automated web-based surveys, providing insight on areas for improvement. Five to 10 questions covering project manager and team member performance, project processes, and project outcomes provide quick results that highlight areas of concern and trends across a project portfolio, helping decision makers adjust plans to maximize organizational project success and deliver better project outcomes.
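
To help picture how such 360-degree survey data could be organized, here is a minimal, hypothetical data model. The class and field names are assumptions for illustration only, not the Project Assessment Tool's actual schema.

    # Hypothetical data model for a 180/360-degree project assessment.
    # All names are illustrative, not the Project Assessment Tool's schema.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Question:
        instrument: str        # "Outcomes", "Process", or "Competency"
        text: str
        scale_max: int = 7     # 1 = Strongly Disagree ... 7 = Strongly Agree

    @dataclass
    class ProjectAssessment:
        project_id: str
        stakeholder_roles: List[str] = field(default_factory=lambda: [
            "Customer", "Sponsor", "Project Manager", "Team Member"])
        questions: List[Question] = field(default_factory=list)

    assessment = ProjectAssessment(
        project_id="example-project",
        questions=[
            Question("Outcomes", "The project delivered its expected business outcomes."),
            Question("Process", "Key risks were identified and managed throughout the project."),
            Question("Competency", "The project manager kept stakeholders apprised of status."),
        ],
    )

Keeping the three instruments in one structure is what allows the same survey run to feed the outcome, process, and competency views described above.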

Early Results from Project Data

In the months since the Project Assessment Tool was developed and released, 146 projects have been evaluated to consider the following three areas:

  1. Did the project deliver against the expected outcomes of its stakeholders? Did the project deliver the desired benefits to the organization?
  2. How did the project team do in delivering the project against the Process Areas in A Guide to the Project Management Body of Knowledge (PMBOK® Guide) (in the eyes of the stakeholders)? Are the organization's project management tools adequate in communicating the project metrics to stakeholders?
  3. How did the project manager and project team members perform against competencies required to execute the project successfully?
  • Customer Service
  • Industry Acumen
  • Personal Effectiveness
  • Subject Matter Expertise
  • Communications
  • Leadership Effectiveness

The projects cut across several industries and ranged from small facilities and construction projects to milestone reviews of large systems integration programs. ESI analyzed the feedback from 463 stakeholders. The results were evaluated on a Likert-type scale ranging from 1 to 7, with 7 representing Strongly Agree and 1 representing Strongly Disagree. The findings are detailed below.
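
As context for the exhibits that follow, here is a minimal sketch of how such 1-to-7 ratings might be rolled up into a "percent very satisfied" figure. The threshold of 6 or above for "very satisfied" is an assumption for illustration and is not stated in the study.

    # Minimal aggregation sketch: percent of 1-7 ratings at 6 or above, by area.
    # The 6-or-above threshold for "very satisfied" is an assumption.
    from collections import defaultdict
    from typing import Dict, Iterable, Tuple

    def top_box_pct(responses: Iterable[Tuple[str, int]]) -> Dict[str, float]:
        totals: Dict[str, int] = defaultdict(int)
        top: Dict[str, int] = defaultdict(int)
        for area, rating in responses:
            totals[area] += 1
            if rating >= 6:
                top[area] += 1
        return {area: 100.0 * top[area] / totals[area] for area in totals}

    sample = [("Delivered expected outcomes", 7), ("Delivered expected outcomes", 5),
              ("Overall satisfaction", 6), ("Overall satisfaction", 6)]
    print(top_box_pct(sample))
    # {'Delivered expected outcomes': 50.0, 'Overall satisfaction': 100.0}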

Exhibit 7 presents 463 results detailing stakeholders' perceptions of whether their projects delivered against expected outcomes, along with overall satisfaction with project performance.

Exhibit 7 – Results

The following business impact (Exhibit 8) was reported by stakeholders for the project investments. (Percentages represent the proportion of stakeholders who selected that business result as a project impact on the organization.)

Exhibit 8 – Business Impact for Projects

Two hundred and twenty-six stakeholders’ perspectives were gathered on their projects’ adherence to the PMBOK® Guide's Process Areas. Note that not every project stakeholder was asked to provide feedback on Process Areas. Some customers and sponsors were asked only to provide feedback on outcomes and competencies of the team for their project (Exhibit 9). This is a setting in the Project Assessment Tool and is unique for each project.

Exhibit 9 – Competencies

Eight hundred and seventy stakeholders provided perspectives and insight on their experiences with the project team (Exhibit 10). Specifically, they rated the project manager and the project team members in six competencies that are required to successfully execute project work. Note there are more results than in the previous sections because project team members complete more than one competency assessment for the project — one on themselves, one for the project manager, and additional feedback for other members on the team.

Exhibit 10 – Percentage of Stakeholders Very Satisfied with Team Performance in Competency Area

More interesting than the overall percentage of stakeholders who were very satisfied with team performance by competency area was the team's analysis of the delta between an individual's perception of his or her own behavior on a project and the perception of each stakeholder, by role, on that project. The analysis compared what an individual perceived about his or her behavior with what other stakeholders on the project, namely direct reports, managers, and peers, perceived about the individual.

Table 1 details the percentage delta between what individuals said about themselves for each competency area compared with what other stakeholders said.

Table 1 – Individual vs. stakeholder feedback

Competency to Successfully Execute Project | Manager Delta from "Self" | Peer Delta from "Self" | Direct Report Delta from "Self" | Customer Delta from "Self"
Communication                              | -2% | -10% | -6% | -17%
Customer Focus                             |  4% |  -3% |  0% |  -7%
Personal Effectiveness                     |  0% | -10% | -5% | -16%
Recommendation                             | -1% | -10% | -4% |  -9%
Process Expertise                          |  2% |  -9% | -6% | -17%
Team Leadership                            |  1% |  -9% | -4% | -13%
Organization & Industry Acumen             |  2% |  -8% | -6% | -16%

Also of interest was that only 66% of stakeholders strongly agreed when they were asked whether they would recommend the project manager or one of the project team members to participate in future projects.
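
To make the Table 1 calculation concrete, the sketch below computes, for each competency, the difference between the "very satisfied" percentage reported by a given rater role and the percentage from self-ratings. This is one plausible reading of the published deltas; the role names, threshold, and rounding are assumptions for illustration, not the study's documented method.

    # Hypothetical reconstruction of the Table 1 delta: top-box percentage by
    # rater role minus the top-box percentage from self-ratings, in points.
    # Assumes every competency has at least one "Self" rating.
    from collections import defaultdict
    from typing import Dict, Iterable, Tuple

    Rating = Tuple[str, str, int]  # (competency, rater_role, rating on a 1-7 scale)

    def deltas_from_self(ratings: Iterable[Rating]) -> Dict[Tuple[str, str], float]:
        totals: Dict[Tuple[str, str], int] = defaultdict(int)
        top: Dict[Tuple[str, str], int] = defaultdict(int)
        for competency, role, score in ratings:
            totals[(competency, role)] += 1
            if score >= 6:                      # assumed "very satisfied" threshold
                top[(competency, role)] += 1
        pct = {key: 100.0 * top[key] / totals[key] for key in totals}
        return {(competency, role): round(p - pct[(competency, "Self")], 1)
                for (competency, role), p in pct.items() if role != "Self"}

    sample = [("Communication", "Self", 7), ("Communication", "Self", 6),
              ("Communication", "Customer", 6), ("Communication", "Customer", 4)]
    print(deltas_from_self(sample))  # {('Communication', 'Customer'): -50.0}

A negative delta, as in most of Table 1, means other stakeholders rated the individual less favorably than the individual rated himself or herself.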

How Is This Insight Being Used by Organizations?

Organizations are already using the results gleaned from these data. Following are some of the ways the data are being used:

  1. Prioritization of Professional Development Investment: What competency and process improvement do I need to invest in first?
  2. Improved credibility of the PMO: "Shameless" self-promotion of the business impact and quality of project work in the organization, and improved conversations with sponsors and customers
  3. More meaningful conversations around desired business results for project investments and expected outcomes for project work
  4. Data and results to tell a more interesting story of project performance to the business: A more complete project performance picture, which includes a balance of hard metrics and stakeholder satisfaction with outcomes, process and the team.

The Next Steps

The approach and the tool are good practices. Further value can be gleaned by customizing the survey to company-specific terminology and business results. Furthermore, capturing the demographics of various projects, and even of project managers, helps in analyzing the data by comparing like types of projects and project managers at similar levels.

Lastly, the data here are mostly from completed projects. Applying this approach from the start of a project, aligning it with the project's business case, and collecting data multiple times to keep a finger on the pulse of impact would be the areas to focus on next.

References

ESI International. (2013). Global state of the PMO: An analysis for 2013.

ESI International. (2013). ESI research report: Applying training and transferring learning in the workplace: How to turn hope into reality.

This material has been reproduced with the permission of the copyright owner. Unauthorized reproduction of this material is strictly prohibited. For permission to reproduce this material, please contact PMI or any listed author.

© 2013, Raed Haddad
Originally published as a part of 2013 PMI Global Congress Proceedings – New Orleans, Louisiana
