Project Management Institute

Project risk analysis: How ignoring it will lead to project failures


Lev Virine, Ph.D.

Intaver Institute Inc.

Multiple risks can affect projects in ways that are difficult to forecast. Supplies can arrive late or be of unacceptable quality, resources can be lost or unavailable, and budgets can be reduced. When faced with these types of risks and uncertainties, managers often resort to gut feeling or intuition rather than performing the detailed analysis that might improve their understanding of how these risks and uncertainties could affect their project goals. Using gut feeling or intuition as the basis for decision making is subject to our cognitive biases and prone to illusions, which can cause errors. These mental errors are predictable and will often lead to poor estimates for cost and schedule, resource planning, and other management decisions. The cumulative effect of these poor-quality decisions may eventually have an adverse effect on a project. To avoid this situation, projects require a well-defined decision and risk analysis process. This paper describes why risk and decision analysis is so important to successfully managing projects.


For several decades people have been trying to build a fusion reactor to create electric power. While there has been significant progress in the research, the goal of a commercial fusion reactor remains a dream. Many groups, including the European Union, Japan, China, Russia, Republic of Korea, and the USA, planned to invest $5 billion in the ITER project starting in 2005 (ITER, 2014). The goal of the ITER project was to create a fusion reactor that could be used to generate electric power. In June 2005, this group decided that a 500 MW reactor would be built at Cadarache, in the South of France, with the first plasma ITER reactions planned at the end of 2016. Construction of the ITER began in 2007, but the project ran into many delays and cost overruns. Currently, the facility is expected to be completed in 2019, with the plasma experiments to start shortly afterwards. The reactor is not expected to begin full deuterium-tritium fusion until 2027.

After the early success of space missions -- the first satellite launch in 1957 and the first manned space flight in 1961 -- people foresaw the imminent expansion of human colonies on the Moon and beyond. In retrospect, it all appears a bit optimistic: in the following 50 years, with the exception of unmanned missions, humanity has not made any more significant strides in space exploration. In fact, the design of Orion, a new generation of launch vehicle, is conceptually very similar to what had been proposed 50 years ago (Berger, 2006).

The time of amateur tinkerers working away in their garages, or individual geniuses like Tesla performing experiments in private facilities and developing state-of-the-art technology, has gone. Technical innovations generally require large investments and long development periods. When engineers cannot solve a technical problem, it is often attributed to a lack of technical maturity, or to insufficient time and resources invested in the issue. Given that resources are by definition finite and scarce, where can we find the additional resources to solve these problems? Can they be found in the existing budgets of our governments or private companies? Private companies always focus on profits, which can come from technological innovation. However, even companies that have the resources required for research and development must use them very efficiently.

While resources will always be limited, we exacerbate resource scarcity through poor decision-making. Effective risk and decision analysis processes can reduce the impact of poor decisions and free up additional resources that would be wasted otherwise. If decision makers in business and government can learn and practice proper risk analysis processes this alone would lead to an acceleration in technology innovation and productivity.

Predictable Mental Mistakes

Illusions and their knock-on effects are a part of life. Individuals spend their entire lives acquiring material wealth under the illusion that it is the path to happiness. Leaders start wars believing they will be better off, only to make their lives worse. What is remarkable is that most people are aware that these are illusions, yet they are still entrapped by them. These illusions are predictable mental mistakes. Project managers underestimate the cost and duration of projects, employ improper resources, or do not perform risk management. This is not because they intentionally want to damage their project, but because they are making common mental mistakes, which can be avoided through proper analysis.

The field of project management is not left unscathed by our predilection for forming illusions. In December 2009, people in Norway witnessed a series of mysterious spiraling blue lights that sparked rumors of UFOs. The spirals were quite spectacular and left thousands of residents in the north of the country baffled. Unfortunately, it was not a UFO (as that would be very exciting); it was the fallout from a failed test of the Bulava, Russia's new submarine-based intercontinental missile. More remarkably, this missile failure was the result of a different kind of illusion: a project illusion. Despite enormous efforts by Russian engineers, the missile was not operational for many years and failed seven of thirteen tests in a five-year period beginning in 2004 and ending with the spectacular light show over Norway. In an even more spectacular illusion, the nuclear submarines built to carry and launch these missiles were already operational; however, without the missiles they are just extremely expensive submersibles. Experts believe that the primary reason behind the Bulava failures was the inability of Russian defense contractors to manage the complex projects required to produce these types of weapons. Projects in many industries have grown exponentially in complexity over the last few decades. Projects include multiple external contractors and suppliers, often in different countries; complex designs; and a growing number of risks, including financial, quality, public relations, and environmental risks. Even with all of this added complexity, many project managers continue making mental mistakes, which cause costly damage.

According to Virine and Trumper (2013), a few of the most common mental errors in project management are:

  • Availability Heuristic – people make judgments about the probability of the occurrence of events by how easily these events are brought to mind. The availability heuristic has inherent biases:
    • Illusory correlations – people overestimate the frequency with which two events occur together. If a project manager analyzes the relationship between two or more parameters (e.g., the geographic location of a supplier related to the quality of their product), the assessment could be wrong.
    • Vividness – people more easily recall events that are unusual, rare, vivid, or associated with other events such as major issues, successes, or failures. Ease of recall may not correspond to the actual probability of the event. In project management, assessments of probabilities for project events, including risks, can therefore be wrong.
  • Anchoring Heuristic – people rely too heavily on one piece of information (the "anchor") when making decisions. Biases related to the anchoring heuristic are:
    • Overconfidence in estimation of probabilities – people are often overly optimistic with their estimates of uncertain events. People often set ranges of probability too low and remain overconfident that these ranges will include true values. Overconfidence is most likely to occur after a series of project successes and can lead to risk taking.
    • Insufficient adjustment – people “anchor” on a current value and make insufficient adjustments for future effects. Project managers do not make sufficient adjustments after producing an initial estimate; for example, an estimate of an activity's duration or cost.
    • Overestimating the probability of conjunctive events – if an event is composed of a number of elementary events, the probabilities of the elementary events must be multiplied to determine the probability of the main event. For example, suppose the probability of success for a particular activity is 80%. If a project requires three such activities to succeed, the probability of overall success is only 51.2% (0.8 × 0.8 × 0.8), yet people tend to estimate something much closer to 80%.
  • Selective Perception – people's expectations affect their perception. Sometimes selective perception is referred to as “What I see is what I want to see.” Selective perception biases are:
    • Confirmation Bias – people actively seek out and assign more weight to evidence that confirms their hypothesis, and ignore or underweigh evidence that could discount their hypothesis (Wason, 1960).
    • Premature termination of search for evidence – people often accept the first or one of the first explanations of an event. If there is a project failure, project managers often stop looking for root causes after finding a simple explanation. This bias is related to failure to consider alternatives.
    • Professional viewpoint effect – people look at something according to the conventions of a profession, forgetting any broader point of view. For example, project engineers may look at a project from an engineering point of view and disregard project management methodologies and tools.
  • Representativeness Heuristic – people estimate probability by judging how representative the object, person or event is of a category, group, or a process. Representativeness heuristic biases are:
    • Conjunction fallacy – an unwarranted appeal to more detailed scenarios, which can lead to a “preference for details.” If a project manager must select one project from a number of proposals, he or she may tend to pick the proposals with the most detail, even though they may not have the best chance of success.
    • Ignoring base-rate frequencies – people ignore prior statistical information (base-rate frequencies) when making assessments about probabilities. For example, when determining the probability that a new component is defective, a project manager may make estimates based on recent testing in which most components were defective; however, he or she may ignore the fact that historically 99% of the components from this supplier have been problem-free.
    • Ignoring regression to the mean – people expect extreme events to be followed by similar extreme events. In reality, extreme events are most likely to be followed by an extreme in the opposite direction or by an average event. Project managers should not expect extraordinary performances from a team or individuals on every project, because of regression to the mean, or the tendency toward the average.
  • Recognition Heuristic – when people select among alternatives, they tend to choose the item they recognize. The recognized item is considered to have a higher criterion value (Goldstein & Gigerenzer, 1999). Project managers select project alternatives that are familiar to them, but not necessarily the best alternatives.
  • Illusion of Control – people believe that they are in control of a situation when in fact they are not. For example, when rolling dice in craps, people tend to throw harder for high numbers and softer for low numbers. The illusion of control can cause unrealistic optimism in project managers.
  • Loss Aversion – people prefer avoiding losses to acquiring gains (Kahneman & Tversky, 1979). In other words, people are willing to take more risks to avoid losing something. In project management, loss aversion is associated with risk aversion and risk tolerance when decision-makers evaluate possible project gains and losses.
  • Optimism Bias – people are usually over-optimistic about the outcome of planned actions. Project managers often overestimate the probability of a successful project completion and underestimate the probability of negative events or risks. The optimism bias is also related to wishful thinking and planning fallacy.
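The conjunctive-events arithmetic described above can be sketched in a few lines of Python. This is a minimal illustration, not part of the original paper; it uses the 80% per-activity success figure from the example and assumes the activities are independent:

```python
# Probability that a chain of activities all succeed: multiply the
# individual success probabilities (assumes the activities are independent).

def chain_success_probability(probabilities):
    """Return the probability that every activity in the chain succeeds."""
    result = 1.0
    for p in probabilities:
        result *= p
    return result

# Three activities, each with an 80% chance of success:
overall = chain_success_probability([0.8, 0.8, 0.8])
print(f"{overall:.1%}")  # prints 51.2% -- well below the intuitive 80% anchor
```

Even with only three activities, the overall chance of success drops by almost 30 percentage points, which is exactly the adjustment that anchored estimators fail to make.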

The world has changed, but people continue, and will continue, to make predictable and repeatable mental mistakes. It is the way we are hardwired. These mental mistakes and illusions impose significant burdens on everyone. Here are our suggestions to lessen those burdens:

  • Learn about these mental mistakes or illusions; doing so will help you recognize and potentially mitigate them.
  • Perform a structured analysis of project information whenever possible.
  • Perform risk analysis; it is a critical step in mitigating the negative effects of mental errors.

Is Risk and Decision Analysis A Solution for Your Project?

In 1996, NASA selected Lockheed Martin to design, build, and fly its X-33 Advanced Technology Demonstrator test vehicle (NASA, 2001). With a goal of improving the reliability and safety of the space program and reducing the cost of launching satellites into orbit to one-tenth of the cost of using shuttles or conventional rockets, the X-33 was designed to be a reusable spacecraft capable of reaching orbit without boosters or dropping rocket engines and external fuel tanks. The X-33 was intended to act as a platform to test new technologies for a commercial single-stage-to-orbit (SSTO) launch vehicle, including a new type of rocket engine (the linear aerospike engine), composite cryogenic fuel tanks, unmanned flight control, and lifting-body aerodynamics.

The construction of the X-33 was almost complete, with the avionics bay, reaction control system, thruster controller, and landing gear installed; however, the X-33 project was cancelled in 2001. What happened? In November 1999, the composite liquid hydrogen fuel tank failed during testing. The investigation that followed discovered that the composite technology used in the fuel tank was not mature enough to meet the objectives of the program. In response, Lockheed Martin proposed to complete the development of the X-33 by replacing its two composite liquid hydrogen tanks with aluminum tanks. But NASA concluded that the benefits of testing the X-33 in flight did not justify the cost, as the X-33 would not be able to reach space with aluminum tanks. NASA's investment in the X-33 program totaled $912 million, which was within its 1996 budget projection for the program. Lockheed Martin originally committed to invest $212 million in the X-33, and during the life of the program increased that amount to $357 million. Technical problems prevented one of the most ambitious technological projects in decades from achieving its goals. The X-33 story illustrates a number of misconceptions related to the risk and decision analysis process.

Misconception #1: The risk and decision analysis process failed because it did not result in a successful project

Project failure is not a sign that NASA's risk and decision analysis process failed. NASA and Lockheed Martin took a calculated risk when they invested in new, unproven technologies. They built a smaller and less expensive test vehicle to mitigate some of the risks, but they also accepted many risks, as without risk, research and development would be impossible. In these cases, risk and decision analysis helped to reduce irrational project decisions and mitigate their negative outcomes.

Misconception #2: Risk and decision analysis is primarily a bureaucratic exercise

Everything can devolve into a bureaucratic procedure. For example, purchasing groceries used to be a quick cash transaction. Now, buying your groceries involves your credit card, your loyalty card, cards from affiliated loyalty programs, a gift certificate, coupons from the store, and so on. In return, the cashier gives you a receipt, your credit card receipt, and some coupons. What used to be a quick exchange of goods for cash is now a web of mind-boggling transactions.

Risk and decision analysis does not necessarily require additional administrative overhead. Rather, it should be considered a way of thinking. It should be as simple as possible while meeting the particular needs of your organization and your project. For example, if you do not think that quantitative analysis is going to add any benefit to your project, no problem: do not do it.

Remember that the main ideas of risk and decision analysis are:

  • Identify risks which may affect your project;
  • Analyze your project and determine which risks would have the greatest effect on it;
  • Determine workable project alternatives;
  • Identify which alternative will bring the most value; and
  • Select a course of action and monitor it during execution.
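As a minimal sketch of the first two of these steps, a simple probability-times-impact score can rank identified risks. This is an illustrative assumption, not a method from the paper; the risk names and numbers below are hypothetical, drawn from the kinds of risks mentioned earlier (late supplies, unavailable resources, budget cuts):

```python
# Rank identified risks by expected schedule impact (probability x impact).
# All risk names and numbers below are hypothetical illustrations.

risks = [
    {"name": "Supplies arrive late",       "probability": 0.30, "impact_days": 20},
    {"name": "Key resource unavailable",   "probability": 0.10, "impact_days": 45},
    {"name": "Budget reduced mid-project", "probability": 0.05, "impact_days": 60},
]

def rank_risks(risks):
    """Sort risks by expected schedule impact, largest first."""
    return sorted(
        risks,
        key=lambda r: r["probability"] * r["impact_days"],
        reverse=True,
    )

for r in rank_risks(risks):
    score = r["probability"] * r["impact_days"]
    print(f'{r["name"]}: expected impact {score:.1f} days')
```

Note that a low-probability risk with a large impact can still rank below a frequent, moderate one; scoring makes that trade-off explicit instead of leaving it to intuition.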

Most organizations already do this in one way or another. Unfortunately, sometimes a simple and logical initiative expands to an absolutely irrational level. For example, one way to collect and process information about alternatives for decision analysis is to create a business case. Imagine being in a company and requesting a small, inexpensive scanner to support your activities. In the best case, it would be quickly approved and would be in place to help complete the project. However, in some organizations, you may be required to develop a business case to support line management's approval of the purchase. By the time the good news filters back that the purchase has been approved, the project is finished, and thousands of dollars and many hours have been spent on procuring a small scanner.

Companies invest huge amounts of resources establishing business processes covering software, training, consulting, and so on. In most cases, it is a good investment for the right causes. Sometimes, however, processes do not work as planned and create a bureaucracy that leads to more spending. For example, Shell spent roughly $1.5 billion implementing a SAP enterprise resource planning (ERP) system (Booth, 2005), and the jury is still out on whether this was a fruitful investment. If this occurs, it is important to freeze the implementation and make the necessary adjustments, or, if things are really dire, scrap the process altogether. This is similar to an investment in the stock market: if you see a stock tumbling, do not wait until it hits bottom; sell it now and limit your losses.

Misconception #3: Organizations require mature project management processes to benefit from decision and risk analysis.

Any company will benefit from a risk and decision analysis process, in the same way that any person would benefit from having a logical and repeatable process for making important decisions. For example, if you receive a bonus, you have a few options:

  • a)  pay down your mortgage (apparently, this is not a very popular alternative);
  • b)  buy ten new pairs of shoes from Oscar de la Renta; or,
  • c)  gamble everything at the local casino.

You will select an alternative based on your preferences (risk profile) and on what alternative will give you maximum value. The point is that you do not need to hire a consultant, create a sophisticated mathematical model, and produce a large amount of paperwork to make this decision.

To establish a risk and decision analysis process in your organization, proceed step by step. Create common success criteria that can be applied to all of your projects. Identify who will evaluate alternatives and how. Once your team feels comfortable with the process and you are seeing some benefits, you may add additional steps or layers to the analysis: modeling, quantitative analysis, and so on. When you are comfortable and happy with your process, you may then want to invest in additional tools that will help you with your project risk and decision analysis.


Human mental errors are predictable and will often lead to poor decisions in project management, including cost, schedule, and resource planning. Such errors include the availability, anchoring, representativeness, and recognition heuristics; selective perception; the illusion of control; loss aversion; optimism bias; and many others. Performing proper project decision and risk analysis processes may mitigate the negative impact of these mental errors, while ignoring decision and risk analysis may lead to costly project failures. Some managers believe that proper decision and risk analysis processes are bureaucratic exercises: too complex and unnecessary. However, the experience of leading organizations demonstrates that proper decision and risk analysis processes can be very beneficial in project management.

Berger, B. (2006, July 26). Lockheed Martin to Build NASA's Orion Spaceship. Retrieved from

Booth, N. (2005, July 25). A Business Need For Better Intelligence. CRB. Retrieved from

Goldstein, D. G., & Gigerenzer, G. (1999). The recognition heuristic: How ignorance makes us smart. In G. Gigerenzer & P. M. Todd (Eds.), Simple heuristics that make us smart. Oxford: Oxford University Press.

ITER. (2014). The ITER Project. Retrieved from

Kahneman, D., & Tversky, A. (1979). Prospect theory: An analysis of decision under risk. Econometrica, 47, 263–291.

NASA. (2001, March 1). NASA Reaches Milestone in Space Launch Initiative Program; Also Announces No SLI Funding for X-33 or X-34 [Press release]. Retrieved July 26, 2014.

Virine, L., & Trumper, M. (2007). Project Decisions: The Art and Science. Vienna, VA: Management Concepts.

Virine, L., & Trumper, M. (2013). ProjectThink: Why Good Managers Make Poor Project Choices. Farnham, UK: Gower.

Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12, 129–140.

This material has been reproduced with the permission of the copyright owner. Unauthorized reproduction of this material is strictly prohibited. For permission to reproduce this material, please contact PMI or any listed author.

© 2013, Lev Virine
Originally published as a part of the 2014 PMI Global Congress Proceedings Phoenix, Arizona, USA


