Abstract
Projects are always affected by multiple events whose impact is difficult to predict. What if tomorrow you do not have enough resources to complete your project? What if one month from now the cost of supplies increases? What if next quarter you do not receive enough funding for your project? When managing projects with risks and uncertainties, project managers often rely on intuition rather than logic and comprehensive analysis to predict the effects of these types of events. Intuitive thinking is often subject to illusions, which can cause mental errors. These predictable mental errors, such as overconfidence, confirmation traps, optimism bias, zero-risk bias, and the sunk-cost effect, often lead to the underestimation of costs and effort, poor resource planning, and other low-quality decisions. The cumulative effect of these decisions may eventually derail a project. Risk engineering provides a way to balance the effects of these psychological illusions. The foundation of risk engineering is quantitative and qualitative risk analysis. Project risk analysis can be complex, but this presentation focuses on a few simple techniques that help demystify it.
Introduction
Starting around 1995, a number of large computer companies, including Oracle and IBM, were involved in an ambitious project. They were trying to develop and market a range of diskless desktop computer devices, which Oracle called a “network computer” (NC). The idea was quite revolutionary: if computers were mostly used to connect to the Internet, they would not require a very powerful processor, a CD-ROM, or even a hard drive. Such computers could be much cheaper than regular desktop computers at the time; they could be priced at less than $1000. Moreover, because the software was installed on the server rather than the NC, the user would not be required to maintain and upgrade it. Customers could have a computer that met all of their needs, for a fraction of the cost. Despite its initial promise, the idea failed to materialize and NCs were not sold in significant quantities (Roth, 2009). The reasons were varied: the price of a regular desktop computer fell below $1000, increasing market competition, and the software suitable for NCs was not mature, despite heavy investments by some software companies in Internet versions of their products. In other words, for the NC project to succeed, at least four conditions had to be met:
- The price of the regular desktop computer should remain greater than $2000 for quite a while – a 50% probability
- Internet infrastructure should be able to handle the high traffic necessary to support software for NC computers – a 50% probability
- Internet-based software should be widely available – a 50% probability
- Finally, people should like the product – a 50% probability
What is the total chance of project success? The answer is 50% × 50% × 50% × 50% = 6.25%. Apparently, the executives at Oracle and IBM significantly overestimated the chance of project success; essentially, these decision makers' view was built on illusions.
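The compounding is easy to reproduce. Here is a minimal sketch in Python, using the paper's own 50% figures and assuming, as the example does, that the four conditions are independent:

```python
# Joint probability of success when every condition must hold and the
# conditions are assumed independent (a simplifying assumption).
condition_probabilities = [0.50, 0.50, 0.50, 0.50]  # desktop prices, infrastructure, software, demand

success = 1.0
for p in condition_probabilities:
    success *= p

print(f"Chance of project success: {success:.2%}")  # prints 6.25%
```

Note how quickly the odds decay: every additional 50% condition halves the chance of success, which is exactly why plans that depend on a long chain of favorable outcomes are far riskier than they feel.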
Illusions in Project Management
We are subject to illusions everywhere. Everything we see, hear, touch, taste, and smell can be misinterpreted, and our ability to manage projects is no exception. In project management, the consequences of irrational choices made under the influence of illusions are failed projects. Here are a few classic, textbook cases of failed projects (Hall, 2005):
- 1991: an inaccurate structural analysis for the Sleipner North Sea Oil Platform led to the loss of the platform at a cost of one billion dollars.
- 1995: cost and schedule overruns in the development of the Denver Airport baggage handling system prevented the airport from opening on time. Fixing the extremely bug-riddled system required an additional 50% of the original budget, nearly $200 million. Confirming that people don't learn from previous mistakes, in 2008 a very similar project at the new terminal in Heathrow Airport suffered the same fate: hundreds of flights were cancelled when the baggage system malfunctioned (BBC News, 2008).
- 1996–1999: several major space exploration projects, including the Mars Polar Lander, Mars Climate Orbiter, and Ariane 5 European Space Launcher, were lost because of various errors.
- 2003: a software bug was determined to be a major contributor to the 2003 Northeast blackout, the worst power system failure in North American history. The failure involved the loss of electrical power to 50 million customers, and economic losses were estimated at $6 billion.
Researchers who study such projects have found that the main underlying reason for these failures is not earthquakes, pine beetle infestations, floods, or other external factors, which are hard to either predict or avoid. Most projects fail because of errors in judgment.
Here are a few of the most common illusions in project management (Virine & Trumper, 2007):
- Illusion of Control – The tendency of decision makers to believe they can control or influence outcomes over which they have no influence. For example, when rolling dice in craps, people tend to throw harder for high numbers and more softly for low numbers. Sometimes project managers plan projects under the assumption that they can control most processes, when in reality they cannot.
- Confirmation Bias – The tendency of decision makers to seek out and assign more weight to evidence that confirms a hypothesis and to ignore or give less weight to evidence that could discount the hypothesis. This can lead to statistical errors (Wason, 1960). This illusion is related to estimations and evaluations of alternatives in project management.
- Failure to Consider Alternatives – The tendency to evaluate and consider only a single course of action. It occurs when project managers attempt to reduce effort during the evaluation of alternatives. It is often the result of having sufficient information about one particular suggested course of action and insufficient information about alternatives.
- Underestimating Long-Term Rewards and Overestimating Short-Term Rewards – This illusion is very common in project planning.
- Focusing Effect – An illusion that occurs when decision makers place too much emphasis on one aspect of an event or process. For example, a software project manager believes the software's quality is associated only with the number of software defects. In reality, the notion of software quality, in addition to the quality of the software code, involves the quality of the documentation, user interface, packaging, and support.
- Zero-Risk Illusion – The preference for reducing a small risk to zero over a greater reduction in a larger risk. Individuals may prefer small benefits that are certain to large ones that are uncertain. Project managers sometimes prefer to avoid a small risk completely rather than significantly mitigate a larger one.
- Premature Termination of Search for Evidence – The tendency to accept the first alternative that looks like it might work.
- Sunk-Cost Effect – The tendency to make a choice based on costs that have already been incurred and cannot be recovered (sunk costs). Sunk costs affect decisions because of the loss-aversion effect. They may cause cost overruns and may also lead to continued investment in a project that no longer has any value.
- Overconfidence – A tendency to provide overly optimistic estimates of uncertain events. Decision makers tend to set the ranges of probability too low and remain overconfident that these ranges will include true values. Overconfidence is most likely to surface after a series of project successes and it can lead to risk-taking. This is what might have happened with the executives of Oracle, IBM, and other companies when they made a decision to start the NC computer project.
- Optimism Bias – The tendency to be overly optimistic about the outcome of planned actions. This illusion manifests itself in project planning and forecasting. Project managers often overestimate the probability of successful project completion and underestimate the probability of negative events.
Roadmap to Project Failures
Every year, illusions in project management lead to multi-billion dollar losses. A 2002 study, commissioned by the National Institute of Standards and Technology, found that software bugs cost the U.S. economy about $59.5 billion annually, or 0.6% of the gross domestic product (NIST, 2002). The same study found that more than one third of that cost (about $22.2 billion) could be eliminated by improved testing. These bugs are not created by nature: animals, volcanoes, and geysers do not develop software; the problems are caused by the faulty judgment of people.
Here is a typical roadmap to project failure (Exhibit 1):

Exhibit 1– Roadmap to Project Failure
- You have a situation in a project that will require making some choices. Here is an example: the once-respected financier Bernard Madoff was engaged in a large project in the financial industry and created a pyramid scheme to profit at the expense of his investors (Arvedlund, 2009).
- You often have to deal with illusions. In this case, Madoff believed that his successful financial dealings would continue indefinitely, which is a very common illusion. In project management, we are often too optimistic, believing that our employment in a company will continue until retirement, or that our next project will be on budget because the last one was; in reality, these are only illusions.
- These illusions lead to irrational choices. Question: What is the most important decision for a professional thief? Answer: to escape when the proper time comes. However, because of his illusions, instead of winding down his pyramid scheme (whether that was even possible, we don't know), Madoff hung on until it collapsed, an irrational choice.
- Irrational choices lead to major project problems or project failures. Madoff can now look forward to ending his days behind bars. As projects go, this one failed quite spectacularly and harmed just about everyone involved.
Illusions or Intentions
It is important to distinguish between illusions, or mental errors, and intentions. Danish researcher Bent Flyvbjerg and his colleagues reviewed a significant number of large projects (Flyvbjerg, 2005). Among them were large transportation projects, such as the Skytrain in Bangkok, the Channel Tunnel, and the Los Angeles subway; defense projects, such as the Eurofighter military jet, the Nimrod maritime patrol plane, and the F/A-22 fighter jet; oil and gas projects, such as Sakhalin-1; and construction projects, such as the Hannover Expo 2000, the Scottish parliament building, Ontario's Pickering nuclear plant, and many others all over the world. Flyvbjerg also talked directly with people who were involved in the politics of megaprojects, such as the famous architects Frank Gehry and Kim Utzon. What all these projects had in common is that they were significantly over budget and often took much longer than originally planned. For example, construction of the Channel Tunnel between the United Kingdom and France came in 80% over budget.
Flyvbjerg found that project planners often intentionally underestimate costs and overestimate benefits to get their projects approved. He studied data from the past 70 years and found that cost overruns have not decreased over time. This intentional “manipulation of the books” is pernicious, not only because it leads to cost overruns, but also because it creates safety, security, and other problems.
So, what is the main reason for human mistakes in project management: honest mental errors caused by illusions, or what Flyvbjerg refers to as deception, that is, deliberate errors in project planning, forecasting, and execution? Flyvbjerg's answer is that it depends on the project (Flyvbjerg, 2006). In large projects and megaprojects, in which political and organizational pressures are very high, deception plays a key role; in smaller projects, in which these pressures are limited, illusions play the greater role.
But here is one important thought about deception. People who engage in deception are mostly motivated by a belief that in the long run it will benefit society, as in the case of many transportation projects; benefit their company, as in the case of Enron or WorldCom; or benefit themselves. In almost all cases, these beliefs are also an illusion. Projects that are approved based on fraudulent forecasts will, at the end of the day, be a net loss to society. If you create a fake report or tell your manager you are performing tests when in reality you are researching your picks for next week's fantasy football pool, you may be safe in the short term; nonetheless, this is an illusion, because eventually you will need to deal with the problems you created.
Risk Analysis: Make It Simple
Structured analysis of the situation, and particularly risk analysis, helps people overcome illusions and can improve their judgment. However, more likely than not, people either perform no structured analysis before making a decision or misinterpret its results. Complicating matters, the analysis is sometimes extremely complex and its results may be incorrect. Even when the analysis is performed and is correct, people often do not realize its value. As a result, even with highly trained experts who have access to powerful computers running the most advanced mathematical models, we still bear witness to the outcomes of so many poor decisions.
A couple of years ago, we participated in a risk management conference for the aerospace industry. One of the presentations, titled “Risk Management for Human Space Exploration,” drew an especially large crowd. A couple of hundred engineers, researchers, and students gathered to learn about managing space exploration risk from a representative of one of the largest aerospace organizations. However, the presentation did not cover the risks associated with hostile aliens, deadly space debris, or flying too close to black holes; instead, attendees were presented with descriptions of the multiple regulations, procedures, directives, rules, and other documents that regulate risk management in these organizations. It was mind-boggling to see how many documents one organization had created for what is really quite a narrow subject; it probably took at least 12 years to write them. Attempting to actually read these documents would be a truly Sisyphean task, undertaken only at the risk of one's sanity. The mere presentation of an extremely compressed version of these documents caused mass lethargy in the audience. In fact, the presenter himself almost seemed to take on the persona of a hypnotist, droning on and on, seemingly intent on putting the crowd in a trance, which may well have happened, for when the presenter finished and the lights came back on, it was as if the hypnotist had snapped his fingers to bring his subjects out of hypnosis. People wandered out of the presentation with a slightly mystified look, unable to recall many details of the past hour. This is truly not the effect you hope to achieve when you discuss risk processes.
In reality, especially when you are trying to establish a risk management process, the process should be relatively simple. Choice engineering should be a main foundation of your risk management processes. Along these lines, you should first look to establish a few unobtrusive procedures that will steer people toward making better decisions regarding risk.
Consider these three issues:
- What events might occur during your project and what would their impacts be?
- What is the probability that they will occur?
- What can you do to either minimize or take advantage of these events?
Many problems occur in projects because, for one reason or another, people fail to ask these questions. When something happens during a project and causes a major problem, and you ask why it happened, most project managers, if they were honest, would say, “We just did not think about it.” Risk management guidelines, procedures, and regulations often hide the most important thing about risk management: it is a thinking exercise. So, start with these three questions. Later on, when you are more confident with the exercise, you can begin asking a few more, such as “What triggered or caused this risk?” and “What is the cost of the risk if it occurs?” This process constitutes qualitative risk analysis; a more detailed statistical analysis, based on your project schedule, is referred to as quantitative risk analysis.
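To show how little machinery the exercise requires, here is a minimal sketch of a qualitative risk register built around the three questions; the field names and the two sample risks are our own illustration, not a standard:

```python
from dataclasses import dataclass

@dataclass
class Risk:
    """One row of a minimal qualitative risk register, built around
    the three questions in the text."""
    event: str          # What event might occur during the project?
    impact: str         # What would its impact be?
    probability: float  # What is the probability that it will occur? (0.0 to 1.0)
    response: str       # What can we do to minimize or take advantage of it?

register = [
    Risk("Key supplier raises prices", "Budget overrun on materials", 0.30,
         "Negotiate a fixed-price contract now"),
    Risk("Senior developer leaves mid-project", "Two-week schedule slip", 0.10,
         "Cross-train a second developer"),
]

# The value is in asking the questions, not in the tooling: review
# the register at every status meeting and update it as you learn more.
for risk in register:
    print(f"{risk.probability:>4.0%}  {risk.event} -> {risk.response}")
```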
Strategies for Dealing with Risks
Let's imagine the following situation. The American public is tired of having lawyers, actors, and professional sports team managers for presidents; instead, because a government is a set of complex projects, Americans elect a professional project manager to run the country. Moreover, due to your demonstrated prowess in the delivery of successful projects, you are chosen to be Project Manager in Chief. In your first major international crisis, you are informed by your National Security Advisor that the Democratic Empire of Lawless Lands (D.E.L.L.) has plans to launch a new computer virus that will destroy all text documents on infected networks. What should you do?
Remember that you need to ask your National Security Advisor the following three questions:
- What may happen during the course of your project and what will be the impact? If the virus is launched successfully onto a national computer network, it will destroy all of the text documents on the infected network.
- What is the probability that it may happen? Your National Security Advisor estimates that there is a 5% chance that the attack will be successful. It is better to use an actual percentage for probability rather than a verbal description. Why? If the National Security Advisor says the chance is minimal, you may think he means 1%, while he may actually be implying 10%, which represents a large difference in the perception of the risk. State the estimated probability as precisely as possible to avoid this type of confusion.
- What can you do about it? This can be quite a difficult question to answer. As Project Manager in Chief, you have to decide what would be the best risk management strategy, given all the possible outcomes of your decisions.
Your National Security Advisor may give you a few options:
- Do nothing. In every set of choices there is always the option to do nothing. Perhaps it would not be such a bad thing if all the text documents were destroyed; it would certainly reduce red tape and bureaucracy. Unfortunately, the problem with bureaucracies is not the documents themselves, but rather the people who manage them. This “do nothing” option is called a risk acceptance strategy in risk management.
- Send agents to assassinate DELL's president. This strategy would probably not eliminate the threat, because the president of DELL is not actually the individual who would release the virus; in theory, though, it may deter those who would. This is called a risk mitigation strategy.
- Develop an antivirus program. This would also be a risk mitigation strategy, because the antivirus is not a 100% certainty and it may take some time to develop. Essentially, the risk has not been eliminated; only its probability and impact have been reduced.
- Let the Canadian Prime Minister deal with it. This is called a risk transfer. It is unclear whether the Canadian Prime Minister would take on this risk unless you provided something in return; perhaps eliminating duties on softwood lumber might persuade him, but this would entail political costs. The same happens any time you transfer risk: there will be a cost, because the party the risk is transferred to will expect some type of payment in return (for example, when you purchase insurance against the risk).
- Decide to discontinue the use of computers and computer networks in the government, and return to using paper and an abacus. This strategy is called risk avoidance. By eliminating the use of electronic documents, we manage to avoid the risk. People tend to try to completely eliminate risk, but as we can see from this example, it is not always the best strategy because it may require significant resources without significant gains.
The only way you, as the President, can select the best risk handling strategy is by performing a more detailed analysis.
How to Build a Rocket, or Risk Ranking
What if rocket science were not actually rocket science? If you are not an aerospace engineer or otherwise employed in this industry, here is a simple explanation of how to build a rocket. Basically speaking, rocket design is a fairly straightforward process. At a high level, it requires only engines, fuel tanks, and a payload. To ensure reliability, you can add many redundant systems and sensors, and reinforce the structure so it can withstand even the most extreme launch forces. Such a rocket would never explode, but it would also never fly, because it would be too heavy. To decide which systems or components will have the greatest effect on improving reliability and safety, and therefore should be included in the design, engineers must analyze and rank multiple risks. The simple way to do this is to multiply each risk's probability by its impact, as in the sketch below. Risks with higher ranks should be mitigated or avoided first.
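Here is a minimal sketch of such a ranking in Python; the risks, probabilities, and the 1-10 impact scale are invented for illustration (real programs use their own scoring schemes):

```python
# Rank risks by score = probability x impact, then work down from the top.
# The risks and figures below are invented for illustration.
risks = [
    ("Engine turbopump failure", 0.05, 10),  # (name, probability, impact on a 1-10 scale)
    ("Stage-separation failure", 0.02, 10),
    ("Avionics software defect", 0.10, 7),
    ("Fuel-line corrosion",      0.15, 4),
]

ranked = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)

TOP_N = 2  # mitigate only the highest-ranked risks; accept the rest
for rank, (name, p, impact) in enumerate(ranked, start=1):
    strategy = "mitigate" if rank <= TOP_N else "accept"
    print(f"{rank}. {name}: score = {p * impact:.2f} -> {strategy}")
```

Wherever the cutoff is drawn, the first risk just below it is simply accepted, and that is precisely the judgment call in the story that follows.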
This type of process is used by engineers at the SpaceX corporation. SpaceX is an American space transport company that builds the Falcon 1 and Falcon 9 rockets and the Dragon series of spacecraft, which will be launched into orbit by the Falcon 9 (Exhibit 2). NASA is planning to use SpaceX rockets for resupplying the International Space Station after the Space Shuttle retires in 2010. During the planning of one of the early launches of the Falcon rocket, SpaceX engineers decided to mitigate their ten most critical risks; for all remaining risks, they chose acceptance as the most effective strategy. In hindsight, almost predictably, the launch failed because the 11th-ranked risk occurred (Insprucker, 2008). But were the engineers incorrect in their ranking, or should they have included the 11th risk in their mitigation plans?

Exhibit 2 – Computer Simulation of SpaceX's Falcon 9 and Dragon Spacecraft Lifting Off from Cape Canaveral, Florida
Should We Protect Commercial Airplanes Against Surface-to-Air Missile Attacks by Terrorists?
Do you protect yourself against dog bites? You could wear special Kevlar pants that would be difficult to bite through, or you could carry a T-bone steak to distract menacing dogs while you climb the nearest tree. Or do you not give the risk much credence at all, certain that you will never be bitten? Unless you are a mailman, we doubt that you are taking all the necessary precautions. Why? Because if you had done a risk assessment, you would probably have come to the same conclusion as everyone else around you: the chance that you will be bitten by a dog is very slight. In fact, this is an illusion. The most recent official survey, conducted more than a decade ago, determined there were 4.7 million dog-bite victims annually in the United States. A more recent study showed that 1,000 Americans are treated in emergency rooms every day as a result of dog bites (Dog Bite Law, 2010). In 2007, there were 33 fatal dog attacks in the United States, and losses due to dog attacks exceed $1 billion per year, with over $300 million paid by homeowner's insurance. When you choose how to deal with a potential dog attack, you intuitively assess the probability and, to a lesser extent, the impact of the risk. Because neither seems very significant, you decide not to take any precautions, except perhaps staying away from mean-looking dogs.
Here is another example. A few years ago, the government asked decision analysis experts to conduct research on whether special defensive equipment should be installed on commercial aircraft to protect against surface-to-air missile attacks by terrorists. One of the motivations behind this research was a failed attempt by terrorists in Kenya to shoot down an Israeli commercial airplane in December 2002, using shoulder-mounted missiles similar to the relatively compact Stinger missiles featured in the James Bond movie “Licence to Kill” (Exhibit 3).

Exhibit 3 – FIM-92 Stinger Missile Launcher: Can Terrorists Use It Against Commercial Planes?
Here is a brief description of the problem the experts were asked to address. There is a chance that terrorists will try to use such missiles to shoot down planes. The anti-missile technology under consideration is available for military planes but is very expensive. Can the government justify the cost of installing this equipment on every commercial plane operated in the United States, given the potential risk? The researchers first analyzed the chance that terrorists would be able to mount such an attack, and then the chance that such an attack would actually bring down a plane (von Winterfeldt, 2008). Once this was determined, they calculated the cost in monetary terms if a plane were shot down. Finally, they calculated the costs of installing and operating the missile defense equipment on every plane and found that it would be very expensive: millions of dollars per plane. The researchers concluded that, unless the cost of the equipment was drastically reduced, it would not make economic sense to install the devices. The results of the study were presented to policymakers, who agreed not to require the installation of these devices. The current risk management strategy is to accept this risk, at least for now.
You may wonder whether a straightforward economic cost/benefit analysis is the right way to make this kind of decision. What about the cost in human life and suffering, and the grief of loved ones? How can these be measured? They cannot be measured in any fully satisfying way, but you still need some measure in order to assess the risk and make decisions about it. Analysis of the potential loss is a valid approach that will help you decide on a course of action, and the concept is very simple:
- Calculate the potential loss, which is the cost you will have to pay if the risk occurs. For example, as president you are told the potential loss caused by the DELL virus is approximately $100 billion.
- Calculate the cost of mitigation efforts. If you decide to develop an antivirus program, it is estimated to cost $10 million.
- Calculate the total cost associated with the risk: the potential loss multiplied by the probability of the risk, plus the cost of mitigation efforts. In our example, it would be $100 billion (potential loss from the virus attack) × 5% (probability) + $10 million (antivirus development) = $5.01 billion (see the sketch after this list).
- Perform similar calculations for different risk management strategies. If you decide to transfer the risk to the Canadian Prime Minister, the potential costs in terms of political capital and lost forestry jobs in the United States may make it one of your less advisable courses of action.
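Here is a minimal sketch of this comparison in Python. The $100 billion loss, 5% probability, and $10 million antivirus cost come from the example above; the residual probabilities and the transfer and avoidance costs are invented purely for illustration:

```python
# Total cost of a risk under a given strategy:
#   (probability remaining after the response) x (potential loss) + cost of the response.
# The paper's simple formula uses the original probability; using the residual
# probability instead shows why mitigation can still be worth paying for.
POTENTIAL_LOSS = 100e9  # $100 billion if the virus attack succeeds

strategies = {
    # name: (residual probability after the response, cost of the response)
    "Accept (do nothing)":      (0.05, 0.0),
    "Mitigate (antivirus)":     (0.01, 10e6),  # residual 1% is an invented figure
    "Transfer (to Canada)":     (0.05, 2e9),   # political cost, invented figure
    "Avoid (paper and abacus)": (0.00, 50e9),  # cost of abandoning computers, invented
}

for name, (residual_p, response_cost) in strategies.items():
    total = POTENTIAL_LOSS * residual_p + response_cost
    print(f"{name}: ${total / 1e9:,.2f} billion")
```

On these invented numbers, mitigation wins by a wide margin, while avoidance, despite reducing the risk to zero, is the most expensive option of all, which is the zero-risk illusion in numerical form.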
Risk Engineering
Bridges across rivers are designed to withstand large floods. But what if there is a massive, once-in-100-years flood? A flood like this will probably destroy most bridges, but this is not a design flaw; in fact, it is part of the construction code. Can a bridge be built to withstand these types of events? Of course it can, but the cost would be prohibitive. Instead of having many conveniently located river crossings with fast-flowing traffic, you would have only a few, and traffic would slow to a crawl. Because the chance of extreme events occurring is relatively small, it is cheaper to rebuild a bridge if it is destroyed than to over-engineer it in the first place. Bridge engineers must select the right balance between different risk mitigation strategies to make the bridge cost effective.
Risk engineering involves accepting, mitigating, avoiding, and transferring certain risks in such a way that the final project is both cost effective and less risky. This requires that you analyze different combinations of risk management strategies across the full set of project risks.
When considering risk engineering, what is most important is that it is performed continuously over the course of a project. During the project life cycle, the risk management strategy may change based on new information, and the balance between the various risk handling strategies will change as well. If, as a result of the unsuccessful SpaceX rocket launch, the 11th-ranked risk is now considered critical for future launches, it must be avoided. However, because not all risks associated with the rocket can be avoided, the strategy for another risk may have to be shifted from avoidance to mitigation. In the example regarding surface-to-air missile protection for commercial airplanes, the cost of such systems may go down, in which case switching the risk management strategy from acceptance to mitigation would become viable.
When Quantitative Risk Analysis is Necessary
John Brokenhead is two things: a professional criminal and a poor project manager. He is currently serving time in a state penitentiary for a failed bank heist. He lent some of his tools to his son for a school science project and, as a result, did not have them with him when he tried to open the bank vault. Now, he sits in his cell planning his next project: escaping from prison. He has already created a preliminary plan (Exhibit 4); following are his planned activities:
- Cut through the bars on the windows: estimated at 30 minutes; however, there is a 50% chance that his nail file will not be very efficient, which could add an additional 10 minutes.
- Jump from the window and carefully walk toward the outside fence, avoiding discovery by the guards. He estimates that it will take around 15 minutes; however, there is a 30% chance that the guard dogs will be alerted and start barking. Additional evasive tactics will cause a delay of 10 minutes.
- Climb the fence. John has noticed that the guard, T.I. Sherlokholmes, who will be on watch duty in the tower, spends 75% of the time talking on his cell phone to one of his three girlfriends and pays no attention to the fence during this time. John has to wait an average of 5 minutes until one of the girlfriends calls. However, there is also a 10% chance that the guard will unexpectedly get a call from a new girlfriend, which would reduce John's wait by 5 minutes.
- Jump into the car of his associate, Jack Brokenskull, who will be waiting for John on the other side of the fence.

Exhibit 4 – Prison Escape Plan
The plan is simple, but there is one additional complication. Jack Brokenskull cannot stop his car by the fence for a long period of time, and John cannot wait for the car. The car can be beneath the spot where John comes over the fence for a window of only 10 minutes. The question is: When should John start cutting the bars to make sure he lands in the car with a 95% probability?
This is an example of a situation in which the question cannot be answered without a quantitative analysis. John Brokenhead must perform this analysis before starting his escape. To start with, he has to create a schedule (Exhibit 5) in the form of a Gantt chart; he then draws the risks associated with each task as arrows on the chart. The project has three threats and one opportunity (a new girlfriend deciding to call Sherlokholmes). Gantt charts with arrows representing risks are called event chain diagrams. Threat arrows point down and opportunity arrows point up, which is quite simple and intuitive. If threats or opportunities are related to each other, they are connected by lines; for example, if a guard dog starts barking, Sherlokholmes may stop his conversation with a girlfriend. The size of an arrow represents the probability of the risk. Event chain diagrams can significantly simplify risk analysis.

Exhibit 5 – Prison Escape Schedule with Risks Shown as an Event Chain Diagram
Now, John Brokenhead should use a software program to perform the analysis. He enters the project schedule and all associated risks, assigns risks to activities, defines their probabilities and impacts, and performs a calculation.
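The paper does not show the calculation itself, but a minimal Monte Carlo sketch in Python conveys the idea. The durations and risk figures come from John's plan above; treating each duration as fixed and each risk as a simple yes/no event is our simplification (a real scheduling tool would sample distributions for both):

```python
import random

def escape_time(file_dull_p=0.5, dog_barks_p=0.3,
                no_phone_call_p=0.25, new_girlfriend_p=0.1):
    """One simulated escape, in minutes, using the figures from John's plan."""
    t = 30                                   # cut through the bars
    if random.random() < file_dull_p:        # threat: nail file is inefficient
        t += 10
    t += 15                                  # walk to the fence
    if random.random() < dog_barks_p:        # threat: guard dogs are alerted
        t += 10
    t += 5                                   # average wait before climbing the fence
    if random.random() < no_phone_call_p:    # threat: guard is not on the phone
        t += 5
    if random.random() < new_girlfriend_p:   # opportunity: a new girlfriend calls
        t -= 5
    return t

def finish_window(trials=100_000, low=0.05, high=0.95, **risks):
    """Return the low and high percentiles of the simulated finish time."""
    times = sorted(escape_time(**risks) for _ in range(trials))
    return times[int(low * trials)], times[int(high * trials)]

random.seed(1)
early, late = finish_window()
print(f"90% of simulated escapes finish between {early} and {late} minutes,")
print(f"so Jack would have to idle roughly {late - early} minutes at the fence.")
```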
The result of the analysis shows that Jack Brokenskull must wait in the car for 22 minutes to ensure that there is a 95% chance that John Brokenhead will not be discovered, which is significantly longer than John originally estimated. According to the analysis, the chance of a successful prison escape is only about 70%. John Brokenhead is very risk averse, so he abandons the escape plan.
If John Brokenhead wants to increase his chances of escaping from prison, he will have to perform some risk engineering. His prison escape plan includes three risks:
- Nail file doesn't cut quickly enough. Originally, the chance that this risk would occur was 50% and the impact was a 10-minute delay. John believes he can avoid this risk by using a good hacksaw.
- The guard dog starts barking. John Brokenhead cannot do anything about this and must accept this risk.
- Sherlokholmes does not speak with one of his girlfriends. Originally, there was a 25% probability that it would cause a delay of 5 minutes. What if John Brokenhead found an additional girlfriend for Sherlokholmes? This would reduce the probability to 15%.
Now, we can perform the analysis again. The results show that Jack must park near the fence for 15 minutes to ensure that there is a 95% chance that John will cross the fence while the car is there and not be discovered. This is better, but still not good enough, and it requires John to find a hacksaw and a new girlfriend for Sherlokholmes. Perhaps John could try a different scenario for dealing with these risks: he could slip some drugs into the dogs' food, which would mitigate the barking risk, leaving him some extra time so that he would not need the hacksaw. As part of risk engineering, we recommend running the analysis multiple times with a different risk management plan for each risk to determine the best course of action.
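Continuing the sketch above, trying out these plans amounts to re-running the simulation with adjusted probabilities; the residual 5% barking probability for the drugged dogs is our own invented figure:

```python
# Re-run the simulation from the earlier sketch under different risk plans.
scenarios = {
    "original plan":                dict(),
    "hacksaw + extra girlfriend":   dict(file_dull_p=0.0, no_phone_call_p=0.15),
    "drugged dogs, keep nail file": dict(dog_barks_p=0.05),  # residual 5% is a guess
}

for name, risks in scenarios.items():
    early, late = finish_window(**risks)
    print(f"{name}: finishes between {early} and {late} minutes "
          f"(Jack idles ~{late - early} minutes)")
```

Comparing the printed windows side by side is exactly the kind of what-if loop risk engineering calls for: change one response, re-run, and see whether the waiting window finally fits inside Jack's 10 minutes.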
Unfortunately, most criminals do not perform risk analyses before engaging in criminal activities; if they did, they probably would not attempt the crimes in the first place. Project managers often follow the same path and skip risk analysis, despite having all the tools at their disposal to ensure that they do not expose their projects to unnecessary risk.
Conclusions
Most projects fail because of errors in judgment. These predictable mental errors, or illusions, such as overconfidence, confirmation traps, optimism bias, zero-risk bias, and the sunk-cost effect, often lead to the underestimation of costs and effort, poor resource planning, and other poor decisions. Every year, illusions in project management lead to multi-billion dollar losses. Simple, structured analysis of the situation, and particularly risk analysis, helps people overcome these illusions and can improve their judgment. Risk engineering provides a way to balance the effects of these psychological illusions, and the risk engineering process helps to manage project and portfolio risks on a continuous and consistent basis.