Project Management Institute

Risk analysis

how good are your decisions? Part 1

by Steve Pascale, PMP, Louis Troilo, and Carlton Lorenz

IMAGINE THAT YOU'RE the captain of an expedition climbing Mount Everest. You are responsible for keeping up with supplies, monitoring the weather, and keeping your team healthy. Above all, you have to make the decisions that will keep your team together and on schedule, in spite of illness or foul weather. Make a bad call and your team either runs out of food or finds itself in the middle of a Himalayan winter. Now that's accountability! Wouldn't it be nice to have help in making those life-or-death decisions?

That's what the climbing team of Ultima Thule thought back in 1984, when they took a laptop computer on their ascent of Everest ("Portable PC's Peak Performance," PC Magazine, 6 March 1984, p. 60). Their computer kept up with supplies, tracked the team's daily progress, and monitored the weather. It also stored historical statistics on Everest, allowing the captain to consider the possible outcomes of his decisions before he made them. While it couldn't predict the future, the Ultima Thule team's computer did help them make very good decisions.


Making good decisions is at the heart of risk analysis. Risk, in its most basic form, is the uncertainty associated with any outcome; it can lie in the probability that an event occurs or in the impact of the event if it does occur (Robert A. McKim, "Risk Management—Back to Basics," Cost Engineering, 1992). Quite simply, risk analysis is the interpretation of past events to help make decisions about the future. It involves analyzing potential outcomes and the odds that any one of them will happen. Risk occurs when an event is certain to happen but its outcome is uncertain, or when the outcome is certain but the event's occurrence is uncertain (McKim).

While risk analysis won't predict the future, combined with your own experience and good judgment it can help you make some strong educated guesses.

Risk and Knowledge. The rationale behind risk analysis is simple: knowledge is power. The more you know about what could happen, the better equipped you will be to make decisions about what to do. Why buy a new car today at full price if you know that tomorrow cars will be on sale for half-price? To the project manager, risk analysis is a way to find out when cars are going to be on sale.

Exhibit 1. Computer modeling is one way to “practice the future” and surface possible risks. Note: This date probability graph was developed using Monte Carlo 3.0 for Primavera risk analysis software. It displays a 9 percent chance of completing the climb by 12 December 1997. This finish date was calculated using only CPM.

To the project manager, risk analysis means a better, more realistic project schedule. That's because risk analysis is more than a blind guess about how long a project will take. It is a calculated estimate based on verified data. Analyzing risk can also identify problem areas that have caused delays on projects in the past and may slow things up again. The result is a schedule that is more accurate, and a project that has a better chance of being completed on time.

Risk Assessment Options. When it comes to analyzing risk, managers currently have a number of options available to them. The method that provides the most detailed information, and the one used by the Ultima Thule expedition, is computer modeling. Computer modeling involves the creation of a computer program that performs tasks in a particular order, much as they would be performed in an actual project execution. Inserted into the program are three possible times that may be necessary for the completion of each task: a realistic estimate, a pessimistic estimate, and an optimistic estimate. Once the model is determined to be fair and accurate, it is set in motion for a number of iterations. In each iteration the computer randomly selects a time for each task from the range of possible values provided. The program then combines these times, along with other factors, to produce a single iteration of the model.

Most simulations execute a model for 150 to 1,500 iterations. As the simulation runs, the computer records information on how long each task took and how long the total project took. The information on how the system performed is used to produce probability tables that identify key statistics on the system, like the average completion time, the critical path, and the probability of completion within a specified budget. Tasks that consistently slow up the process are identifiable and can be studied further as potential problem areas. With this information in hand, how well the system works can be analyzed and design improvements can be made before anything is ever built or put into operation.
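The iteration loop described above can be sketched in a few lines. This is a minimal illustration, not the expedition's actual model: it assumes a simple serial network, samples each task's duration from a triangular distribution spanning its three estimates, and the task names and day counts are hypothetical.

```python
import random

# Illustrative three-point estimates per task, in days:
# (optimistic, realistic, pessimistic). All values are hypothetical.
tasks = {
    "establish base camp": (8, 10, 15),
    "acclimatize":         (12, 14, 21),
    "summit push":         (3, 5, 9),
}

def simulate(tasks, iterations=1000):
    """Run the model many times, sampling a duration for each task each pass."""
    totals = []
    for _ in range(iterations):
        # random.triangular(low, high, mode) draws between the optimistic
        # and pessimistic limits, peaking at the realistic estimate.
        total = sum(random.triangular(opt, pess, real)
                    for opt, real, pess in tasks.values())
        totals.append(total)
    return totals

totals = simulate(tasks)
# The recorded totals yield the statistics the article mentions, e.g. the
# probability of completing within a 35-day window:
p_on_time = sum(t <= 35 for t in totals) / len(totals)
print(f"Average duration: {sum(totals) / len(totals):.1f} days")
print(f"Chance of finishing within 35 days: {p_on_time:.0%}")
```

A real schedule model would also respect task dependencies and parallel paths rather than simply summing durations; the loop structure, however, is the same.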

The Power of Modeling. The argument for modeling over other risk analysis methods is fairly straightforward. Modeling provides the most detailed information to the project manager. Not only does the model highlight critical paths and possible trouble spots, it also provides probability information on how likely a particular outcome is. This information can be used to construct a probability table that can predict the chances of making or missing a schedule. The Critical Path Method, however, uses only one estimate of task completion time, and cannot provide information about the probability of meeting or missing a schedule. Using a single estimate leaves the manager with no room for error in the event that the estimate is wrong. More important, decisions based on single-estimate methods are subject to greater risk.

Both CPM and PERT have their advantages, and may be used in conjunction with a computer model for even more effective analysis. However, when executing the project, the only analysis that those methods can supply is whether or not the schedule was met. On the other hand, as tasks are completed, the network model using risk analysis software can be updated with real information to recalculate the project risk and identify new problem areas. In a March 1995 Project Management Journal article, David Hulett said that "to relate cost and schedule risk together, the schedule must be resource loaded." On projects where there is a high degree of uncertainty, the additional information provided by modeling cost and resources is crucial to the success of the project.

Getting Started in Modeling. If you are not building the model yourself, the most important step in getting started is to find a simulation planner (the person who will actually create the simulation) whom you trust. Having confidence in your planner is critical, because the usefulness of the model depends on how well it was constructed and how accurately it reflects reality. Much of the model-building process involves close interaction between the planner and other members of the project team, so it also helps if you are all comfortable working together. (For questions to ask a computer simulation planner, see Van Norman in the May 1993 issue of Industrial Engineering.)

With your planner on board, it's time to get started. Here are a few of the most important things to remember.

Be Clear. Make sure that the planner understands what you want out of the model. Discuss your expectations, and ask about the limitations of computer simulation. It may be helpful to draft a written list of objectives with your planner to ensure that you both know where the model should be going. It is very important that you and the planner understand each other, because if the model is not exactly what you need, the information it provides will be useless to you.

Keep It Simple. Simple is better. A more complex model may provide slightly better information, but it will also create much bigger headaches for everyone involved. A simple model is easier to build, easier to test, and easier to modify throughout the life of the project. It also provides results quicker than a complicated model, and usually provides enough detail to meet your simulating needs. The more complex a model becomes, the greater the chances of errors creeping into it. Making decisions based on a faulty model may introduce more risk into your project than doing no risk analysis at all. Remember, too, that once the project gets rolling you will want to update the model to keep a check on your risk. Start out with a complicated model and things can only get worse.

One strategy for maintaining simplicity in a model is to lay out single-point, deterministic CPM values for the entire network, then model only the major milestones or the areas that intuitively look like they may cause problems, keeping the other components constant. The model will run faster and the chances for error in the model will go down. Assigning probabilities to all 10,000 activities in a large network would be tedious and counterproductive. A better idea would be to concentrate on those activities in areas of the project that have a history of bottlenecking. Once a project begins, analyzing short-term areas will assist in meeting that project's long-term goals.
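This hybrid approach can be sketched as follows: most activities keep their single CPM duration, and only the historically risky ones carry a sampled range. All task names and numbers here are hypothetical.

```python
import random

# Mostly deterministic network: single CPM durations in days.
fixed = {"design": 20, "procurement": 15, "testing": 10}

# Only the historically troublesome tasks get a (low, mode, high) range.
risky = {"fabrication": (25, 30, 45), "installation": (8, 10, 20)}

def one_iteration():
    """One pass: constants summed directly, risky tasks sampled."""
    total = sum(fixed.values())
    for low, mode, high in risky.values():
        total += random.triangular(low, high, mode)
    return total

durations = [one_iteration() for _ in range(1000)]
print(f"Mean project duration: {sum(durations) / len(durations):.1f} days")
```

Because only two of the five tasks are sampled, each iteration is cheap, and any surprising spread in the results can be traced directly to the activities you chose to model probabilistically.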

Get Good Data. A model is only as good as the data that goes into it. The information used in a model should be as recent, relevant, and precise as possible. The people who actually perform a task provide one of the best sources for good data. A machine-press operator is a much better judge of how long it takes to manufacture a single part than is the vice president of manufacturing. The person on the floor is also more aware of the things that slow a process down, and may see problems in your system that no one else would notice.

Another excellent source of information for a model is old data from identical or similar systems. Why build a model on estimates when you can use actual archival information? Even if a system is only somewhat similar to what you are designing, its data may still be relevant. Getting the best possible data is the key, no matter where it comes from.

Having amassed a data set for each task, the problem of choosing the best estimate ranges for those tasks still remains. Remember that three values, and their corresponding probabilities, constitute the estimate range for each task: realistic, pessimistic, optimistic.

One method of selecting estimate ranges is the 10–80–10 rule. According to this rule, the data should be sorted into statistical order. The value that occurs 80 percent of the time becomes the realistic estimate. The value that is better than the realistic estimate, and occurs only about 10 percent of the time, becomes the optimistic estimate. Similarly, the value worse than the realistic estimate that occurs only about 10 percent of the time becomes the pessimistic value.
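One reading of this rule, treating the 10 percent cutoffs as percentiles of sorted historical data, can be sketched as follows. The observed durations are hypothetical, and taking the most frequent value as the realistic estimate is one interpretation of "occurs 80 percent of the time."

```python
import statistics

# Hypothetical observed durations (days) for one task, from past projects.
observations = [9, 10, 10, 11, 10, 12, 10, 9, 10, 14,
                10, 8, 10, 11, 10, 10, 13, 10, 10, 10]

data = sorted(observations)           # "organized into statistical order"
n = len(data)

optimistic  = data[int(0.10 * n)]     # boundary of the best ~10 percent
pessimistic = data[int(0.90 * n)]     # boundary of the worst ~10 percent
realistic   = statistics.mode(data)   # the value that dominates the sample

print(optimistic, realistic, pessimistic)
```

For this sample the sketch yields an optimistic estimate of 9 days, a realistic estimate of 10, and a pessimistic estimate of 13, which would feed directly into the three-point ranges the simulation samples from.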

Another approach to selecting range estimates is to use Bayes risk. Bayes risk is the average risk for a group of values. To calculate it, multiply each value by its probability, then combine the weighted values into a single probability-weighted average. The idea behind using Bayes risk to select a range estimate is to choose three values that, when averaged, carry the lowest possible risk.
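Read as a probability-weighted average (an expected value), the calculation is a one-liner. The outcomes and probabilities below are hypothetical.

```python
# Hypothetical duration outcomes (days) paired with their probabilities.
outcomes = [(9, 0.10), (10, 0.80), (13, 0.10)]

# Weight each value by its probability and combine:
# 9*0.1 + 10*0.8 + 13*0.1 = 10.2 days.
bayes_risk = sum(value * prob for value, prob in outcomes)
print(round(bayes_risk, 2))
```

Comparing this weighted average across candidate three-value ranges lets you pick the range whose average carries the lowest risk, as the article suggests.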

One of the principal advantages of modeling is the ability to update the model. As you proceed through a project, be sure to insert the actual dates and cost values for tasks as they are completed, and re-run the model. The new information generated will further increase the accuracy of your schedule, and make your decisions even better informed.

Consider Everything. In analyzing risk for a project, it's good practice to consider every risk, even the most improbable. In developing a computer network for a client it may not seem important to consider the chances of a hurricane—that is, unless your equipment supplier is located in South Florida. Other risk factors are so obvious that they may be overlooked. For example, most managers assume that new technology will translate into immediately better results from employees. However, it takes time for people to learn to use new equipment. Estimating increased productivity without considering the learning curve will lead to a disappointed manager and to a project that starts off behind schedule.

Although it is not possible to factor every potential risk into a model, consider as many as possible. It may be wise to factor seasonal weather into a few of those simulations, just to see what happens. Considering a broader range of possibilities will add contingency (extra time) to your schedule but may also lower your risk.

Verification and Validation. Just because your planner is a programming wizard doesn't mean that the model you get will be perfect. It is important that you check the model to make sure that it behaves logically and consistently.

Is It the Right Model? The first test to run on a model is a verification, making sure that the risk simulation performs roughly as anticipated. The criteria for verification are those objectives that you and your project planner agreed upon back in the planning stage. Verification is largely a debugging process performed by the project planner and manager, although it is still a good idea to have other project team members involved in the process.

Make sure when the model is verified that someone other than the creator of the model does the work. A newspaper editor's job is to catch the mistakes that news writers do not see. A model verifier should catch the mistakes that the model's creator may be unable to identify.

One useful way for a planner to debug a model is to look at graphical reports (as shown in the exhibit) and tabular data reports together. Bear in mind, however, that a single iteration represents only one run through the system, with only one set of assumptions, distributions, and parameters. A sample of one is typically not enough to pass or fail your risk assessment.

Is the Model Right? Once verification has been performed, the second key test of a model is validation: a check to determine if the model really does simulate the behavior of the project about to be attempted. The easiest way to validate a model is to compare its output to that of a comparable real project. If they are close, then you probably have a pretty good model.

Use common sense when validating. Ask yourself, “Does the output seem probable?” If not, spend some time examining it closer. That is not to say that you should dismiss any data different from that which you would expect, but definitely take a closer look.

Along a similar line, make the planners convince you that the model is valid. The verification process is largely intended to convince the planners that they have built a correct model. The validation process is for the planners to convince the project manager that the model performs as it is designed to perform. Have the planners explain the results plainly, and raise any questions you may have, even small ones. Once they have made a convincing case, accept the model as valid, and be prepared to use the data.

EVERY UNDERTAKING INVOLVES a degree of risk, whether it's climbing a mountain or designing an assembly line. For project managers, analyzing that risk effectively is the key to making the best possible decisions about scheduling resources and setting reasonable budgets. Computer planning models, Monte Carlo risk assessment, and simulation are tools that help prepare projects for the unexpected. ■

Steve Pascale, PMP, is owner of Pascal Product Advancement in Boston, Mass., which specializes in new product development.

Lou Troilo is a customer support consultant with SAP America. He previously taught courses and consulted for users of Primavera's project management and risk analysis software.

Carlton Lorenz is a public relations student studying for his master's at the University of Georgia.

This material has been reproduced with the permission of the copyright owner. Unauthorized reproduction of this material is strictly prohibited. For permission to reproduce this material, please contact PMI.

PM Network • February 1998


