Why good project managers are making bad choices

An introduction to project decision-making

Abstract

Project management is the art and science of human interactions performed by one group of people to meet other people's needs. The overwhelming majority of problems in projects are due to the unforeseen consequences of intentional or unintentional human actions. People make poor estimates, forget something, communicate poorly, or make other seemingly small mistakes that conspire together to lead to larger issues. This paper is about how good and experienced project managers make bad choices and what they should do to avoid them. One of the ways to improve the quality of project decisions is choice engineering: the creation of processes or environments in which people are steered towards making better choices, rather than having those choices mandated.

Project Illusions

Why do we, humans, consistently make the same mistakes? Take a moment to look at the pyramid in Exhibit 1 (Virine & Trumper, 2013). Which area appears darker: A or B? Area A should look darker, though in fact both areas are the same shade and color. This is an example of an optical illusion. But our brain is not only subject to optical illusions; we are subject to many different types of illusions because of the manner in which we are wired to perceive the world. If it is so easy to trick our brain with such a simple picture, imagine how easy it is to fall prey to other illusions in much more complex situations, like the ones you commonly confront in project management. People consistently make mistakes because of illusions, which lead to faulty judgment and poor decisions.

Exhibit 1 – “Magic Pyramid”

Project management processes are established to minimize the negative effects of such illusions and to manage projects better. The Project Management Institute's A Guide to the Project Management Body of Knowledge (PMBOK® Guide) defines a framework for such processes (PMI, 2013). The problem is that processes are hard to establish and follow, they are costly, and they often do not persist unless great effort is taken to maintain them (Virine & Trumper, 2007). People inherently do not like restrictions imposed on them by these types of frameworks.

In many cases, especially for smaller projects or particular project activities and phases, comprehensive and strict processes may not be necessary. There is a much less expensive way to avoid the negative impact of illusions: it is possible to create an environment in which people make better choices without mandating those choices. Think about the common speed bump. Instead of wasting the police's time monitoring speed in parking lots, speed bumps encourage people to make a good choice (in this case, limiting their speed). People choose to slow down not because somebody might give them a ticket, but because it is more comfortable and easier on their vehicles. Project management processes must be policed, but an environment for making better choices can be engineered: speed bumps are engineered to continuously steer people towards better choices. People use choice engineering frequently in many different industries, including project management, sometimes without even knowing it. In most projects, process policing and choice engineering work together quite successfully.

Why Don't People Perform Even a Simple Analysis?

On September 15, 2008, Lehman Brothers filed for Chapter 11 bankruptcy protection following the massive exodus of most of its clients, drastic losses in its stock, and the devaluation of its assets by credit rating agencies. Why did one of the largest and oldest financial firms, with $691 billion in assets, collapse so rapidly? Superficially, we have been told that its heavy investment in subprime mortgages and associated derivatives was the catalyst that set off the fall of Lehman Brothers. But how did its army of highly educated MBAs and powerful financial models fail to foresee this risk and communicate the threat to the decision-makers at the helm of Lehman Brothers and other related financial institutions so that they could do something about it? Sadly, the truth is that the senior management of Lehman Brothers, particularly CEO Richard Fuld, was well aware of the subprime mortgage problem, having been warned on multiple occasions, but deliberately chose to ignore these warnings. Moreover, management carried out a campaign to silence individuals who talked about these risks (McDonald & Robinson, 2009). Was this arrogance, ambition, greed, or something else?

Lehman Brothers worked within a framework of government regulations. Government, in this case the Federal Reserve, is supposed to ensure that financial crises like the subprime meltdown never happen. Did they (the Federal Reserve) see the danger in the type of financial practices associated with subprime mortgages? Apparently yes, but for a long period of time they believed that the problems associated with subprime mortgages would be localized and could not bring down the entire economy (Wessel, 2009). Macro-economic analysis is not a trivial calculation like simple arithmetic, but surely the Federal Reserve, with its significant resources, expertise, and mandate to oversee the economy, would be able to foresee the unintended consequences of the financial decisions being made by the major US financial institutions. As it turns out, they did make mistakes, and there are at least three reasons for this.

In complex situations where potential issues have been identified, it is generally obvious that an in-depth analysis would help in deciding on a proper course of action. Low-quality decisions are usually the result of one of three things:

  1. No or insufficient analysis. This is common in many projects, but was not the case at Lehman Brothers or the Federal Reserve.
  2. The analysis is partially or completely incorrect. In our example, the analysis was probably partially correct. The economists in both the Federal Reserve and Lehman Brothers created very complicated mathematical models; however, these models often cannot account for novel or emerging economic processes, in this case the combination of derivatives and subprime mortgages.
  3. Decision-makers amend, ignore, misinterpret, or override the results of the analysis. This is what most likely happened at Lehman Brothers.

Financial organizations like Lehman Brothers, as well as the Federal Reserve, are not run by computers (though given recent events it may not be so outrageous an idea); they are run by people who have discretion over whether or not to accept the recommendations that come from an analysis. As we learned before, people's perception of reality is subject to illusions. People are often under the illusion that analysis is either not necessary or that their judgment is better than the direction provided by the analysis. Here is a paradox:

  • We (humanity) consistently fail to make the best decisions given the circumstances because we are subject to mental errors.
  • To uncover these mental errors and see the correct path, we need to perform some sort of analysis.
  • Unfortunately, we often do not perform sufficient analysis because of the mental error of believing that following our own intuition will lead to a better outcome. In other words, we fail to overcome mental errors because we are subject to yet more mental errors.

This leads us to the question, “What types of mental errors make people ignore and misinterpret the results of their analysis?”

Overconfidence

Russian real estate developer Shalva Chigirinsky had extraordinarily ambitious project plans. One of the richest men in Russia, with a net worth of $2.8 billion in 2008, Chigirinsky had a program to build multiple shopping centers and towers (Forbes, 2008). In particular, Chigirinsky planned to build a huge hotel and entertainment center right next to the Kremlin in Moscow. The complex was designed by the famous architect Norman Foster. In order to free space for the complex, Chigirinsky's company demolished a 3,200-room hotel. In addition, Chigirinsky wanted to build the tallest skyscraper in Europe, which was to be called the “Russia Tower.” Rising more than six hundred meters above the ground, the tower would top out at 118 floors (Ermakova, 2007).

Unfortunately, there is now scant visible evidence of Chigirinsky's ambitious plans. The aftermath of these failed projects is chiefly an empty field across the street from Red Square and an empty lot where Russia Tower was supposed to stand. The unfortunate Chigirinsky currently finds himself the main protagonist in a series of lengthy court battles. We believe the root cause of Chigirinsky's failure was his overconfidence. At the beginning of his career, he was a very successful businessman and project manager and, as his businesses grew, his confidence grew. At some point, his confidence, which had been one of his great strengths, became a source of weakness. He became overconfident in his ability to manage enormous projects.

Shalva Chigirinsky made at least three mistakes, which are common in project management:

  1. He was too confident that his business connections would open any doors for him and that these doors would remain open for a long time. In particular, he had befriended Moscow's mayor Yuri Luzhkov, who was the de facto “czar” of the city. But nothing is forever; relationships can go sour. Yuri Luzhkov was dismissed by the Russian president in 2010.
  2. Chigirinsky was also overconfident about his own resources and money-raising abilities. Overconfidence in the quality and quantity of resources, both human and financial, is one of the most common mental errors in project management.
  3. Chigirinsky's overconfidence led him to actively ignore and underestimate various potential risks, particularly the risk of a financial meltdown.

Overconfidence in decision-makers is one of the major reasons why analysis is either not performed or the results of analysis are ignored. It is one of the most common biases in project management.

Here are some interesting facts about overconfidence (Plous, 1993):

  1. Overconfidence is independent of intelligence. This means that billionaire Shalva Chigirinsky and a bottle picker who dropped out of school in grade one may both have the same level of overconfidence (Lichtenstein & Fischhoff, 1977). The real difference is that if Mr. Chigirinsky underestimates the required project cost, it may result in losses of millions of dollars; if a bottle picker overestimates the number of bottles he might find, he may be short a few dollars.
  2. More information does not necessarily improve the accuracy of our decisions, but it may significantly increase our level of confidence. Practically, this means that the more you learn about a subject, the more confident you will be about your judgment regarding that subject, but your decision may still be incorrect (Exhibit 2). Managers can have many years of experience in an industry, but can still make poor judgments. This is a very common phenomenon with executives and project managers.
  3. Overconfidence is not destiny and can be moderated. If people receive regular feedback regarding the results of their decisions, over time they will exhibit little or no overconfidence (see the sketch after Exhibit 2). For example, professional bridge players or weather forecasters are less overconfident than project managers, who manage many different types of projects.
  4. If you ask a person to explain why their decisions may be wrong, and get them to play devil's advocate to themselves, this will reduce overconfidence (Plous, 1993). For example, if Shalva Chigirinsky had been asked to explain why he wanted to invest money in such large projects, he might have rethought his decision and rescinded his development plans. Answering questions or understanding an opposing perspective might have pushed Chigirinsky and others in his position towards a more balanced and perhaps better analysis of the problem.
Exhibit 2 – Accuracy vs. amount of information
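
To make the feedback idea in point 3 concrete, here is a minimal sketch of what such a feedback loop might look like, assuming a manager keeps a simple record of estimated versus actual durations. The task names and numbers are purely illustrative.

```python
# A sketch of a decision-feedback loop: compare past duration estimates
# against actual outcomes so an estimator can see his or her own bias.
# Task names and numbers are purely illustrative.

history = [
    # (task, estimated_days, actual_days)
    ("design review", 5, 8),
    ("vendor selection", 10, 13),
    ("prototype build", 20, 31),
]

def calibration_report(records):
    """Print how far off each estimate was, plus the average bias."""
    ratios = []
    for task, estimate, actual in records:
        ratio = actual / estimate
        ratios.append(ratio)
        print(f"{task}: estimated {estimate}d, took {actual}d "
              f"({ratio:.0%} of estimate)")
    average = sum(ratios) / len(ratios)
    print(f"On average, work takes {average:.0%} of your estimate.")

calibration_report(history)
```

Reviewed regularly, even a crude report like this provides the kind of calibrating feedback that bridge players and weather forecasters receive automatically.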

Confirmation Bias

You have arrived in Lisbon, Portugal for a vacation. While walking along the street you hear quite a bit of English being spoken. Because of this, you start to believe that at least half of Lisbon's population speaks English. However, your assessment is incorrect: you simply pay more attention to English-speaking people than to the others on the street. This effect is called selective perception, or “I see what I want to see.”

One manifestation of selective perception is confirmation bias. We cannot know what Lehman Brothers' CEO Richard Fuld was thinking when he steered his company into a program of risky securities investments that originated in subprime mortgages. But he may already have had the preconceived idea that an investment in securities derived from subprime mortgages was a profitable and sound choice, so he would have tended to dismiss evidence that these investments were too risky. In particular, he did not listen to his employees who warned against this strategy. At the same time, he may have put too much weight on evidence that appeared to support his position; for example, the fact that other financial institutions were involved in similar investments confirmed his theory.

Confirmation bias can lead to frustrating consequences. For example, confirmation bias is one of the reasons why people are obsessed with conspiracy theories. Did men actually land on the moon? There are those who point to evidence that it did not happen. Examine this picture of Apollo 11 (Exhibit 3). There are no apparent blast craters or any sign of dust scatter in the 16 mm movies of the landing. Conspiracy theorists believe that this confirms their suspicions: the movies and images of lunar landings were staged in a sound studio located in a secret government facility, similar to the manner in which the war with Albania in the movie “Wag the Dog” was shot in Hollywood. In reality, because of the way a lunar module operates, it does not create a blast crater. If you collect more such “evidence” while ignoring the vast amount of evidence that men actually did walk on the Moon, you are probably a hard-core conspiracy theorist.

Exhibit 3 – Lunar Module

Another detrimental result of confirmation bias is that it can be exploited: if you manage to derail your project, try coming up with a conspiracy theory that plays to the preconceptions of your managers. Point to evidence that suggests malfeasance on the part of your competitors, previous management, or poor plan alignment if one of the managers has recently mentioned it. With persistence, you should be able to convince management that the issues with the project are not your fault, even if there is a lot of evidence to the contrary.

Confirmation bias is one of the reasons that people do not perform a proper analysis. Why go to all the additional effort of analyzing a situation if you already believe that investing in subprime derivatives is the way to go?

Optimism Bias

Are you an optimist or a pessimist when you are considering the possible consequences of your project plan? Psychological research shows that most people are over-optimistic about the outcomes of planned actions (Armor & Taylor, 2002). This is called the optimism bias, or the planning fallacy. For example:

  • Second-year MBA students overestimated the number of job offers they would receive and their starting salary.
  • Most smokers believe they are less at risk of developing smoking-related diseases than others who smoke.
  • Most newlyweds in a US study expected their marriage to last a lifetime, despite being aware of the divorce statistics.

In project management, optimism bias affects estimates of many things. For example, professional cost estimators consistently underestimate the costs of their projects. Here is another problem: even as a project nears a deadline and cannot realistically be completed on time, optimism bias pushes managers to report that the project will be completed as planned.

We are not implying that optimism is a bad thing. Most of mankind's greatest achievements were entirely dependent upon someone's abundant optimism that they could overcome insurmountable odds. Without optimism, there would be no persistence. Who would risk starting a new business or getting married without it? At the same time, optimism bias can lead to major blunders, much in the same manner that overconfidence or confirmation bias can lead to a lack of analysis. Optimism bias also manifests itself when we underestimate project cost and duration or overestimate available resources. Perhaps Richard Fuld was overly optimistic about the outcome of investments in securities backed by subprime mortgages.

Optimism bias is a mental error that is very hard to overcome. For example, each time we go on vacation we spend significantly more than we planned, regardless of whether we know about this bias or not.

Analysis is Not Trivial

What causes more greenhouse gases: using paper towels or hand dryers? It is extremely difficult to tell conclusively, though some have tried. What are all the factors that we would have to take into account? Greenhouse gases are emitted during the production of the paper, the electricity, and the hand dryer itself. How much depends on different conditions: there are different types of paper and hand dryers, electricity can be produced from different sources, and human hands come in many different sizes. How were the towels or dryers transported to their current location? We have just barely scratched the surface, and already this question has become quite complex.

Since analysis can be very complex, it creates an opportunity for intentional or unintentional misinterpretation. Here is another example. The Canadian company TransCanada is planning to build the Keystone Pipeline System from Alberta to Oklahoma and then on to Texas to bring Canadian oil to U.S. refineries (TransCanada, 2012). The cost of the project is estimated to be approximately $7 billion. Environmentalists are strongly opposed to this project for two main reasons. First, environmental groups are trying to curtail further development of the Canadian oil sands, which they consider particularly “dirty,” as these projects have a higher level of greenhouse gas emissions for each unit of energy produced. Second, the pipeline is supposed to cross the environmentally sensitive Sandhills region and the huge Ogallala Aquifer, the prime source of drinking and irrigation water for Nebraska and surrounding states. Environmentalists organized massive protests against the project, including staging a demonstration near the White House. Eventually, in 2011, the U.S. government decided to postpone the project to select a better route, even though its own regulatory authority had approved the original route.

Is this project really bad for the environment? Potentially, yes. There is always a risk that a pipeline can leak oil into the aquifer; oil spills from pipelines happen on a regular basis in the U.S. and Canada. On the other hand, TransCanada has pointed out that there are currently 200,000 miles (320,000 km) of pipelines in the U.S. and that Keystone is proposed to be the most technologically advanced pipeline ever built. Moreover, if the pipeline uses a longer route, the chance of a leak or spill increases. And in the absence of an oil pipeline, producers will use alternate means to ship their oil, which will increase the risk of environmental damage, as pipelines are significantly less risky than any of the alternatives. So it looks like the efforts of environmentalists may be counterproductive and lead to more environmental problems. The question is, what is the truth? Who is right: the environmentalists or the oil industry? Without undertaking a very complex analysis, people on both sides will make multiple mental errors. Environmentalists tend to rely on vivid descriptions of potential leaks, often told by celebrity spokespeople who do not have access to any reliable data. On the other side, we have extremely confident (overconfident) industry experts who are convinced that the planned safety measures will be sufficient.

Therefore, you hear all sorts of different assessments that are often contradictory, even though they all seem to be coming from reputable sources. The main problem is that even a very advanced analysis of a project can give misleading results. That is why so many people, including perhaps Richard Fuld, do not believe it is worth the effort.

Processes Versus Mental Errors

The number of doctors per capita in Russia is significantly higher than in the US: 4.25 per 1,000 people vs. 2.3 per 1,000 people, based on 2002-2003 data (Nationmaster, 2010). In most cases, Russian doctors are as qualified as physicians in Western Europe and North America. At the same time, the quality of health services in Russia is significantly lower than in these countries. There are many reasons for this difference: the relative lack of equipment and medicine is certainly a major factor. But perhaps a more fundamental explanation is the absence or poor implementation of standard medical processes. For example, after cleaning a floor, a nurse may go directly to assist with the delivery of a baby without first washing her hands, or a doctor may perform surgery after a night of heavy drinking. The fundamental reason for these problems, as we have already learned, is mental errors. The doctor is under the illusion that he can successfully remove an appendix despite his hangover. Because of mental errors, the doctor makes a poor decision.

There are processes that could mitigate this and other similar situations. For our example, a campaign could be conducted to educate surgeons that, contrary to their own beliefs, drinking a bottle of vodka prior to performing surgery will not improve their performance, and is detrimental not only to their own health, but to that of their patients. Hospitals could routinely institute sobriety checks, or require the surgery team to undergo a quick breathalyzer test before surgeries are performed. A process could be put in place to contact replacement surgeons if they are required, and so on.

The PMBOK® Guide is an accumulation of the experience of hundreds of project managers and defines the most important project management processes. If these processes are followed, the performance of the organization should improve significantly. The problem is that implementing and maintaining these processes is hard work.

What Is Choice Engineering?

Minnesota tax officials conducted the following experiment. Groups of taxpayers were given four kinds of information:

  • Group 1 was told that their taxes would go towards paying for services that they generally approved of: education, policing, etc.
  • Group 2 was threatened with punishment for non-compliance with the tax system.
  • Group 3 was provided information on how they could find assistance for filling out their tax forms.
  • Group 4 was told that 90% of Minnesotans had already properly completed their tax returns.

So which group was most likely to submit a correct tax return on time? If you answered Group 4, you are correct. As it turned out, the other interventions had little or no impact on tax compliance. This study points out that people are more likely to follow certain rules if they believe that other people are following them as well. Providing the information that most people were complying with the tax system essentially created an environment in which people made better choices. Without instituting a strict process or threatening penalties, people were encouraged to make good choices themselves; the intervention helped to steer them towards a better choice without restricting or eliminating their freedom of choice. The original idea was suggested by Richard H. Thaler and Cass R. Sunstein (Thaler & Sunstein, 2008).

One of the simplest examples of choice engineering is a checklist. Commonly, when you need to fill out a number of related forms, you are also given a checklist that allows you to verify that you have filled out and included all of the required forms. You can use the checklist or ignore it at your own peril, but most of us will choose to refer to it. Alternatively, you could choose simply to penalize people who fail to complete the forms properly. You may get compliance, but it would be short-lived and resentfully given. It is much simpler and more effective to provide a simple checklist.

The same idea applies to project risks. If your organization is relatively small and uncomplicated, you might provide a simple list of common risks rather than a comprehensive risk management system. There will still be a process, but with much less strict rules. For example, this risk list might appear each time shipment information is entered into the computer system, which will encourage your employees to think about the risks and, hopefully, address them on a regular basis (a minimal sketch of this appears below). Most importantly, you do not mandate the use of this risk list. You create an environment in which people can use the risk list in an easy and relatively unobtrusive manner by applying a very few simple rules.
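
What might this look like in practice? Here is a minimal sketch, assuming a hypothetical shipment-entry workflow; the function names, risks, and prompt are illustrative rather than a prescribed design.

```python
# A sketch of an unobtrusive risk checklist hooked into a hypothetical
# shipment-entry workflow. Nothing here is mandatory: the list is shown
# at the moment a choice is being made, and the user may skip it.
# All names and risks are illustrative.

COMMON_RISKS = [
    "Supplier has not been verified against the approved-vendor list",
    "Shipment route passes through a region with known counterfeiting activity",
    "Packaging or labeling differs from the reference samples",
]

def save_shipment(shipment_id: str) -> None:
    pass  # placeholder: write the record to the real shipment system

def log_review(shipment_id: str) -> None:
    pass  # placeholder: append to a lightweight review log for later audits

def enter_shipment(shipment_id: str, interactive: bool = True) -> None:
    """Record a shipment and, as a nudge, display the common-risk list."""
    save_shipment(shipment_id)  # the normal, required step
    if not interactive:
        return
    print(f"Shipment {shipment_id} recorded. Common risks to consider:")
    for i, risk in enumerate(COMMON_RISKS, start=1):
        print(f"  {i}. {risk}")
    # Skippable by design: choice engineering steers, it does not mandate.
    answer = input("Reviewed these risks? [y/N] ").strip().lower()
    if answer == "y":
        log_review(shipment_id)

if __name__ == "__main__":
    enter_shipment("SHP-1042")
```

The design choice is deliberate: the checklist interrupts nothing and punishes nothing; it simply appears at the moment a risk-relevant choice is being made.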

Policing vs. Choice Engineering

People make bad choices because they are affected by certain mental errors and are unable to correctly analyze situations. Both policing and choice engineering will help people make better choices. However, policing entails a significant restriction of choices, and freedom of choice is the main lubricant in society. Without freedom of choice, projects, technology, and society would gradually grind to a halt. Therefore, it is in our best interests to provide a framework that allows people freedom of choice while encouraging choices that are in their best interests.

For example, how can we minimize smoking? The government could try prohibition, but as experience has shown, this tends to foster black markets and criminal enterprises. In this example, policing would be a very unproductive approach, and most people understand this. Instead, governments and health organizations have turned to choice engineering. Choice engineering entails limiting smoking to specific areas, restricting tobacco advertising, increasing the price of tobacco products, and so on. People can still choose to smoke, but the cumulative effect is an environment that is not very supportive of smoking. Most importantly, choice engineering must be structured around human psychology. For example, messages that are conveyed graphically have a greater effect on people's choices than verbal messages. Graphic messages are used extensively in Canada's anti-tobacco campaign: all cigarette packs include very graphic images of diseases caused by smoking – cancerous lungs, ulcerous sores, etc. Not pretty, but effective. Some other ways of fighting tobacco addiction happen to be less effective for psychological reasons. Therefore, choice engineering must be founded on a good knowledge and understanding of human decision-making.

Here is another advantage of choice engineering. The more rules we create, the more opportunities there are to break these rules. Since there are very few rules in choice engineering, there is a greater chance that these rules will be followed.

When Policing Is Necessary

In large projects, estimators and managers often consciously or unconsciously provide false estimates and other incorrect information to get projects approved or to advance some other agenda. In these situations (large and complex projects), it is important to set up a clear set of rules that must be followed.

For example, NASA and other national space agencies have defined multiple risk management processes for the International Space Station. Obviously, these processes are comprehensive and have rigorous controlling procedures. One of these processes is specifically designed to protect the space station against meteoroids and other orbiting debris. A document of more than 60 pages, prepared by a dedicated committee, outlines in detail how to mitigate the threat (collision warning and shielding), what to do if an impact has occurred, and most importantly, who is responsible for what (Committee, 1997). For projects like the International Space Station, policing is critical to success.

In each project, there is space for both policing and choice engineering. In large projects, where deception plays a significant role in poor decisions, policing should play a major role, as it would be difficult to eliminate deception through choice engineering alone. At the same time, in smaller projects, the role of mental errors in creating bad choices is much more prevalent, and choice engineering is an effective tool for steering people towards better choices.

Prior to 2008, most governments failed to regulate some of the more complex activities associated with securities, particularly derivatives and their trading. Regulators relied mostly on choice engineering: they thought that the market would provide the corrective mechanisms to punish poor choices. After the financial crisis of 2008-2009, more rules were introduced, and more are still being considered: the ratio of policing to choice engineering in this area is continuously shifting towards policing. At the same time, a significant amount of freedom of choice for financial managers still remains. Completely removing this freedom of choice would be the equivalent of moving to a socialist or centrally managed economy.

A Few Ideas Regarding Choice Engineering

Here are a few simple things you can do in your project to establish choice engineering:

Checklists and templates

These are the simplest tools for choice engineering. No complicated procedures: people just check a few boxes to ensure that they have not forgotten anything. If you want people to follow a risk management process, don't ask them to memorize Chapter 11 of the PMBOK® Guide. Just provide a risk identification template with a few predefined standard risks (an example sketch follows). Limit paperwork. A rule of thumb: as much as needed, as little as possible.
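
As an illustration, such a template could be as small as the sketch below; the predefined risks and fields are examples only, not a prescribed standard.

```python
# A sketch of a lightweight risk identification template. The team
# fills in only the blank fields; the point is to prompt thinking,
# not to enforce a heavyweight procedure. All entries are illustrative.

RISK_TEMPLATE = [
    {"risk": "Key resource becomes unavailable",
     "likelihood": "medium", "response": "", "owner": ""},
    {"risk": "Scope change requested after design freeze",
     "likelihood": "high", "response": "", "owner": ""},
    {"risk": "Vendor delivery slips past a milestone",
     "likelihood": "medium", "response": "", "owner": ""},
]

def unaddressed(template):
    """List predefined risks that nobody has claimed yet."""
    return [entry["risk"] for entry in template if not entry["owner"]]

# A gentle reminder, not an enforcement step:
for risk in unaddressed(RISK_TEMPLATE):
    print(f"Still unowned: {risk}")
```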

Full disclosure

Perhaps you have been asked to participate in the development of a new software product that is estimated to take one year to commercialize. You have a few questions. Who will buy it? How much will it be sold for? How will potential clients use it, and what are the proposed benefits? You discover that no one on the development team knows much about it. Management probably knows the answers, but has not passed that information on to the development team, assuming that they don't need to know it; it is not critical to the performance of their job. Would you take on this project? Would you enjoy working in this environment?

Without this information, a key motivational factor is missing: why are we doing this project, and what value does it represent to our clients? Without this essential knowledge, you might as well be digging one hole to fill another. The thought is not inspiring. Therefore, it is incumbent upon management to tell, or “disclose” to, their project team as much information as possible. Further, they should always take steps to ensure that this information reaches, and is understood by, the project team. Often, organizations simply dump information onto the company website at some obscure URL and then claim that everything has been provided. Full disclosure is a type of choice engineering that addresses many problems with projects, particularly the disengagement of project team members who do not feel any ownership of the end product and just work from 8 to 5. Full disclosure pushes people towards better choices without enforcing them.

Auditing and indirect control

Choice engineering does not mean that there are no processes or controls. The reality is that for many projects there is no practical or cost-effective way to implement comprehensive controls on project management procedures. Instead, choice engineering relies on audits and indirect control of project management processes. At issue is the fact that if people manage to make irrational choices (e.g., take unnecessary risks) and get away with it, they may continue this behavior in the future. Audits can prevent this effect. Audits and indirect controls performed on a regular basis will help to ensure that people continue to make rational choices. For example, as a project manager you need to ensure that your team follows the project plan. Regardless of whether you use policing or choice engineering, following the project plan is necessary.

You could ask your team members to enter their daily project-related activities into some sort of project management system. This would be policing and might be perceived as being a bit heavy-handed. Alternatively, you can schedule regular project status meetings in an informal environment where everybody briefly reports on their recent activities and progress. These meetings encourage people to think about their answers and, as a result, they will try to align their activities with the project plan. This occurs not because there is an explicit procedure to stick to the project plan, but because there is an environment that motivates people to perform congruently with the project plan.

Competition

Competition between different organizations helps to create better products. Competition between different project teams and even between different groups or people within a project team will foster an environment that pushes people to do more analysis, which leads to better choices, and eventually helps them find better solutions.

Here is a true story. Two crews were constructing two parallel subway tunnels in Minsk, the capital of Belarus. They started at the same time, and both tunnels were identical. After a month or two, it was discovered that one crew was constructing its tunnel faster than the other. The technology and machines were the same, the crews had similar qualifications and experience, and they were using the same materials. If only one tunnel had been constructed, nobody would likely have bothered with any detailed analysis of productivity. But in this case the question was raised: why? After some investigation, it was determined that the core reason for the difference in construction speed was not any objective factor, but poor project management, and particularly the human management practices used by one of the crews.

Make process a habit

At one time, the Greyhound Bus Line safety record was not perfect: its number of accidents was higher than its competitors'. To fix this problem, Greyhound introduced very strict safety procedures. These were very hard and costly to establish, follow, and monitor. For example, all employees, including office administrators, had to submit a fixed number of safety observations or warnings to the corporate risk management system every week. Naturally, people complained, as hazardous situations were relatively infrequent and almost non-existent for office workers. To meet quotas, people started making frivolous entries in the system: one wrote that if a person enters a kitchen too quickly or without warning, someone might accidentally hit them by opening the refrigerator door. Another noted the hazard of people colliding with one another when turning the corner of their office corridor. After a month or two, management dropped the requirement and submissions became voluntary. The upside was that people were now conditioned to know that, if a real hazardous situation presented itself, they could use the system to immediately report it. In this case, a small “policing” exercise helped to create a habit: policing was transformed into choice engineering.

Education

Project management education and training with a focus on decision analysis and human psychology is an important choice engineering tool. Say you decide to cheat on your taxes (just a little bit). The reason you think you will be fine is that you do not know how many people have actually been caught. This is a very common mental error in which people make choices based on incorrect assumptions or incomplete information. Instead of checking statistics regarding the rate at which tax evaders are discovered, you rely on your gut feelings or intuition. In reality, tax authorities have quite a good record of discovering tax evaders, including small ones. Learning about the different types of mental errors will help to minimize the chance that you fall subject to them.

Conclusions

Structured analysis of a situation helps people to overcome mental errors and can improve their judgment. However, more likely than not, prior to making a decision people have not performed any structured analysis, or they misinterpret the results of the analysis. Complicating matters, the analysis is sometimes extremely complex and its results may be incorrect. Even if the analysis is performed and is correct, people often do not realize its value. As a result, even now, with highly trained experts who have access to powerful computers running the most advanced mathematical models, we still bear witness to the outcomes of many poor-quality decisions.

As we have shown, people often make poor choices because of mental errors. At the same time, they don't perform the analysis that would improve their decisions because of other mental errors to which they are subject. Is there a solution to this problem? Establishing effective processes is always considered a good way to improve project management. For example, if a project manager follows mandatory guidelines for time, scope, cost, risk management, and other knowledge areas, this should improve the quality of the decisions made during the execution of the project and reduce the chance of failure. But such processes are hard to implement, often expensive, and followed grudgingly (if at all) by some team members once they have been introduced. In many cases, especially for smaller projects, it would be more beneficial to create an environment within which people are encouraged, of their own volition, to make better choices, rather than to mandate these choices. This is called choice engineering.

Armor, D. A., & Taylor, S. E. (2002). When predictions fail: The dilemma of unrealistic optimism. In T. Gilovich, D. Griffin, & D. Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgment. Cambridge, UK: Cambridge University Press.

Committee on International Space Station Meteoroid/Debris Risk Management. (1997). Protecting the Space Station from Meteoroids and Orbiting Debris. Washington, DC: The National Academies Press.

Ermakova, M. (2007, October). Transcendental “Russia.” Rossiyskaya Gazeta.

Lichtenstein, S., & Fischhoff, B. (1977). Do those who know more also know more about how much they know? Organizational Behavior and Human Performance, 20, 159-183.
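
McDonald, L. G., & Robinson, P. (2009). A colossal failure of common sense: The inside story of the collapse of Lehman Brothers. New York: Crown Business.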

Nationmaster. (2010). Health Statistics. Available at: http://www.nationmaster.com/graph/hea_phy_per_1000_peo-physicians-per-1-000-people.

Plous, S. (1993). The psychology of judgment and decision making. New York: McGraw-Hill.

Project Management Institute. (2013). A guide to the project management body of knowledge (PMBOK® guide) (5th ed.). Newtown Square, PA: Project Management Institute.

Thaler, R. H., & Sunstein, C. R. (2008). Nudge: Improving decisions about health, wealth, and happiness. New Haven, CT: Yale University Press.

TransCanada. (2012). Keystone Pipeline System. Available at http://www.transcanada.com/oil-pipelines-projects.html.

Virine, L., & Trumper, M. (2013). ProjectThink: Why good managers make poor project choices. Farnham, UK: Gower Publishing.

Virine, L., & Trumper, M. (2007). Project decisions: The art and science. Vienna, VA: Management Concepts.

Wessel, D. (2009). In Fed we trust: Ben Bernanke's war on the great panic. New York: Crown Business.

© 2013, Lev Virine
Originally published as a part of 2013 PMI Global Congress Proceedings – New Orleans, Louisiana
