I think...I thought...what was I thinking?
Assumptions that drive decision-making
Everyone experiences the “what was I thinking” moment at some point in life—that sinking realization that our decision-making skills have let us down; that we did not pick the best course of action. It’s that flash of insight we wish we had before we made a crucial decision or pursued a specific course of action.
Using real-life examples, this engaging presentation highlights the neuroscience behind mental blind spots and the development of thought patterns. It addresses both the benefits and the dangers inherent in intuitive decision-making, outlines a five-step reflective learning process, and suggests ways in which project leaders can overcome assumptions and avoid overlooking the obvious through non-confrontational questioning techniques.
Project annals, corporate history books, and biographies of famous people are replete with examples of decisions that, in hindsight, were obviously wrong. In many instances, the decision-makers themselves might figuratively (if not literally) smack themselves on the head as they berate their actions with those all-too-terrifying words: "What was I thinking?"
Take Marcus Einfeld, the retired Australian Federal Court judge who spent two years in prison for perjury after trying to avoid paying a A$77 speeding ticket. Or Zinedine Zidane, the famous French soccer captain whose superb 15-year playing career ended with a "brain snap" that saw him sent off for head-butting an opponent minutes from the end of what should have been his crowning glory, the 2006 World Cup Final. Or the Motorola management team that ignored internally generated data showing that even if the company captured the entire global market for international business calls from developing countries, it would still be unable to cover the costs of its Iridium satellite telephone project; the project failed within a year and the organization lost nearly US$5 billion. Or the Xerox executives who, despite warnings from their staff, allowed 24-year-old computer maker Steve Jobs free access to study their newly developed "toy," the graphical user interface, thereby losing control of one of the most powerful technological innovations of the day.
What were they thinking?
Yet, what happened to them is repeated in different ways by countless people each day, impacting individual lives, projects, and even organizations around the globe. We have an all-too-human capacity for making decisions that we know to be wrong or discover to be wrong after the fact (sadly, in many instances we also refuse to accept that we could have been guilty of wrong thinking, and attempt to justify or rationalize our decision despite the evidence to the contrary).
This paper presents reasons why this occurs and suggests techniques leaders can use to lessen the risks of having to utter the words: “What was I thinking?”
Assumptions and Intuition in the Project and Corporate Worlds
Assumptions and the Project Manager
In late 2010, the author conducted a review of a project in which an organization sought to replace its hard-to-support, bespoke SCADA system with an off-the-shelf product (which, nevertheless, "must work exactly the way our old system does!"). Specifications were defined, a vendor was selected by tender, and for the next 10 months the project appeared to be proceeding fairly smoothly. Halfway through the factory acceptance testing phase, over 500 issues had been logged and further testing was put on hold. Amid the recriminations, finger-pointing, and arguing over what constituted a bug versus an incorrectly defined specification, the client requested a project review and prepared a business case variation to add 15 months and several million dollars to the project.
Among other things, the review uncovered numerous examples of costly assumptions and ambiguities in the tender specifications and vendor responses. For example, the vendor had agreed that its system contained a built-in alarm capability. Yet, when testing this feature, the client’s response was: “That’s not an alarm; this is an alarm,” and then outlined what amounted to several weeks and tens of thousands of dollars of rework for that one misunderstanding.
In all fairness, the vendor and client could be excused for assuming they understood the meaning of the word "alarm." Questioning the meaning of every word in every statement is neither practical nor possible; certain assumptions simply have to be made, and yet it is often these assumptions that come back to haunt us. The trick, then, is in knowing which words to question, clarify, or specify, and when to do so.
This particular review also concluded that signs of poor assumptions were known to the project manager within six weeks of the vendor commencing work. In a fortnightly status update he reported: "We have slipped by two weeks because of technical misunderstandings, but we will make up the time later." From then on, he regularly reported further slippages, followed by assurances that the time would be recovered in the future, and not once did the project board challenge him on his statements.
The assumptions and poor decisions made on this particular project led to significant frustration all around, substantially delayed the project, and cost the organization millions of additional dollars. Fortunately, the project did eventually complete and deliver benefits. That is not always the case.
The Impact of Leadership Assumptions on Corporate Decisions
It is easy to ask in hindsight how the leaders of Enron, WorldCom, Lehman Brothers, and numerous others could have made decisions that led their organizations into extinction. What were they thinking? Why could they not see the inherent dangers confronting them? Why do smart leaders suddenly make bad decisions and then compound the problem by making even worse decisions? And why do they seem to repeat the same mistakes over and over?
In a major study of 51 corporate failures, Finkelstein (2004) identifies four patterns of destructive behavior that affect executives and lead to the collapse of their organizations:
- Flawed mindsets that distort an organization’s perception of reality, often because an organization becomes so focused on one “magic answer” or “holy grail” that it loses sight of what is happening around it. A knowledgeable observer of Motorola’s Iridium project described it as “... an issue of fantasy, of hysteria,” rather than one of strategy (Finkelstein, 2004, p. 152). While the executives could probably expense the US$3,000 phone and the US$8-per-minute call charge, their reality was not that of the majority of the 500,000 users they would need to find to make their system feasible.
- Delusional attitudes that keep this inaccurate reality in place. Collins (2009) adds the observation that success breeds arrogance and complacency at both the organizational and individual levels: leaders stop asking questions and believe their own hype and propaganda. Kahneman and Klein (2009) note that we leap to conclusions and rarely question our initial conclusions.
- A breakdown in the communication systems developed to handle potentially urgent information.
- Leadership qualities that prevent executives from correcting their course.
Not surprisingly, the study revealed that failures were most likely to occur during one of four primary transition stages in an organization’s life cycle, all of which fall into the realm of complex project management: the creation of new business ventures, dealing with innovation or change, managing mergers or acquisitions, and addressing new competitive pressures.
Although each of the organizations had its own unique issues leading up to its collapse, a common thread underpinning them is the inability of executives to examine their own assumptions and mental blind spots. It is tempting to rationalize these errors as examples of stupidity, incompetence, or a lack of effort or foresight, if not of greed and deliberate fraud, or even a lack of organizational commitment to provide the resources necessary to pursue a specific strategic direction. Yet the leaders of all these organizations were bright, competent people; they simply succumbed to the illusions that they knew enough, were smart enough, and would get away with it.
Assumptions, Intuition, and Learning
Assumptions are things we believe, either consciously or subconsciously, to be true, without requiring any supporting evidence. Project scope documents frequently contain a section labeled “Assumptions,” which lists the conscious decisions taken by stakeholders to treat unknowns as knowns in order to move the project forward. The team can then either seek verification of each assumption as more information becomes available (coupled with amendments to the scope or plan), or manage the risk of having made incorrect assumptions.
However, more commonly, our assumptions fit into the realm of “unknown unknowns,” i.e., the assumptions we make without even being aware of having done so. These assumptions develop as part and parcel of the learning process we undergo as our brain seeks to make sense of the world around us.
The Learning Cycle and Thought Patterns
The human brain is continually bombarded with thousands of internal and external sensory signals that require processing. It copes by scanning for patterns and known objects, what Campbell, Whitehead, and Finkelstein (2009) refer to as “pattern recognition,” a complex process that integrates information from up to 30 different parts of the brain to ascribe meaning to the input being received. Our brain makes assumptions based on prior experiences and judgments, and couples these with “emotional tags” associated with each memory. These mental patterns and their related emotions enable us to rapidly, even intuitively, assimilate information, ascribe meaning to it, and make decisions.
As outlined in previous papers (Oschadleus, 2007, 2008, 2009), these patterns are developed through constant repetition, which creates connections among the roughly 100 billion neurons in the human brain. The more frequently a particular series of synapses fires together, the more entrenched that neural path becomes, and the more likely that mental map is to be used in future information processing.
Learning a new skill progresses through four stages, as illustrated in Exhibit 1.
[Note: this model was popularized by Gordon Training International in the 1970s, but its precise origins are unknown.]
While the goal of learning is to achieve the fourth stage, it is also the area of danger since this is the realm where our lack of conscious thought leads to assumptions and the potential for re-entering the zone of unconscious incompetence.
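The cycle described above, including the risk of looping back from the fourth stage to the first, can be sketched in a few lines. This is a toy illustration only; the stage names follow the standard four-stage competence model referenced in Exhibit 1.

```python
# The standard four-stage competence model, with the loop-back risk:
# unexamined assumptions at stage four can return us to stage one.

STAGES = [
    "unconscious incompetence",  # unaware of what we do not know
    "conscious incompetence",    # aware of the gap in our skills
    "conscious competence",      # capable, but only with deliberate effort
    "unconscious competence",    # capable without conscious thought
]

def next_stage(stage: str) -> str:
    """Advance one step through the learning cycle, wrapping back to stage one."""
    i = STAGES.index(stage)
    return STAGES[(i + 1) % len(STAGES)]

print(next_stage("conscious competence"))    # unconscious competence
print(next_stage("unconscious competence"))  # unconscious incompetence
```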
The Inevitability of Assumptions in the Learning Process
The conscious mind is limited to energy-intensive serial processing, and consequently seeks to relegate as much thinking as possible to the more powerful unconscious mind. It does so by subconsciously comparing incoming data with known objects. In its search for patterns and congruence, the brain will generalize, distort, or even delete incoming signals.
As children, our parents point out objects and get us to repeat the label, such as chair. Just when we think we understand chair, we are introduced to a different shape and are told this is also a chair, requiring us to adapt our mental model of what constitutes a chair. Quite literally, this requires the formation of new synaptic connections. This happens many times over the next few years, and the definition of chair evolves as children continue asking the countless “Why...?” questions that drive their parents crazy. Eventually children learn to classify objects as belonging to the category chair based on form and functionality, without having to consciously think about the definition of a chair. In fact, many adults would be hard-pressed to define the term clearly and unambiguously, and it is because of this intuitive, implicit knowledge that we often find it difficult to respond to all those “why” questions.
The more well-established specific patterns or mental models are, the better equipped we are to apply intuition to certain types of decision – and the less likely we are to challenge incorrect assumptions.
The Benefits of Intuitive Decision-Making
In his bestseller, Blink: The Power of Thinking without Thinking, Malcolm Gladwell (2006) highlights the power of these mental patterns, which enable what he terms “rapid cognition,” or intuition: the subconscious mind can process reams of data in milliseconds and arrive at a conclusion that matches, if not surpasses, the quality of more cautious and deliberate rational thinking. He provides numerous examples where this gut feel has served leaders well in a broad range of situations: firefighters who just know a building is about to implode; art experts who instinctively feel a US$10 million sculpture is fake despite the certification delivered by dozens of experts over a 14-month period; a marriage analyst who can tell within minutes whether a couple will stay together.
Gladwell draws on research undertaken by psychologist Gary Klein (1998), who spent twenty years studying people operating under extreme pressure (intensive-care staff, Blackhawk helicopter pilots, firemen, M-1 tank commanders and others) and whose life-and-death decisions were made instantaneously, without the benefit of time to analyze options and alternatives. He asked them to explain how they made their decisions, but the typical answers he received were “experience” and “We just knew.” He discovered that the more expertise people gain, the more well developed their mental models become in that area, resulting in the ability to subconsciously perceive patterns and make intuitive decisions without always knowing the reasons why. They have a strong gut feel for what works and what does not. For example, in a series of studies with firefighters he realized they accumulate a storehouse of experiences over time, and subconsciously categorize fires according to how they should react to them. They create one mental catalog for fires that call for a search and rescue and another one for fires that require an interior attack. Then they race through their memories in a rapid search to find a prototypical fire that resembles the fire that they are confronting. As soon as they recognize the right match, they swing into action.
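Klein’s description of firefighters racing through a mental catalog for a prototypical fire can be illustrated with a minimal sketch: match the cues of a new situation against stored prototypes and act on the closest fit. The cue names, categories, and similarity measure below are illustrative assumptions, not Klein’s actual formalism.

```python
# Minimal sketch of recognition-primed decision-making: match the cues of a
# new situation against a stored catalog of prototypes and act on the best
# match. Cues and categories are hypothetical, for illustration only.

def closest_prototype(situation, prototypes):
    """Return the stored prototype whose cue set best overlaps the situation."""
    def overlap(proto_cues):
        # Jaccard similarity: shared cues relative to all cues seen
        return len(situation & proto_cues) / len(situation | proto_cues)
    return max(prototypes, key=lambda p: overlap(p["cues"]))

prototypes = [
    {"label": "search-and-rescue fire",
     "cues": {"smoke", "occupants", "stable structure"},
     "action": "search and rescue"},
    {"label": "interior-attack fire",
     "cues": {"smoke", "contained", "stable structure"},
     "action": "interior attack"},
]

match = closest_prototype({"smoke", "occupants", "stable structure"}, prototypes)
print(match["action"])  # search and rescue
```

The point of the sketch is the shape of the process, not the arithmetic: recognition replaces option-by-option analysis, which is why it is fast and why poorly stocked or mis-tagged catalogs produce confident but wrong decisions.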
In other words, according to Klein, intuition is really a matter of learning how to see cues or patterns that ultimately show you what to do. He believes that formal analytical decision-making frameworks such as STAR (Stop, Think, Analyze, Respond), which may prove extremely useful to novices who need help in thinking their way through a problem, can slow down experienced decision-makers.
However, there is a corollary: this level of expertise does not simply occur. The best decision-makers Klein has seen are wildland firefighters, who are force-fed a constant diet of forest fires. They fight fires 12 months a year (in the western United States during the summer and in Australia and New Zealand during the winter) and rapidly build a base of experience. And, he adds, they are relentless about learning from that experience. After every major fire, the command team runs a feedback session, reviews its performance, and seeks out new lessons; all team members reflect deeply on the experience and what they have learned. It is equally important, he believes, that the people at the top all started at the bottom: they have experienced the fires and the exhaustion, and consequently build trust and confidence all the way down the line. In short, experience alone is insufficient to develop these fine intuitive skills.
This assertion is extended in the findings of Ericsson, Prietula, and Cokely (2007), who argue that becoming an expert in a particular field requires years of “deliberate practice” (i.e., practice that focuses on tasks beyond our current level of competence and comfort, as opposed to simply practicing what we are already good at). This finding became the basis of the “10,000 Hour Rule” (it takes roughly 10,000 hours of practice to achieve expertise in any field) popularized in Gladwell’s (2008) Outliers and Kuper and Szymanski’s (2009) Soccernomics. Winston’s (2003) analysis of the functioning of the brain illustrates similar principles of how constant practice and reflection enable top cricket and tennis athletes to respond intuitively to their opponents within a fraction of a second.
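The “10,000 Hour Rule” reduces to simple arithmetic, and a quick sketch makes the scale of the commitment concrete. The figure of 50 practice weeks per year is an illustrative assumption, not from the cited sources.

```python
# Back-of-envelope arithmetic for the "10,000 Hour Rule": how many years of
# deliberate practice a given weekly commitment implies.
# weeks_per_year = 50 is an illustrative assumption.

def years_to_expertise(hours_per_week, total_hours=10_000, weeks_per_year=50):
    return total_hours / (hours_per_week * weeks_per_year)

print(round(years_to_expertise(20), 1))  # 10.0 years at 20 hours per week
print(round(years_to_expertise(40), 1))  # 5.0 years at a punishing 40 hours per week
```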
Why Intuition Fails
Powerful as intuitive decision-making is, the examples cited earlier illustrate some of the risks of relying on it without more comprehensive thinking processes (that said, the rigor of analytical decision-making is only as good as the quality of the data used and the questions asked).
Ericsson et al. (2007) also note that supposed experts do not necessarily guarantee superior performance, and that it is possible for expertise to decline with experience! They cite physicians who lose the ability to diagnose unusual heart and lung conditions because they encounter them so rarely and thus forget the characteristic features. This finding has a significant implication for leaders who rely on their memory of how things were when they were in the trenches, and explains in part why they allow incorrect assumptions to cloud their judgment.
Chabris and Simons (2010, p. 231) explain: “What we intuitively accept and believe is derived from what we collectively assume and understand, and intuition influences our decisions automatically and without reflection.” They go on to outline six myths that distort our intuition:
- We pay attention to far more than we really do. In fact, their famous “invisible gorilla” experiment illustrates that we can easily overlook signals directly in front of our own eyes simply because we are not looking for them. They also cite several experiments in which we tend to see more of the things we’re looking for, thus distorting our perception of reality.
- Our memories are more detailed and robust than they really are. Among the many examples they cite is that of Hillary Clinton recalling a trip to war-torn Bosnia, in which she was adamant she had been fired upon, even though video footage showed her casually undertaking a “meet and greet” at the time.
- Confident people are competent people. People who seem well prepared and have all the answers must know what they are talking about – yet the most successful con artists are those whose confident demeanor allays concerns and inspires trust.
- We know more than we really do. In reality, we know less than we would like to believe. For example, we intuitively know what a chair is, but find it hard to define one. We may think we know how things work, but when we have to start explaining them we discover that our knowledge is built on all sorts of assumptions.
- Coincidences and correlations demonstrate causation. Our conscious minds tend to work in a linear fashion, and it is easy to assume that events occurring in sequence have a cause-and-effect relationship when no such relationship exists. Frequently, we also fail to see the real causes, which may have been at play over a period of years.
- Our brains have vast reserves of easily unlockable power. It is commonly stated that we use only 10% of our brains, but there is no means of verifying this “fact.”
Intuition or Analysis?
The question then is, when should leaders trust their gut instincts and intuition?
Leaders make decisions based on intuition, using the principles of pattern recognition and emotional tagging drawn from accumulated past experiences. When we inadvertently place a hand on a hot plate, our body jolts and withdraws the hand well before we have had time to think about an appropriate reaction.
Neuroscientists believe the brain works in a similar way when we make more leisurely decisions. Our judgments are initiated by the unconscious weighing of emotional tags associated with our memories rather than by the conscious weighing of rational pros and cons: we start to feel something—often even before we are conscious of having thought anything. We cannot get away from our instincts because they (Campbell & Whitehead, 2010):
- influence the way we frame a situation
- influence the options we choose to analyze
- cause us to consult some people and pay less attention to others
- encourage us to collect more data in one area but not in another
- influence the amount of time and effort we put into decisions
In short, our intuition infiltrates our decision-making even when we are trying to be analytical and rational. Chabris and Simons (2010) warn that first impressions are often locked in and become very difficult to dispel, and while our intuitions usually result in quick, effective decisions, they can be distorted by self-interest, emotional attachments, or misleading memories (Campbell, Whitehead, & Finkelstein, 2009).
Consequently, our focus should be on ensuring that we create sufficient safeguards and draw on appropriate experiences and emotions.
Techniques to Challenge Mental Models
Some 2,000 years ago, the Apostle Paul wrote a letter to the Roman church in which he admonished its members to no longer be conformed to the pattern of the world, but to be transformed by the renewing of their minds (Romans 12:2). Several hundred years earlier, King Solomon had stated: “As a man thinks in his heart, so is he” (Proverbs 23:7). Through the ages, numerous philosophers have pointed to the power of thinking to define the quality of life.
To be effective, we need to continually validate our assumptions and teach those around us to do the same. We need to stimulate in ourselves and others a curiosity to learn and improve, and to develop the people we engage with. And it starts with developing our own character, our relationships, and our ability to think.
The 5-Step Reflective Learning Cycle
Exhibit 1 presented a 4-stage learning cycle, which emerged in the early 1970s. That model requires an adaptation to reflect the need for continually challenging our own thinking through reflection and deliberate practice (Exhibit 2).
The process of reflection determines whether we merely repeat the same experience several times, becoming highly proficient at one behavior, or whether we learn from the experience in such a way that we are cognitively or affectively changed. While experience may serve as the stimulus for learning, reflection is the essential part of the process that makes it possible to learn from experience.
Reflection should be part of the daily routine of every manager, but finding the time for deep reflection is always a challenge for the busy executive.
The reflection process is something we also need to instill in our teams as part of the regular routine – at the end of project phases, but also at the end of every day. Learning to question our own thinking and that of our teams is a powerful first step – but it is not enough. We also need safeguards to ensure that we are asking ourselves, our employees, and our managers the right questions.
Questions and Safeguards
One of the most important questions facing leaders is when they should trust their gut instincts, an issue explored in a dialogue between Nobel laureate Daniel Kahneman and psychologist Gary Klein titled “Strategic decisions: When can you trust your gut?” published by McKinsey Quarterly in March 2010. Campbell and Whitehead’s (2010) work on flawed decisions suggests that leaders cannot prevent gut instinct from influencing their judgments; what they can do is identify situations where it is likely to be biased and then strengthen the decision process to reduce the resulting risk. They propose four tests:
- The familiarity test: Have we frequently experienced identical or similar situations in the past? Familiarity matters because our subconscious works on pattern recognition: the more appropriate memories we have to scan, the sounder our judgment is likely to be. If the situation is unfamiliar, our “intuition” may be little more than a guess. For situations too unfamiliar for our own experience to judge well, Klein (1998) proposes a “premortem”: before commencing a project, imagine that it has failed and work out the ways in which that failure could have occurred.
- The feedback test: Did we get reliable feedback in past situations? Past experience is only useful if we have learned the right lessons. Our decisions are tagged with a positive emotion when we make them, regardless of how an objective assessment would rate them. Consequently, if we do not get good feedback (because subordinates “protect” us from bad news or fear speaking their minds, or because we rationalize away feedback we consider evidently “wrong”), we never learn the truth about our decisions and are more likely to repeat the same mistakes. While not particularly popular with busy executives, keeping a personal journal in which we document experiences helps to improve our self-awareness.
- The measured-emotions test: Are the emotions we have experienced in similar or related situations measured? All memories come with emotional tags, but not all tags are equal. When situations remind us of highly charged emotions, we are at risk of clouding our judgment.
- The independence test: Are we likely to be influenced by any inappropriate personal interests or attachments? If we have personal interests or attachments in the decisions we are making, we should be cautious since our emotional tags will be unevenly distributed. Perceived or real conflicts of interest also wreak havoc with intuition.
The moment we find that a situation fails any one of these four tests, we need to step back and strengthen our decision-making process, using one of three common safeguards:
- Employing stronger governance, in the form of a boss, steering committee, or board who can overrule our decisions if necessary
- Seeking out additional experience and data (which could challenge assumptions, but would not protect us against self-interest)
- Increasing the level of dialogue and challenge around the decision.
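The four tests and three safeguards above amount to a pre-decision checklist, sketched below. The question wording is paraphrased and the pass/fail logic is an illustrative simplification, not a formal instrument.

```python
# Sketch of the four intuition tests as a pre-decision checklist.
# A "no" on any test flags the decision for stronger safeguards
# (governance, more data, more debate). Wording is paraphrased.

TESTS = {
    "familiarity": "Have we frequently experienced identical or similar situations?",
    "feedback": "Did we get reliable feedback on similar past decisions?",
    "measured_emotions": "Are the emotions tied to similar past situations measured?",
    "independence": "Are we free of personal interests or attachments here?",
}

def trust_your_gut(answers):
    """Return (ok, failed_tests): intuition is trustworthy only if all four tests pass."""
    failed = [name for name in TESTS if not answers.get(name, False)]
    return (not failed, failed)

ok, failed = trust_your_gut({"familiarity": True, "feedback": True,
                             "measured_emotions": False, "independence": True})
print(ok, failed)  # False ['measured_emotions']
```

The design choice worth noting is the asymmetry: a single failed test is enough to demand safeguards, because each test guards against a different way that pattern recognition and emotional tagging can mislead.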
Conclusion
This paper set out to uncover the causes of mental blind spots and assumptions, and the origins of poor decisions: those “what was I thinking?” moments in life we have all experienced. It identified the power of intuition to make rapid decisions through pattern recognition and emotional tagging, but also highlighted that these natural brain functions can be distorted by self-interest, emotions, and personal attachments, leading to faulty decisions, some of which have significant implications for others.
It is impossible to ignore intuition; it will always be part of the decisions we make. The key to successful decision-making, according to Chabris and Simons (2010, p. 235), is “knowing when to trust your intuition and when to be wary of it and do the hard work of thinking things through.”
The paper then outlined the need for reflective thinking at the personal, project and organizational level, and suggested a number of key questions that need to be asked whenever making a decision.
References
Ariely, D. (2009, July–August). The end of rational economics. Harvard Business Review, 87(7/8), 78–84.
Breen, B. (2000, August). What’s your intuition? Fast Company, 38. Retrieved from http://www.fastcompany.com/magazine/38/klein.html
Campbell, A., Whitehead, J., & Finkelstein, S. (2009, February). Why good leaders make bad decisions. Harvard Business Review, 87(2), 60–66.
Campbell, A., & Whitehead, J. (2010, May). How to test your decision-making instincts. McKinsey Quarterly. Retrieved from https://www.mckinseyquarterly.com/How_to_test_your_decision-making_instincts_2598
Chabris, C., & Simons, D. (2010). The invisible gorilla – and other ways our intuition deceives us. London: HarperCollins Publishers.
Collins, J. (2009). How the mighty fall – and why some companies never give in. London: Random House Business Books.
Dane, E., & Pratt, M. G. (2007). Exploring intuition in managerial decision-making. Academy of Management Review, 32(1), 33–54.
Ericsson, K. A., Prietula, M. J., & Cokely, E. T. (2007, July/August). The making of an expert. Harvard Business Review, 85(7/8), 114–121.
Finkelstein, S. (2004). Why smart executives fail – and what you can learn from their mistakes. New York: Portfolio.
Gladwell, M. (2006). Blink: The power of thinking without thinking. London: Penguin.
Gladwell, M. (2008). Outliers: The story of success. New York: Little, Brown and Company.
Kahneman, D., & Klein, G. (2009, September). Conditions for intuitive expertise: A failure to disagree. American Psychologist, 64(6), 515–526.
Klein, G. (1998). Sources of power: How people make decisions. Cambridge, MA: MIT Press.
Oschadleus, H.J. (2007, January). Increasing Influence Equity™ in Project Management. PMI Global Congress 2007, Hong Kong, China.
Oschadleus, H.J. (2008, March). Change that Endures. PMI Global Congress APAC 2008, Sydney, Australia.
Oschadleus, H.J. (2009, February). Conversational Leadership: A Communication Tool to Lead and Influence Organisations. PMI Global Congress APAC 2009, Kuala Lumpur, Malaysia.
Senge, P. (1990). The fifth discipline: The art and practice of the learning organisation, New York: Currency Doubleday.
Wilson, T. D. (2002). Strangers to ourselves: Discovering the adaptive unconscious. Cambridge, MA: Harvard University Press.
Winston, R. (2003). The human mind and how to make the most of it. London: Bantam Press.
© 2011, Jürgen Oschadleus
Originally published as a part of 2011 PMI Global Congress Proceedings – Dallas, TX, USA