Project Management Institute

Reframing perceptions

a practical guide for decision makers

Jürgen Oschadleus, Director, Act Knowledge Pty Ltd

Abstract

Expert communicators and negotiators understand the power of ‘framing’ statements in order to influence how ideas are perceived. Whether trying to sell a concept, convince an undecided stakeholder, or persuade an opponent of the merits of your case, success depends on your ability to overcome their objections and to help them see the issues from a different perspective.

As the literature on neuroscience and the psychology of influence continues to blossom, decision makers can easily become overwhelmed by the sheer volume of theory on offer. This guided design session cuts to the chase and provides participants with simple yet powerful techniques to reframe issues and help stakeholders gain a new perspective on the decisions they need to make. The session will briefly outline five principles of psychological influence and some examples of how they have been used in research; however, the primary focus of the session will be to provide participants with an opportunity to apply the principles to real-world situations faced by project managers.

The session will also include a few tips on dealing with stakeholders who attempt to frame issues in an unethical manner.

Introduction

“To frame is to select some aspects of a perceived reality and make them more salient in a communicating text, in such a way as to promote a particular problem definition, causal interpretation, moral evaluation, and/or treatment recommendation” (Entman, 1993, p. 52).

Persuasive communication is not only about what is said or left unsaid; it is also about how it is said. Salespeople, negotiators, and indeed everyone who makes a living from speaking, persuading, and influencing others understand the power of framing statements in a way that makes them more believable and actionable.

As we grow, we build a series of mental filters through biological and cultural influences. We then use these filters and mindsets to make sense of the world around us. This creates a series of frames that shape our perception of reality and influence the future choices and decisions we make.

Prospect theory, developed by Nobel laureate Daniel Kahneman and his colleague Amos Tversky in 1979, has shown that the frames in which choices or problems are presented have a significant impact on the decisions actually made. Their work has led to an explosion of scientific research and popular literature showing the power of frames to override many of the classic axioms of rational choice.

This paper and the accompanying guided design session address five such frames, all of which can be readily applied by project managers to help shape the perceptions and decisions of their stakeholders. The five were selected from the dozens of potential frames available because they do not require any changes in incentives, do not rely on threats or coercion, and are cost neutral. Rather, with a little thought and planning they can be employed in any project by any project manager.

This paper provides the theoretical background for the guided design session. The session itself will elaborate on the examples, and will provide participants with project-related scenarios against which to apply them.

Framing Perceptions — Influence or Manipulation?

‘As practitioners of project management, we are committed to doing what is right and honorable. We set high standards for ourselves and we aspire to meet these standards in all aspects of our lives—at work, at home, and in service to our profession’ (PMI Code of Ethics. Retrieved from http://www.pmi.org/About-Us/Ethics/Code-of-Ethics.aspx, 24 August 2012)

We must note at the outset that the principles and techniques presented in this paper can be used both ethically and unethically, to either influence or manipulate stakeholders. They can produce outcomes that enable leaders to deliver value to their projects and organizations, or they can be used to further a leader's own ends at the expense of the stakeholders with whom they interact.

The tools themselves are neutral; the impact they have reflects the intent of the person wielding them. It also bears repeating that influence equity™ is built by consistently demonstrating character and credibility, building relationships, and using logic that reflects the understanding of the listener (Oschadleus, 2007). This paper concentrates on the latter aspect: communication techniques that frame issues in a logical manner. In keeping with the PMI Code of Ethics, these tools and principles are presented with the intent of helping project managers deliver benefits and value to their organizations.

Five Frames

Research studies have uncovered dozens of different ways in which information can be reframed. However, the five frames presented in this paper have one factor in common: they do not rely on additional incentives or powers of reward or coercion and can all be applied to projects without requiring additional funds or authority.

The Loss Aversion Frame

The first frame we consider is that of ‘loss aversion,’ articulated in prospect theory and illustrated by the well-known Asian disease problem (Tversky & Kahneman, 1981). The experiment provided two groups of participants with a scenario and a choice of two options:

Imagine that the United States is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimates of the consequences of the programs are as follows:

Program A, in which 200 people will be saved.

Program B, which has a one-third probability that 600 people will be saved and a two-thirds probability that no people will be saved.

Which program would you favor?

The group given the positively framed options voted overwhelmingly (72%) in favor of the risk-averse option (A). The second group received the same cover story, but their choices were framed in negative terms:

Program A, in which 400 people will die.

Program B, which has a one-third probability that nobody will die and a two-thirds probability that 600 people will die.

In this scenario, 78% of participants favored the risk-seeking option. The reversal in voting patterns can be attributed to the manner in which the problem was presented (i.e., framed), not to the actual outcomes, which were identical in both cases. This finding has been replicated in numerous similar studies since then, supporting the original finding of Tversky and Kahneman (1981, p. 453) that ‘choices involving gains are often risk averse and choices involving losses are often risk taking.’
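
To see why the two presentations describe identical outcomes, a quick expected-value check on the figures in the scenario may help; this is a simple worked illustration, not part of the original study write-up:

```latex
% Expected outcomes under each frame, using the figures from the scenario above
\begin{align*}
\text{Gain frame: } & E[\text{lives saved} \mid A] = 200,
  \qquad E[\text{lives saved} \mid B] = \tfrac{1}{3}(600) + \tfrac{2}{3}(0) = 200 \\
\text{Loss frame: } & E[\text{deaths} \mid A] = 400,
  \qquad\; E[\text{deaths} \mid B] = \tfrac{2}{3}(600) + \tfrac{1}{3}(0) = 400
\end{align*}
```

In expectation the two programs are equivalent under either description; only the certainty of the outcome and the wording differ.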

Levin, Schneider, and Gaeth (1998) suggest that the framing effect should be distinguished into three separate types:

  • Risky choice framing, such as the Asian disease situation outlined above, in which the risk attitude changes based on the perceived outcomes
  • Attribute framing, in which the characteristics of people or products are evaluated differently, depending on whether they are described in positive or negative terms (e.g., a beef burger comprising 80% beef rather than 20% fat), and
  • Goal framing, in which the differences between messages stress the positive consequences of performing an act versus the negative consequences of not performing an act (e.g., the positive outcomes of engaging in breast self-examination rather than the negative outcomes of not engaging in this act).

A literature review by Maule and Villejoubert (2007) indicates that additional factors come into play, such as the recipient's levels of cognition and involvement, the strength of existing attitudes, and the recipient's receptivity to the proposed frame. In other words, the message needs to be consistent with that individual's knowledge, experience, and intentions. Druckman (2001) illustrated this in a variation on the Asian disease problem, in which the generic programs were replaced by options attributed to either the Republican or Democratic party. In this scenario, the framing effect was superseded by party loyalty. We will return to this principle in the final section of the paper.

The ‘Door in the Face’ Frame

The next frame is derived from the well-known negotiation strategy of making an extreme opening offer to the other party and then conceding to a lower position (which was the intended outcome all along). Cialdini et al. (1975) call this the ‘door in the face’ (DITF) strategy and illustrate it with a study of compliance behaviors. The researchers approached three random groups of people with a request:

  1. Group 1: volunteer to counsel juvenile delinquents for two hours a week for two years (a large request). If they refused, they were then asked to chaperone juvenile delinquents on a one-day trip to the zoo (a small request).
  2. Group 2: chaperone juvenile delinquents on a one-day trip to the zoo (the small request).
  3. Group 3: had the large request described to them, but were asked only to perform the small request.

The responses were surprising (actual figures will be presented during the guided design session), leading researchers to conclude that compliance increases after an initial rejection, because when the rejected party moderates its demands and asks for something less extreme it is seen as a concession — and concessions need to be reciprocated.

The implications are obvious: project managers often censor themselves because they believe extreme requests will be rejected. This is not necessarily true, although there are risks associated with making outrageous demands — your request may be considered ignorant, crazy, or plain offensive. One way to mitigate the possibility of offending the other party when making extreme opening demands is to ask for something that, although it may be rejected by the other party, can still be justified on the part of the requestor (Malhotra, 2010).

The Status Quo Bias Frame

The third frame under consideration is that of status quo bias, the tendency people have to stick with the default, fearing the risk of change more than the risk of failing to change. Johnson and Goldstein (2003) illustrated this with a study of organ donation practices in several European countries. In countries like Denmark, the Netherlands, the United Kingdom, and Germany, drivers' license applications require people to explicitly consent to donating their organs. The opt-in rate ranged from 4% to 27.5%; however, in countries where consent is presumed (e.g., Austria, Belgium, France, and Portugal), the opt-out rates were 2% or less, with only Sweden bucking the trend at 15%.

Various factors contribute to this tendency, including people not bothering to read what they are signing. But there is also a widespread belief that the default represents the current norm, and that it must do so for a reason. This principle provides a perfect segue into the fourth frame.

The Social Proof Frame

The principle of social proof, one of the six universal laws of influence identified by Cialdini (1984), suggests that when there is uncertainty or ambiguity regarding the appropriate course of action, people look to the behavior of similar others for guidance. The implication is that if we can identify influential leaders who already espouse similar positions to what we are trying to promote (or who are prepared to do so), we can enlist their support in selling our message.

This is the cornerstone underpinning the use of celebrities in marketing. But Cialdini (2004) reports on an experiment in which altering only three words in an infomercial completely reframed people's perceptions of how other viewers might be reacting to what they were watching. Instead of the traditional ‘Operators are waiting, please call now’ message that flashed at the bottom of the screen, one company used the phrase, ‘If operators are busy, please call again.’ The number of calls sky-rocketed!

Although both statements seem to convey identical information, they actually portray very different viewer behaviors. The traditional message suggests few if any people are calling and operators are simply waiting for the phone to ring. But the second message implies phones are ringing off the hook and viewers may have to call multiple times — because so many other people are already buying this product.

The Reference Point Frame

Another powerful cognitive bias is the reference point frame, in which our perception of value is anchored by the first piece of information we obtain about a subject. All future discussion is then built around that first reference point.

Anchoring in decision making occurs when we rely too heavily on a specific piece of information (often the first exposure we have to a subject) and allow it to govern our thought process. Once the anchor is set, there is a bias toward adjusting or interpreting other information to reflect the ‘anchored’ information.

Value is a relative concept: we judge the worth of an item or idea against salient reference points. How people value their own interests, and even their time, is subject to psychological influence. The opening of the Cross City Tunnel in Sydney, Australia, in 2005 illustrated this clearly; many motorists preferred spending an extra 30 minutes queuing on the free surface roads rather than paying the AU$3.56 toll to use the new tunnel bypass.

It may seem irrational unless we recognize that people do not objectively evaluate the cost of an item or an issue; rather, they evaluate costs in comparison with salient reference points. Car salespeople are notorious for leveraging this insight when they pitch add-ons during the sale of a car. A buyer who has already agreed to pay US$30,000 for the vehicle will readily agree to pay an additional US$200 to US$500 for floor mats or scratch-proofing; however, few buyers would purchase these add-ons a week after having bought the car.

Aggregating Losses and Disaggregating Gains

This section of the guided design session will conclude with an illustration adapted from Thaler (1985), which emphasizes the point that people prefer to make gains piecemeal but to incur losses all at once. In their seminal work on prospect theory, Kahneman and Tversky (1979) explained that such preferences are a result of the way in which we evaluate the prospects of winning (i.e., gaining) or losing, relative to salient reference points (e.g., the status quo).

They argue that the value function for gains is concave and the value function for losses is convex, so that sensitivity diminishes in both directions: additional gains are not as pleasurable as the initial gain, and additional losses are not as painful as the initial loss. As a result, we like to have many small wins rather than one big win, but we prefer one lump-sum loss to multiple small losses.
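
The effect can be made concrete with the stylized value function commonly associated with prospect theory. The functional form and the parameter values below are illustrative assumptions drawn from the broader literature, not figures presented in this paper:

```latex
% Stylized prospect-theory value function (illustrative form and parameters)
\[
  v(x) =
  \begin{cases}
    x^{\alpha}, & x \ge 0 \quad \text{(gains: concave, so sensitivity diminishes)}\\[2pt]
    -\lambda(-x)^{\alpha}, & x < 0 \quad \text{(losses: convex and steeper, since } \lambda > 1)
  \end{cases}
\]
% With the often-quoted illustrative values alpha = 0.88 and lambda = 2.25:
%   Gains:  v(50) + v(50) ~  62.5  >  v(100) ~  57.6
%           -> two separate wins of 50 feel better than a single win of 100.
%   Losses: v(-50) + v(-50) ~ -140.7  <  v(-100) ~ -129.6
%           -> one combined loss of 100 hurts less than two separate losses of 50.
```

Hence the practical advice: bundle the bad news, but deliver the good news in installments.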

This has significant implications for project managers when interacting with stakeholders, and the guided design session will identify a number of examples.

Defending Against Frames

The frames outlined in the previous section are all grounded in scientific research as well as in practice. As noted at the outset of the paper, these principles can be used to either influence or manipulate the perceptions and decisions of project stakeholders, and we are all susceptible to being affected by them.

It would be remiss not to briefly address the other side of the equation, namely the defenses that can be employed against attempts to use psychological influence in an unethical manner (based on work by Campbell, Whitehead, & Finkelstein, 2009; Malhotra, 2010; Kahneman, 2011).

Defense Strategy 1: Become a Reflective Learner

One of the best ways to defend against influence strategies is to become a reflective learner and to continually challenge our own mental models (Oschadleus, 2011). We need to prepare systematically and comprehensively for interactions with our stakeholders, which entails understanding our desired outcomes and the issues at hand, and thinking through our options and alternatives. If we do this, we are less likely to accept unfavorable outcomes simply because of the way in which they are presented or offered.

Defense Strategy 2: Create a Scoring System

A scoring system entails the formulation of a common metric with which to evaluate all issues being considered. It is the same principle as a weighted scoring system in project selection. For example, all possible outcomes on each issue may be converted into dollar or time values, which can then be objectively assessed against each other. Or, we might start with 100 points and allocate them across the issues in proportion to the relative importance of each issue. This allows us to objectively evaluate the total value of a proposed solution by comparing it with the total value of alternative offers. Inappropriate frames are less likely to unduly persuade people who can objectively evaluate every proposal being made.
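
A minimal sketch of such a scoring system is shown below. The issue names, weights, and ratings are hypothetical illustrations, not values taken from any particular project:

```python
# Minimal sketch of a weighted scoring system for comparing proposals.
# Issue names, weights, and ratings are hypothetical illustrations only.

# Allocate 100 points across the issues in proportion to their importance.
weights = {"schedule": 40, "cost": 35, "scope_flexibility": 25}


def total_value(proposal: dict[str, float], weights: dict[str, float]) -> float:
    """Return the weighted score of a proposal.

    Each issue is rated 0-10 on how well the proposal satisfies it;
    the rating is multiplied by the points allocated to that issue.
    """
    return sum(weights[issue] * proposal.get(issue, 0) for issue in weights)


# Two competing offers, each rated 0-10 per issue.
offer_a = {"schedule": 8, "cost": 5, "scope_flexibility": 6}
offer_b = {"schedule": 6, "cost": 9, "scope_flexibility": 4}

for name, offer in [("Offer A", offer_a), ("Offer B", offer_b)]:
    print(name, total_value(offer, weights))

# Comparing totals on a common metric makes it harder for an attractively
# framed but inferior proposal to win on presentation alone.
```

The point of the exercise is not the arithmetic itself, but that every proposal is forced onto the same yardstick before we react to how it was framed.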

Defense Strategy 3: Explicitly Separate Information from Influence

The human mind unconsciously filters all incoming signals and deletes, distorts, or generalizes information to fit preconceived beliefs (Oschadleus, 2009). Effective leaders understand that everything said by stakeholders is part information and part influence; the task is to explicitly separate the two before reacting or responding. When a seemingly compelling statement is made, we should ask ourselves what new information we have learned and how it relates to our interests and priorities. If we made a decision based on the information we just received, would we be willing and able to defend that decision in front of critical others?

Defense Strategy 4: Rephrase Information

Perhaps one of the most powerful questions we can learn to ask ourselves is simply this: how else could I interpret that information? Rephrase any substantive statements made by others and consider other ways in which the information could be presented. For example, restate a loss-framed statement as a gain-framed statement and see whether our decision would be the same. The key is to identify whether our reaction stems from the merits of the argument or from the way it is presented.

Defense Strategy 5: Appoint a Devil's Advocate

As outlined in Oschadleus (2011), it is important to have a designated devil's advocate whose role is to challenge our decisions and thought processes. Having a designated person act in this role can help us and our stakeholders identify blind spots or tendencies toward groupthink, without casting the critic in the mold of a naysayer.

Defense Strategy 6: Avoid Making Critical Decisions under Time Pressure

Psychological influence tactics are more likely to have an effect when we are asked to make quick decisions. Effective decision makers are prepared to set aside ample time to reflect, discuss, and review before committing to a decision. They are willing to defer important decisions to the next day, and they are comfortable both asking stakeholders for time and offering stakeholders time to think through the issues.

In the zoo-trip study, it is highly likely that many people in the first group would have rejected the request to chaperone juvenile delinquents if they had had the chance to sleep on it. Although that may not have served the purpose of the people seeking chaperones, it would have ensured that those who agreed were fully committed to doing so. In other words, the decision would have been a more sustainable one.

Conclusion

This paper and the accompanying guided design session set out to provide five methods by which we can frame decisions in a manner that shapes stakeholder perceptions in the direction we are seeking to influence them. We considered examples from both scientific research and practical experience, and provided session participants with the opportunity to reflect on these frames and apply them to real-world project decisions.

The paper also proposed six strategies we can employ to ensure that the frames others present to us serve our best interests and the interests of our project and/or organization. Keeping these defense strategies in mind also helps us ensure that the frames we present to others are grounded in ethical behavior.

References

Ariely, D. (2009). The end of rational economics. Harvard Business Review, 87(7/8), 78–84.

Campbell, A., Whitehead, J., & Finkelstein, S. (2009). Why good leaders make bad decisions. Harvard Business Review, 87(2), 60–66.

Campbell, A., & Whitehead, J. (2010). How to test your decision-making instincts. McKinsey Quarterly. Retrieved from https://www.mckinseyquarterly.com/How_to_test_your_decision-making_instincts_2598.

Cialdini, R.B., Vincent, J., Lewis, S., Catalan, J., Wheeler, D., & Darby, B. (1975). Reciprocal concessions procedure for inducing compliance: The door-in-the-face technique. Journal of Personality and Social Psychology, 31, 206–215.

Cialdini, R.B. (1984). Influence: The psychology of persuasion. Melbourne: The Business Library.

Cialdini, R.B. (2003). The power of persuasion: Putting the science of influence to work in fundraising. Stanford Social Innovation Review, 18–27.

Cialdini, R.B. (2004). Everybody's doing it. Negotiation, 7.

Druckman, J.N. (2001). Using credible advice to overcome framing effects. Journal of Law, Economics, & Organization, 17, 62–82.

Entman, R.M. (1993). Framing: Toward clarification of a fractured paradigm. Journal of Communication, 43(4), 51–58.

Finkelstein, S. (2004). Why smart executives fail – and what you can learn from their mistakes. New York: Portfolio.

Gladwell, M. (2006). Blink: The power of thinking without thinking. London: Penguin.

Johnson, E.J., & Goldstein, D. (2003). Do defaults save lives? Science, 302, 1338–1339.

Kahneman, D., & Tversky, A. (1979). Prospect theory: An analysis of decision under risk. Econometrica, 47, 263–292.

Kahneman, D., & Klein, G. (2009). Conditions for intuitive expertise: A failure to disagree. American Psychologist, 64(6), 515–526.

Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux.

Klein, G. (1998). Sources of power: How people make decisions. Cambridge, MA: MIT Press.

Levin, I.P., Schneider, S.L., & Gaeth, G.J. (1998). All frames are not created equal: A typology and critical analysis of framing effects. Organizational Behavior and Human Decision Processes, 76, 149–188.

Malhotra, D. (2010). Strategies of influence. Harvard Business School Module Note 5-910-039.

Maule, J., & Villejoubert, G. (2007). What lies beneath: Reframing framing effects. Thinking & Reasoning, 13(1), 25–44.

Meyerowitz, B. E., & Chaiken, S. (1987). The effect of message framing on breast self-examination: Attitudes, intentions, and behavior. Journal of Personality and Social Psychology, 52, 500–510.

Oschadleus, H.J. (2007). Increasing Influence Equity™ in Project Management. PMI Global Congress 2007, Hong Kong, China.

Oschadleus, H.J. (2009). Conversational Leadership: A Communication Tool to Lead and Influence Organisations. PMI Global Congress APAC 2009, Kuala Lumpur, Malaysia.

Oschadleus, H.J. (2011). “I Think… I Thought… What Was I Thinking” – Assumptions that Drive Decision-Making. PMI Global Congress 2011, Dallas, TX.

Samuelson, W.F., & Zeckhauser, R. (1988). Status quo bias in decision making. Journal of Risk and Uncertainty, 1, 7–59.

Thaler, R.H. (1985). Mental accounting and consumer choice. Marketing Science, 4, 199–214.

Thaler, R.H., & Shefrin, H. (1981). An economic theory of self-control. Journal of Political Economy, 89, 392–406.

Tversky, A., & Kahneman, D. (1981). The framing of decisions and the psychology of choice. Science, 211, 453–458.

This material has been reproduced with the permission of the copyright owner. Unauthorized reproduction of this material is strictly prohibited. For permission to reproduce this material, please contact PMI or any listed author.

© 2012, Jürgen Oschadleus
Originally published as a part of the 2012 PMI Global Congress Proceedings – Vancouver, BC, Canada
