McKinsey Quarterly on Behavioral Strategy, Spring 2010

On the cover: Seeing through biases in strategic decisions (artwork by Paul Wearing)

30 The case for behavioral strategy
Dan Lovallo and Olivier Sibony

46 How we do it: Three executives reflect on strategic decision making
WPP’s Sir Martin Sorrell, Kleiner Perkins’ Randy Komisar, Xerox’s Anne Mulcahy

58 When can you trust your gut?
A conversation between Daniel Kahneman and Gary Klein

Also in this package:
44 A language to discuss biases
68 Taking the bias out of meetings

The case for behavioral strategy
Dan Lovallo and Olivier Sibony

Left unchecked, subconscious biases will undermine strategic decision making.

Here’s how to counter them and improve corporate performance.

Once heretical, behavioral economics is now mainstream. Money managers employ its insights about the limits of rationality in understanding investor behavior and exploiting stock-pricing anomalies. Policy makers use behavioral principles to boost participation in retirement-savings plans. Marketers now understand why some promotions entice consumers and others don’t. Yet very few corporate strategists making important decisions consciously take into account the cognitive biases—systematic tendencies to deviate from rational calculations—revealed by behavioral economics.

It’s easy to see why: unlike in fields such as finance and marketing, where executives can use psychology to make the most of the biases residing in others, in strategic decision making leaders need to recognize their own biases.

Dan Lovallo is a professor at the University of Sydney, a senior research fellow at the Institute for Business Innovation at the University of California, Berkeley, and an adviser to McKinsey; Olivier Sibony is a director in McKinsey’s Brussels office.


So despite growing awareness of behavioral economics and numerous efforts by management writers, including ourselves, to make the case for its application, most executives have a justifiably difficult time knowing how to harness its power.1 This is not to say that executives think their strategic decisions are perfect. In a recent McKinsey Quarterly survey of 2,207 executives, only 28 percent said that the quality of strategic decisions in their companies was generally good, 60 percent thought that bad decisions were about as frequent as good ones, and the remaining 12 percent thought good decisions were altogether infrequent.2 Our candid conversations with senior executives behind closed doors reveal a similar unease with the quality of decision making and confirm the significant body of research indicating that cognitive biases affect the most important strategic decisions made by the smartest managers in the best companies. Mergers routinely fail to deliver the expected synergies.3 Strategic plans often ignore competitive responses.4 And large investment projects are over budget and over time—over and over again.5

In this article, we share the results of new research quantifying the financial benefits of processes that “debias” strategic decisions. The size of this prize makes a strong case for practicing behavioral strategy—a style of strategic decision making that incorporates the lessons of psychology. It starts with the recognition that even if we try, like Baron Münchhausen, to escape the swamp of biases by pulling ourselves up by our own hair, we are unlikely to succeed. Instead, we need new norms for activities such as managing meetings (for more on running unbiased meetings, see pp. 68–69), gathering data, discussing analogies, and stimulating debate that together can diminish the impact of cognitive biases on critical decisions. To support those new norms, we also need a simple language for recognizing and discussing biases, one that is grounded in the reality of corporate life, as opposed to the sometimes-arcane language of academia. All this represents a significant commitment and, in some organizations, a profound cultural change.

1 See Charles Roxburgh, “Hidden flaws in strategy,” mckinseyquarterly.com, May 2003; and Dan P. Lovallo and Olivier Sibony, “Distortions and deceptions in strategic decisions,” mckinseyquarterly.com, February 2006.
2 See “Flaws in strategic decision making: McKinsey Global Survey Results,” mckinseyquarterly.com, January 2009.
3 See Dan Lovallo, Patrick Viguerie, Robert Uhlaner, and John Horn, “Deals without delusions,” Harvard Business Review, December 2007, Volume 85, Number 12, pp. 92–99.
4 See John T. Horn, Dan P. Lovallo, and S. Patrick Viguerie, “Beating the odds in market entry,” mckinseyquarterly.com, November 2005.
5 See Bent Flyvbjerg, Dan Lovallo, and Massimo Garbuio, “Delusion and deception in large infrastructure projects,” California Management Review, 2009, Volume 52, Number 1, pp. 170–93.

Exhibit 1: About the research
The research analyzed a variety of decisions in areas that included investments in new products, M&A, and capital expenditures.
What we did:
1,048: number of decisions analyzed
6%: share of decisions related to M&A, organizational change, or expansion into new geographies, products, and services
51%: proportion of decisions that could be attributed to a single, specific business function (sales, R&D, marketing, manufacturing, or supply chain/distribution)

The value of good decision processes

Think of a large business decision your company made recently: a major acquisition, a large capital expenditure, a key technological choice, or a new-product launch. Three things went into it.

The decision almost certainly involved some fact gathering and analysis. It relied on the insights and judgment of a number of executives (a number sometimes as small as one). And it was reached after a process—sometimes very formal, sometimes completely informal—turned the data and judgment into a decision. Our research indicates that, contrary to what one might assume, good analysis in the hands of managers who have good judgment won’t naturally yield good decisions. The third ingredient—the process—is also crucial.

We discovered this by asking managers to report on both the nature of an important decision and the process through which it was reached. In all, we studied 1,048 major decisions made over the past five years, including investments in new products, M&A decisions, and large capital expenditures.

Exhibit 2: Process carries weight
Process has a greater effect on decision making than analysis or industry variables.
Share of performance explained by given element (based on multivariate regression analysis), %:
- Quality of process to exploit analysis and reach decision (e.g., explicit exploration of major uncertainties, inclusion of perspectives that contradict the senior leader’s point of view, participation in discussion by skill and experience rather than by rank): 53
- Industry/company variables (e.g., number of investment opportunities, capital availability, predictability of consumer tastes, availability of resources to implement decision): 39
- Quantity and detail of analysis performed (e.g., detailed financial modeling, sensitivity analysis, analysis of financial reaction of markets): 8
Difference in ROI between top- and bottom-quartile decision inputs, percentage points:
- Quality of process to exploit analysis and reach decision: 6.9
- Quantity and detail of analysis performed: 5.3
Note: To evaluate decision-making effectiveness, we asked respondents to assess outcomes along four dimensions: revenue, profitability, market share, and productivity.

We asked managers to report on the extent to which they had applied 17 practices in making that decision. Eight of these practices had to do with the quantity and detail of the analysis: did you, for example, build a detailed financial model or run sensitivity analyses? The others described the decision-making process: for instance, did you explicitly explore and discuss major uncertainties or discuss viewpoints that contradicted the senior leader’s? We chose these process characteristics because in academic research and in our experience, they have proved effective at overcoming biases.6

After controlling for factors like industry, geography, and company size, we used regression analysis to calculate how much of the variance in decision outcomes7 was explained by the quality of the process and how much by the quantity and detail of the analysis. The answer: process mattered more than analysis—by a factor of six. This finding does not mean that analysis is unimportant, as a closer look at the data reveals: almost no decisions in our sample made through a very strong process were backed by very poor analysis. Why? Because one of the things an unbiased decision-making process will do is ferret out poor analysis. The reverse is not true; superb analysis is useless unless the decision process gives it a fair hearing.

To get a sense of the value at stake, we also assessed the return on investment (ROI) of decisions characterized by a superior process.8 The analysis revealed that raising a company’s game from the bottom to the top quartile on the decision-making process improved its ROI by 6.9 percentage points. The ROI advantage for top-quartile versus bottom-quartile analytics was 5.3 percentage points, further underscoring the tight relationship between process and analysis. Good process, in short, isn’t just good hygiene; it’s good business.

6 Research like this is challenging because of what International Institute for Management Development (IMD) professor Phil Rosenzweig calls the “halo effect”: the tendency of people to believe that when their companies are successful or a decision turns out well, their actions were important contributors (see Phil Rosenzweig, “The halo effect, and other managerial delusions,” mckinseyquarterly.com, February 2007). We sought to mitigate the halo effect by asking respondents to focus on a typical decision process in their companies and to list several decisions before landing on one for detailed questioning. Next, we asked analytical and process questions about the specific decision for the bulk of the survey. Finally, at the very end of it, we asked about performance metrics.
7 We asked respondents to assess outcomes along four dimensions: revenue, profitability, market share, and productivity.
8 This analysis covers the subset of 673 (out of all 1,048) decisions for which ROI data were available.
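For readers who want to see the mechanics of the variance decomposition described above, the following is a minimal, illustrative sketch in Python. The data, variable names, and the incremental-R² attribution heuristic are assumptions introduced purely for illustration; this is not the survey data or the exact model behind the figures in Exhibit 2.

```python
# Illustrative sketch only: hypothetical data, not the study's actual model or dataset.
# Decomposes explained variance in a decision-outcome score into shares attributable
# to process quality, analysis quality, and industry/company controls.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1048  # same order of magnitude as the decisions studied

# Hypothetical standardized predictors: process score, analysis score, controls
process = rng.normal(size=n)
analysis = 0.4 * process + rng.normal(size=n)          # process and analysis correlate
controls = rng.normal(size=(n, 3))                     # e.g., industry, geography, size
outcome = (0.6 * process + 0.2 * analysis
           + controls @ np.array([0.3, 0.2, 0.1]) + rng.normal(size=n))

def r_squared(blocks):
    """R^2 of an OLS fit of the outcome on the given predictor blocks."""
    X = sm.add_constant(np.column_stack(blocks))
    return sm.OLS(outcome, X).fit().rsquared

full = r_squared([process, analysis, controls])
# Incremental R^2: explained variance lost when a block is dropped from the full model
incremental = {
    "process": full - r_squared([analysis, controls]),
    "analysis": full - r_squared([process, controls]),
    "industry/company": full - r_squared([process, analysis]),
}
total = sum(incremental.values())
for name, value in incremental.items():
    print(f"{name:17s} share of explained variance: {100 * value / total:.0f}%")
```

In this sketch, the normalized incremental R² values stand in for the “share of performance explained by given element” reported in Exhibit 2; other attribution methods would give somewhat different splits.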

The building blocks of behavioral strategy

Any seasoned executive will of course recognize some biases and take them into account. That is what we do when we apply a discount factor to a plan from a direct report (correcting for that person’s overoptimism). That is also what we do when we fear that one person’s recommendation may be colored by self-interest and ask a neutral third party for an independent opinion. However, academic research and empirical observation suggest that these corrections are too inexact and limited to be helpful.

The prevalence of biases in corporate decisions is partly a function of habit, training, executive selection, and corporate culture. But most fundamentally, biases are pervasive because they are a product of human nature—hard-wired and highly resistant to feedback, however brutal. For example, drivers laid up in hospitals for traffic accidents they themselves caused overestimate their driving abilities just as much as the rest of us do.9 Improving strategic decision making therefore requires not only trying to limit our own (and others’) biases but also orchestrating a decision-making process that will confront different biases and limit their impact.

To use a judicial analogy, we cannot trust the judges or the jurors to be infallible; they are, after all, human. But as citizens, we can expect verdicts to be rendered by juries and trials to follow the rules of due process. It is through teamwork, and the process that organizes it, that we seek a high-quality outcome. Building such a process for strategic decision making requires an understanding of the biases the process needs to address. In the discussion that follows, we focus on the subset of biases we have found to be most relevant for executives and classify those biases into five simple, business-oriented groupings (for more on these groupings, see pp. 44–45).

A familiarity with this classification is useful in itself because, as the psychologist and Nobel laureate in economics Daniel Kahneman has pointed out, the odds of defeating biases in a group setting rise when discussion of them is widespread. But familiarity alone isn’t enough to ensure unbiased decision making, so as we discuss each family of bias, we also provide some general principles and specific examples of practices that can help counteract it.

Counter pattern-recognition biases by changing the angle of vision

The ability to identify patterns helps set humans apart but also carries with it a risk of misinterpreting conceptual relationships. Common pattern-recognition biases include saliency biases (which lead us to overweight recent or highly memorable events) and the confirmation bias (the tendency, once a hypothesis has been formed, to ignore evidence that would disprove it). Particularly imperiled are senior executives, whose deep experience boosts the odds that they will rely on analogies, from their own experience, that may turn out to be misleading.10 Whenever analogies, comparisons, or salient examples are used to justify a decision, and whenever convincing champions use their powers of persuasion to tell a compelling story, pattern-recognition biases may be at work.

9 Caroline E. Preston and Stanley Harris, “Psychology of drivers in traffic accidents,” Journal of Applied Psychology, 1965, Volume 49, Number 4, pp. 284–88.

Pattern recognition is second nature to all of us—and often quite valuable—so fighting biases associated with it is challenging. The best we can do is to change the angle of vision by encouraging participants to see facts in a different light and to test alternative hypotheses to explain those facts. This practice starts with things as simple as field and customer visits. It continues with meeting-management techniques such as reframing or role reversal, which encourage participants to formulate alternative explanations for the evidence with which they are presented. It can also leverage tools, such as competitive war games, that promote out-of-the-box thinking. Sometimes, simply coaxing managers to articulate the experiences influencing them is valuable.

According to Kleiner Perkins partner Randy Komisar, for example, a contentious discussion over manufacturing strategy at the start-up WebTV11 suddenly became much more manageable once it was clear that the preferences of executives about which strategy to pursue stemmed from their previous career experience. When that realization came, he told us, there was immediately a “sense of exhaling in the room.” Managers with software experience were frightened about building hardware; managers with hardware experience were afraid of ceding control to contract manufacturers. Getting these experiences into the open helped WebTV’s management team become aware of the pattern recognition they triggered and see more clearly the pros and cons of both options. Ultimately, WebTV’s executives decided both to outsource hardware production to large electronics makers and, heeding the worries of executives with hardware experience, to establish a manufacturing line in Mexico as a backup, in case the contractors did not deliver in time for the Christmas season. That in fact happened, and the backup plan, which would not have existed without a decision process that changed the angle of vision, “saved the company.”

Another useful means of changing the angle of vision is to make it wider by creating a reasonably large—in our experience at least six—set of similar endeavors for comparative analysis. For example, in an effort to improve US military effectiveness in Iraq in 2004, Colonel Kalev Sepp—by himself, in 36 hours—developed a reference class of 53 similar counterinsurgency conflicts, complete with strategies and outcomes. This effort informed subsequent policy changes.12

10 For more on misleading experiences, see Sydney Finkelstein, Jo Whitehead, and Andrew Campbell, Think Again: Why Good Leaders Make Bad Decisions and How to Keep It from Happening to You, Boston: Harvard Business Press, 2008.
11 WebTV is now MSN TV.

Counter action-oriented biases by recognizing uncertainty

Most executives rightly feel a need to take action. However, the actions we take are often prompted by excessive optimism about the future and especially about our own ability to influence it.

Ask yourself how many plans you have reviewed that turned out to be based on overly optimistic forecasts of market potential or underestimated competitive responses. When you or your people feel—especially under pressure—an urge to take action and an attractive plan presents itself, chances are good that some elements of overconfidence have tainted it. To make matters worse, the culture of many organizations suppresses uncertainty and rewards behavior that ignores it. For instance, in most organizations, an executive who projects great confidence in a plan is more likely to get it approved than one who lays out all the risks and uncertainties surrounding it.

Seldom do we see confidence as a warning sign—a hint that overconfidence, overoptimism, and other action-oriented biases may be at work. Superior decision-making processes counteract action-oriented biases by promoting the recognition of uncertainty. For example, it often helps to make a clear and explicit distinction between decision meetings, where leaders should embrace uncertainty while encouraging dissent, and implementation meetings, where it’s time for executives to move forward together.

12 Thomas E. Ricks, Fiasco: The American Military Adventure in Iraq, New York: Penguin Press, 2006, pp. 393–94.

Also valuable are tools—such as scenario planning, decision trees, and the “premortem” championed by research psychologist Gary Klein (for more on the premortem, see p. 64)—that force consideration of many potential outcomes. And at the time of a major decision, it’s critical to discuss which metrics need to be monitored to highlight necessary course corrections quickly.

Counter stability biases by shaking things up

In contrast to action biases, stability biases make us less prone to depart from the status quo than we should be. This category includes anchoring—the powerful impact an initial idea or number has on the subsequent strategic conversation. (For instance, last year’s numbers are an implicit but extremely powerful anchor in any budget review.) Stability biases also include loss aversion—the well-documented tendency to feel losses more acutely than equivalent gains—and the sunk-cost fallacy, which can lead companies to hold on to businesses they should divest.13

One way of diagnosing your company’s susceptibility to stability biases is to compare decisions over time. For example, try mapping the percentage of total new investment each division of the company receives year after year. If that percentage is stable but the divisions’ growth opportunities are not, this finding is cause for concern—and quite a common one.
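As a concrete illustration of this diagnostic, the short Python sketch below maps each division’s share of total new investment year after year and measures how strongly this year’s share tracks last year’s. The division names and capital-expenditure figures are invented for illustration; only the method of comparison is the point.

```python
# Illustrative sketch only: hypothetical capital-allocation data, not actual company figures.
# Computes each division's share of total new investment by year and the year-over-year
# correlation of those shares (a simple inertia diagnostic).
import numpy as np
import pandas as pd

# Hypothetical capex by division (rows) and year (columns), in $ millions
capex = pd.DataFrame(
    {2006: [120, 80, 60, 40], 2007: [125, 78, 62, 41], 2008: [130, 82, 60, 43]},
    index=["Division A", "Division B", "Division C", "Division D"],
)

shares = capex / capex.sum()  # each division's share of the yearly investment total
print(shares.round(3))

# Correlation between current-year share and prior-year share, pooled across divisions and years
current = shares.iloc[:, 1:].to_numpy().ravel()
previous = shares.iloc[:, :-1].to_numpy().ravel()
print(f"Year-over-year correlation of budget shares: {np.corrcoef(previous, current)[0, 1]:.2f}")
```

A correlation close to 1.0, as in this deliberately stable synthetic example, is the kind of pattern that should prompt questions about whether allocations track opportunities or simply track last year.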

Our research indicates, for example, that in multibusiness corporations over a 15-year time horizon, there is a near-perfect correlation between a business unit’s current share of the capital expenditure budget and its budget share in the previous year. A similar inertia often bedevils advertising budgets and R&D project pipelines. One way to help managers shake things up is to establish stretch targets that are impossible to achieve through “business as usual.” Zero-based (or clean-sheet) budgeting sounds promising, but in our experience companies use this approach only when they are in dire straits. An alternative is to start by reducing each reporting unit’s budget by a fixed percentage (for instance, 10 percent). The resulting tough choices facilitate the redeployment of resources to more valuable opportunities.

Finally, challenging budget allocations at a more granular level can help companies reprioritize their investments.14

13 See John T. Horn, Dan P. Lovallo, and S. Patrick Viguerie, “Learning to let go: Making better exit decisions,” mckinseyquarterly.com, May 2006.
14 For more on reviewing the growth opportunities available across different micromarkets ranging in size from $50 million to $200 million, rather than across business units as a whole, see Mehrdad Baghai, Sven Smit, and Patrick Viguerie, “Is your growth strategy flying blind?” Harvard Business Review, May 2009, Volume 87, Number 5, pp. 86–96.

Counter interest biases by making them explicit

Misaligned incentives are a major source of bias. “Silo thinking,” in which organizational units defend their own interests, is its most easily detectable manifestation. Furthermore, senior executives sometimes honestly view the goals of a company differently because of their different roles or functional expertise. Heated discussions in which participants seem to see issues from completely different perspectives often reflect the presence of different (and generally unspoken) interest biases. The truth is that adopting a sufficiently broad (and realistic) definition of “interests,” including reputation, career options, and individual preferences, leads to the inescapable conclusion that there will always be conflicts between one manager and another and between individual managers and the company as a whole.

Strong decision-making processes explicitly account for diverging interests. For example, if, before the time of a decision, strategists formulate precisely the criteria that will and won’t be used to evaluate it, they make it more difficult for individual managers to change the terms of the debate to make their preferred actions seem more attractive. Similarly, populating meetings or teams with participants whose interests clash can reduce the likelihood that one set of interests will undermine thoughtful decision making.

Counter social biases by depersonalizing debate

Social biases are sometimes interpreted as corporate politics but in fact are deep-rooted human tendencies.

Even when nothing is at stake, we tend to conform to the dominant views of the group we belong to (and of its leader).15 Many organizations compound these tendencies because of both strong corporate cultures and incentives to conform. An absence of dissent is a strong warning sign. Social biases also are likely to prevail in discussions where everyone in the room knows the views of the ultimate decision maker (and assumes that the leader is unlikely to change her mind). Countless techniques exist to stimulate debate among executive teams, and many are simple to learn and practice. (For more on promoting debate, see suggestions from Kleiner Perkins’ Randy Komisar on pp. 50–51, as well as from Xerox’s Anne Mulcahy on pp. 55–56.) But tools per se won’t create debate: that is a matter of behavior. Genuine debate requires diversity in the backgrounds and personalities of the decision makers, a climate of trust, and a culture in which discussions are depersonalized.

15 The Asch conformity experiments, conducted during the 1950s, are a classic example of this dynamic. In the experiments, individuals gave clearly incorrect answers to simple questions after confederates of the experimenter gave the same incorrect answers aloud. See Solomon E. Asch, “Opinions and social pressure,” Scientific American, 1955, Volume 193, Number 5, pp. 31–35.

Most crucially, debate calls for senior leaders who genuinely believe in the collective intelligence of a high-caliber management team. Such executives see themselves serving not only as the ultimate decision makers but also as the orchestrators of disciplined decision processes. They shape management teams with the humility to encourage dissent and the self-confidence and mutual trust to practice vigorous debate without damaging personal relationships.

We do not suggest that CEOs should become humble listeners who rely solely on the consensus of their teams—that would substitute one simplistic stereotype for another. But we do believe that behavioral strategy will founder without their leadership and role modeling.

Four steps to adopting behavioral strategy

Our readers will probably recognize some of these ideas and tools as techniques they have used in the past. But techniques by themselves will not improve the quality of decisions. Nothing is easier, after all, than orchestrating a perfunctory debate to justify a decision already made (or thought to be made) by the CEO. Leaders who want to shape the decision-making style of their companies must commit themselves to a new path.

1. Decide which decisions warrant the effort

Some executives fear that applying the principles we describe here could be divisive, counterproductive, or simply too time consuming (for more on the dangers of decision paralysis, see the commentary by WPP’s Sir Martin Sorrell on p. 47). We share this concern and do not suggest applying these principles to all decisions. Here again, the judicial analogy is instructive. Just as higher standards of process apply in a capital case than in a proceeding before a small-claims court, companies can and should pay special attention to two types of decisions.

The first set consists of rare, one-of-a-kind strategic decisions. Major mergers and acquisitions, “bet the company” investments, and crucial technological choices fall in this category.

In most companies, these decisions are made by a small subgroup of the executive team, using an ad hoc, informal, and often iterative process. The second set includes repetitive but high-stakes decisions that shape a company’s strategy over time. In most companies, there are generally no more than one or two such crucial processes, such as R&D allocations in a pharmaceutical company, investment decisions in a private-equity firm, or capital expenditure decisions in a utility. Formal processes—often affected by biases—are typically in place to make these decisions.

2. Identify the biases most likely to affect critical decisions

Open discussion of the biases that may be undermining decision making is invaluable.

It can be stimulated both by conducting postmortems of past decisions and by observing current decision processes. Are we at risk, in this meeting, of being too action oriented? Do I see someone who thinks he recognizes a pattern but whose choice of analogies seems misleading to me? Are we seeing biases combine to create dysfunctional patterns that, when repeated in an organization, can become cultural traits? For example, is the combination of social and status quo biases creating a culture of consensus-based inertia? This discussion will help surface the biases to which the decision process under review is particularly prone.

3. Select practices and tools to counter the most relevant biases

Companies should select mechanisms that are appropriate to the type of decision at hand, to their culture, and to the decision-making styles of their leaders. For instance, one company we know counters social biases by organizing, as part of its annual planning cycle, a systematic challenge by outsiders to its business units’ plans. Another fights pattern-recognition biases by asking managers who present a recommendation to share the raw data supporting it, so other executives in this analytically minded company can try to discern alternative patterns. If, as you read these lines, you have already thought of three reasons these techniques won’t work in your own company’s culture, you are probably right. The question is which ones will.

Adopting behavioral strategy means not only embracing the broad principles set forth above but also selecting and tailoring specific debiasing practices to turn the principles into action.

4. Embed practices in formal processes

By embedding these practices in formal corporate operating procedures (such as capital-investment approval processes or R&D reviews), executives can ensure that such techniques are used with some regularity and not just when the ultimate decision maker feels unusually uncertain about which call to make. One reason it’s important to embed these practices in recurring procedures is that everything we know about the tendency toward overconfidence suggests that it is unwise to rely on one’s instincts to decide when to rely on one’s instincts!

Another is that good decision making requires practice as a management team: without regular opportunities, the team will agree in principle on the techniques it should use but lack the experience (and the mutual trust) to use them effectively. The behavioral-strategy journey requires effort and the commitment of senior leadership, but the payoff—better decisions, not to mention more engaged managers—makes it one of the most valuable strategic investments organizations can make.

Copyright © 2010 McKinsey & Company. All rights reserved. We welcome your comments on this article. Please send them to quarterly_comments@mckinsey.com.

A language to discuss biases

Psychologists and behavioral economists have identified dozens of cognitive biases. The typology we present here is not meant to be exhaustive but rather to focus on those biases that occur most frequently and that have the largest impact on business decisions. As these groupings make clear, one of the insidious things about cognitive biases is their close relationship with the rules of thumb and mind-sets that often serve managers well. For example, many a seasoned executive rightly prides herself on pattern-recognition skills cultivated over the years. Similarly, seeking consensus when making a decision is often not a failing but a condition of success.

And valuing stability rather than “rocking the boat” or “fixing what ain’t broke” is a sound management precept. This bias typology was prepared by Dan Lovallo and Olivier Sibony.

Action-oriented biases drive us to take action less thoughtfully than we should.
Excessive optimism. The tendency for people to be overoptimistic about the outcome of planned actions, to overestimate the likelihood of positive events, and to underestimate the likelihood of negative ones.
Overconfidence. Overestimating our skill level relative to others’, leading us to overestimate our ability to affect future outcomes, take credit for past outcomes, and neglect the role of chance.
Competitor neglect. The tendency to plan without factoring in competitive responses, as if one is playing tennis against a wall, not a live opponent.

Interest biases arise in the presence of conflicting incentives, including nonmonetary and even purely emotional ones.
Misaligned individual incentives. Incentives for individuals in organizations to adopt views or to seek outcomes favorable to their unit or themselves, at the expense of the overall interest of the company. These self-serving views are often held genuinely, not cynically.
Inappropriate attachments. Emotional attachment of individuals to people or elements of the business (such as legacy products or brands), creating a misalignment of interests.1
Misaligned perception of corporate goals. Disagreements (often unspoken) about the hierarchy or relative weight of objectives pursued by the organization and about the tradeoffs between them.

Pattern-recognition biases lead us to recognize patterns even where there are none.
Confirmation bias. The overweighting of evidence consistent with a favored belief, underweighting of evidence against a favored belief, or failure to search impartially for evidence.
Management by example. Generalizing based on examples that are particularly recent or memorable.
False analogies—especially, misleading experiences. Relying on comparisons with situations that are not directly comparable.
Power of storytelling. The tendency to remember and to believe more easily a set of facts when they are presented as part of a coherent story.
Champion bias. The tendency to evaluate a plan or proposal based on the track record of the person presenting it, more than on the facts supporting it.

Stability biases create a tendency toward inertia in the presence of uncertainty.
Anchoring and insufficient adjustment. Rooting oneself to an initial value, leading to insufficient adjustments of subsequent estimates.
Loss aversion. The tendency to feel losses more acutely than gains of the same amount, making us more risk-averse than a rational calculation would suggest.
Sunk-cost fallacy. Paying attention to historical costs that are not recoverable when considering future courses of action.
Status quo bias. Preference for the status quo in the absence of pressure to change it.

Social biases arise from the preference for harmony over conflict.
Groupthink. Striving for consensus at the cost of a realistic appraisal of alternative courses of action.
Sunflower management. Tendency for groups to align with the views of their leaders, whether expressed or assumed.

1 Sydney Finkelstein, Jo Whitehead, and Andrew Campbell, Think Again: Why Good Leaders Make Bad Decisions and How to Keep It from Happening to You, Boston: Harvard Business Press, 2008.

To listen to the authors narrate a more comprehensive presentation of these biases and the ways they can combine to create dysfunctional patterns in corporate cultures, visit mckinseyquarterly.com.

How we do it: Three executives reflect on strategic decision making
WPP’s Sir Martin Sorrell, Kleiner Perkins’ Randy Komisar, Xerox’s Anne Mulcahy

‘Learn from mistakes and listen to feedback’
Sir Martin Sorrell

Sir Martin Sorrell is chief executive officer of WPP, a leading advertising and marketing-services group. Sir Martin actively supports the advancement of international business schools, advising Harvard, IESE, the London Business School, and Indian School of Business, among others.

The reality is that leaders must, on the spur of the moment, be able to react rapidly and grasp opportunities. Ultimately, therefore, I think that the best process to reduce the risk of bad decisions—whatever series of tests, hurdles, and measuring sticks one applies—should be quick, flexible, and largely informal. It’s important to experiment, to be open to intuition, and to listen to flashes of inspiration. This is not to say the process shouldn’t be rigorous: run the analyses, suck up all the data, and include some formal processes as well. But don’t ask hundreds of people. Carefully sound out the relevant constituencies—clients, suppliers, competitors—and try to find someone you trust who has no agenda about the issue at hand.

There will be mistakes, of course. The truth is we all make mistakes all the time. For instance, I know it’s true that decision makers risk escalating their commitment to losing endeavors that they have an emotional stake in. I know because I’ve been guilty of that myself. However, the only way to avoid making mistakes is to avoid making decisions (or, at least, very few). But then the company would grind to a halt. Instead, learn from mistakes and listen to feedback.

Copyright © 2010 McKinsey & Company. All rights reserved. We welcome your comments on this article. Please send them to quarterly_comments@mckinsey.com.

‘Balance out biases’
Randy Komisar

Before behavioral economics even had a name, it shook up Randy Komisar’s career. He became aware of the then-nascent field while contemplating a graduate degree in economics, losing confidence in the dismal science as a result. Komisar ultimately shifted gears, becoming a lawyer and later pursuing a career in commerce. He cofounded Claris,1 served as CEO for LucasArts Entertainment and Crystal Dynamics, served as “virtual CEO” for a host of companies such as WebTV2 and TiVo and, since 2005, has been a partner at Kleiner Perkins Caufield & Byers, the Silicon Valley venture capital fund. Along the way, he has developed a distinct point of view on how to create executive teams and cultural environments that are conducive to good decision making. In a recent interview with McKinsey’s Olivier Sibony and Allen Webb, Komisar provided practical advice for senior executives hoping to make good decisions in a world where bias is inevitable.

1 Claris is now FileMaker.
2 WebTV is now MSN TV.

Randy Komisar is a partner with Kleiner Perkins Caufield & Byers. He is the author of The Monk and the Riddle and coauthor, with John Mullins, of Getting to Plan B: Breaking Through to a Better Business Model. Randy has been a consulting professor of entrepreneurship at Stanford, where he still lectures.

Harness bias

Rather than trying to tune out bias, my focus is on recognizing, encouraging, and balancing bias within effective decision making. I came to that conclusion as I was starting my career, when I had a chance to work with Bill Campbell, who is well known, particularly in Silicon Valley, as a leader and coach. Bill was the CEO of Intuit (where he’s now chairman), he’s on the Apple board, and he’s a consigliere to Google.3 What I observed back then was that Bill had this amazing ability to bring together a ragtag team of exceptionally talented people. Some had worked for successful companies, some had not. Some had been senior managers.

Some had been individual contributors. Everybody brought to the table biases borne out of their domains and their experiences. Those experience-based biases probably are not that different at the psychological level from the behavioral biases that economists focus on today. Bill was very capable at balancing out the biases around the table and coming up with really effective decisions and, more important, the groundwork for consensus—not necessarily unanimity, but consensus.

3 For more on Bill Campbell, see Lenny Mendonca and Kevin D. Sneader, “Coaching innovation: An interview with Intuit’s Bill Campbell,” mckinseyquarterly.com, February 2007.

I liken it to what I have always understood, true or false, about how President Kennedy ran his cabinet: that he used to assemble the smartest people he could, throw a difficult issue on the table, and watch them debate it. Then at some point he would end the debate, make a decision, and move on. It’s also similar to the judicial process, where advocates come together to present every facet of a case, and a judge makes an informed determination. The advocates’ biases actually work to the benefit of a good decision, rather than being something that needs to be mitigated.

Make a balance sheet

There’s a methodology I’ve used within companies for making big, hard decisions that I introduced into Kleiner Perkins and that we have been using lately to help decide whether or not to invest in new ventures. It starts with assembling a group that is very diverse.

If you look at my partners, you’d see an unruly gang of talented people with very different experiences, very different domain skills, and, consequently, very different opinions. Starting with that, the notion is to put together a simple balance sheet where everybody around the table is asked to list points on both sides: “Tell me what is good about this opportunity; tell me what is bad about it. Do not tell me your judgment yet. I don’t want to know.” They start the process without having to justify and thereby freeze their opinions and instead are allowed to give their best insights and consider the ideas of others. Not surprisingly, smart people will uncover many of the same issues. But they may weigh them differently depending on their biases.

We do not ask for anyone’s bottom line until everybody has spoken and the full balance sheet is revealed. I have noticed my own judgment often changes as I see the balance sheet fill out. I might have said, “We shouldn’t do the deal for the following three reasons.” But after creating a balance sheet, I might well look at it and say, “You know, it’s probably worth doing for these two reasons.” The balance sheet process mitigates a lot of the friction that typically arises when people marshal the facts that support their case while ignoring those that don’t. It also emphasizes to the group that each participant is smart and knowledgeable, that it was a difficult decision, and that there is ample room for the other judgment. By assembling everyone’s insights rather than their conclusions, the discussion can focus on the biases and assumptions that lead to the opinions. An added bonus is that people start to see their own biases. Somebody will stand up and say, “You’re expecting me to say the following three things, and I will. But I’ve also got to tell you about these other four things, which are probably not what you’d expect from me.” Finally, opinion leaders have less sway because they don’t signal their conclusions too early. Although this may sound tedious and slow, we’re able to move quickly.

One reason is that we never try to achieve perfection—meaning 100 percent certainty—around a decision. We just can’t get there in the timeframe necessary. The corollary is that we assume every decision needs to be tested, measured, and refined. If the test results come back positive, we proceed; if they’re negative, we “course correct” quickly.

Create a culture where ‘failure’ is not a wrong answer

The book John Mullins and I recently wrote, Getting to Plan B, presents a way of building a culture of good decision making.4 The very simple premise is that Plan A most often fails, so we need a process by which to methodically test assumptions to get to a better Plan B. The process starts with an acknowledgment that Plan A probably is based upon flawed assumptions, and that certain leap-of-faith questions are fundamental to arriving at a better answer. If we disagree on the decision, it’s very likely that we have different assumptions on those critical questions—and we need to decide which assumptions are stronger, yours or mine. You end up teasing apart these assumptions through analogs: someone will say, “Joe did something like this.” And then someone else will say, “Yes, but Joe’s situation was different for the following reasons. Sally did something like this, and it failed.” In that process, you don’t get points for being right about your assumption, and I don’t lose points for being wrong. We both get points for identifying the assumption, working on it, and agreeing that the facts have come in one way or the other.

What makes this culturally difficult in larger companies is that there is often a sense that Plan A is going to succeed. It’s well analyzed. It’s vetted. It’s crisp. It looks great on an Excel spreadsheet. It becomes the plan of record to which everybody executes. And the execution of that plan does not usually contemplate testing assumptions on an ongoing basis to permit a course correction. So if the plan is wrong, which it most often is, then it is a total failure.

4 Getting to Plan B: Breaking Through to a Better Business Model, Watertown, MA: Harvard Business Press, 2009.

The work has gone on too long. Too much money has been spent. Too many people have invested their time and attention on it. And careers can be hurt in the process. To create the right culture, you have to make very clear that a wrong answer is not “failure” unless it is ignored or uncorrectable. Intuit, for instance, has found that many early-stage R&D projects went on too long. As in most companies, there was a belief that “we just need to put a little more time and money into these things.” Within about 90 days after I had explained the Plan B process to Intuit, they had broken a set of projects into smaller hypotheses, put together a dashboard process for testing assumptions, and were starting to make go-no-go decisions at each step along the way. They reported back that teams were killing their own projects early because they now had metrics to guide them. And most important, they were not being blamed. Intuit’s culture allows for rapid testing and “failure,” and those who prove responsible and accountable in course correcting are rewarded with new projects.

Listen to the little voice

I think comfort with uncertainty and ambiguity is an important trait in a leader. That’s not to say that they’re ambiguous or uncertain or unclear, but they’re not hiding behind some notion of black or white.

When somebody’s shutting down conversations because he is uncomfortable with the points of view in the room or with where the decision may be going, it usually leads to a culture where the best ideas no longer come to the top. Now, there are cultures where that does seem to work, but I think those are exceptions. Steve Jobs seems to be able to run Apple exceedingly well in large part because Steve Jobs is an extraordinary person. But he’s not a guy who tolerates a lot of diversity of opinion. Frankly, few leaders I meet, no matter how important they are in the press or how big their paychecks are, are that comfortable with diversity of opinion.

I love a leader who changes his or her opinion based upon the strength of the arguments around the table. It’s great to see a leader concede that the decision’s a hard one and may have to be retested. It’s great to see a leader who will echo the little voice in the back of the room that has a different point of view—and thereby change the complexion of the discussion. When I went to LucasArts, I can remember sitting down one day with a young woman two levels down in the sales organization. I said, “Do you think we could build our own sales force and distribution here? We’ve been going through distributors for a long time. Our margins are a lot smaller as a result. What do you think?” She shut the door, looked at me, and said, “I know that my boss would disagree with me and I know that my peers in marketing disagree with me, but I think we can do it.” And so we did it, and the company’s gross margin line probably grew fivefold in 12 months—all based upon this one little voice in the back of the room. You’ve got to be able to hear that voice.

Copyright © 2010 McKinsey & Company. All rights reserved. We welcome your comments on this article. Please send them to quarterly_comments@mckinsey.com.

‘Timeliness trumps perfection’
Anne Mulcahy

Anne Mulcahy is chairman and former CEO of Xerox. She is a director on the boards of Catalyst, Citigroup, Johnson & Johnson, and the Washington Post Company, as well as chair of the board of trustees for Save the Children.

When Anne Mulcahy became CEO of Xerox in 2001—as the company teetered on the edge of bankruptcy—she dove in with the confidence and decisiveness that had typified her career to date. But as she began to engineer the company’s dramatic turnaround, something unexpected happened: Mulcahy started hearing rumblings that her leadership style was too decisive. As she recounts, “I got feedback that between my directness and my body language, within three nanoseconds people knew where I stood on everything and lined up to follow, and that if I didn’t work on it, it really would be a problem.” So Mulcahy listened. “I stopped getting on my feet,” she explains, “and I worked hard at not jumping in, at making people express a point of view.” This was the first of many lessons about how to ensure high-quality decision making that Mulcahy would go on to learn during her nine years as CEO. In a recent interview with McKinsey’s Rik Kirkland, she distilled five suggestions for other senior leaders.

Cultivate internal critics

My own management style probably hasn’t changed much in 20 years, but I learned to compensate for this by building a team that could counter some of my own weaknesses. You need internal critics: people who know what impact you’re having and who have the courage to give you that feedback. I learned how to groom those critics early on, and that was really, really useful.

This requires a certain comfort with confrontation, though, so it’s a skill that has to be developed. I started making a point of saying, “All right, John-Noel, what are you thinking? I need to hear.” And this started to demonstrate that even if I did show my colors quickly, they could still take me on and I could still change my mind. The decisions that come out of allowing people to have different views—and treasuring the diversity of those views—are often harder to implement than what comes out of consensus decision making, but they’re also better.

Force tough people choices

If you’re sitting around the table with the wrong group of people, no process is going to drive good decision making.

You need to lead with people decisions first. One of the easiest mistakes you can make is to compromise on people. It’s very easy to close your eyes and say warm bodies are better than no bodies. The way to counter this bias is to introduce a “forced choice” process. What I mean by this is, you need a disciplined process for forcing discussion about a set of candidates and a position. At Xerox, we developed an HR process that required three candidates for every job. We also established a group-assessment process, which helped us avoid what I call lazy people decisions, that is, biases against confrontation that could have marginalized the effectiveness of our team.

You need to look for people who can strike the hard balance between courage and learning—people who have audacity in their convictions and know when to be unyielding but who are also good listeners and capable of adapting. That is the single most important leadership trait, outside of pure competence. However you do it, you need to set a context for choice. Once you’ve done that, you must make sure you understand your own criteria for what first-class talent means, and you need to hold yourself accountable for creating a dialogue about it in a very honest way.

Force tough R&D choices

One of the rules of the road should be never to evaluate R&D programs individually. You should always decide on them within the context of an R&D portfolio. There needs to be an “is this better than that?” conversation—no one should get to personally champion his program in a vacuum. Any single idea can look great in isolation. The portfolio process, like the “forced choice” process for people decisions, is really important because it gives you choices in context. It also takes some of the difficulty of killing individual projects out of the way. And it helps you hold yourself accountable for the full resourcing of the idea. If you decide to invest in a growth opportunity, it’s because you’ve spent a lot of time making sure that it’s resourced properly, that you’ve got the right skill sets to execute it, and that you’re not just saying, “Sure, go off and do it” before you’ve thought through all those considerations.

This process was particularly important for us at Xerox. We kept an investment going for ten years in a technology called Solid Ink, which just came to market this year. We did this by putting a fence around it and a few other strategic priorities that we knew we wanted to protect. Portfolio decision making helped us drive those priorities forward even though most of the people who made the decisions wouldn’t still be in their jobs to see the returns.

Know when to let go

One of the most important types of decision making is deciding what you are not going to do, what you need to eliminate in order to make room for strategic investments. This could mean shutting down a program.

It could mean outsourcing part of the business. These are often the hardest decisions to make, and the ones that don’t get nearly enough focus. Making a decision not to fund a new project is not painful. Making a decision to take out a historical program or investment is. It means taking out people and competencies and expertise. That’s much, much harder. The most difficult decisions are these legacy ones—the historical investments, the things that are just easier to chip away at rather than make a tough decision. This is where we make the most compromises—at the expense of our focus.

A great example from Xerox was that it took too long to move from legacy investments in black-and-white imaging to future strategic investments in color and services. An approach that can help this process involves establishing a decision framework (one akin to a zero-based budgeting philosophy) that says there’s no preconceived commitment to a legacy business. It will get discussed in the context of opportunities for future investment like all the rest. But to make this decision process work, you need to make sure to create a balance between the people who can champion and advocate the future and those who own—and are very invested in—the past.

Strike the right risk balance

Decisiveness is about timeliness, and timeliness trumps perfection. The most damaging decisions are the missed opportunities, the decisions that didn’t get made in time. If you’re tallying the bad decisions you’ve made, you have to include all the decisions you never got to make because you missed the window of opportunity. These days, everyone is risk averse. Unfortunately, people define risk as something you avoid rather than something you take. But taking risks is critical to your decision-making effectiveness and growth, and most companies have taken a large step backward because of the current climate.

I was CEO of Xerox for five years before we really got back into the acquisition market, even though we knew we needed to acquire some things rather than develop them internally. But we got very conservative, very risk averse, and also too data driven. By the time we would reach a decision that some technology was going to be a home run, it had either already been bought or was so expensive we couldn’t afford it. Decisions have shelf lives, so you really need to put tight timeframes on your process. I would so much rather live with the outcome of making a few bad decisions than miss a boatload of good ones. Some of it flies in the face of good process and just requires good gut.

So when trying to take bias out of decision making, you need to be really cautious not to take instinct, courage, and gut out as well.

Copyright © 2010 McKinsey & Company. All rights reserved. We welcome your comments on this article. Please send them to quarterly_comments@mckinsey.com.

When can you trust your gut?

Nobel laureate Daniel Kahneman and psychologist Gary Klein debate the power and perils of intuition for senior executives.

For two scholars representing opposing schools of thought, Daniel Kahneman and Gary Klein find a surprising amount of common ground. Kahneman, a psychologist, won the Nobel Prize in economics in 2002 for prospect theory, which helps explain the sometimes counterintuitive choices people make under uncertainty.

Klein, a senior scientist at MacroCognition, has focused on the power of intuition to support good decision making in high-pressure environments, such as firefighting and intensive-care units. In a September 2009 American Psychologist article titled “Conditions for intuitive expertise: A failure to disagree,” Kahneman and Klein debated the circumstances in which intuition would yield good decision making. In this interview with Olivier Sibony, a director in McKinsey’s Brussels office, and Dan Lovallo, a professor at the University of Sydney and an adviser to McKinsey, Kahneman and Klein explore the power and perils of intuition for senior executives.

(Daniel Kahneman is a Nobel laureate and a professor emeritus of psychology and public affairs at Princeton University’s Woodrow Wilson School. He is also a fellow at the Hebrew University of Jerusalem and a Gallup senior scientist.)

The Quarterly: In your recent American Psychologist article, you asked a question that should be interesting to just about all executives: “Under what conditions are the intuitions of professionals worthy of trust?” What’s your answer? When can executives trust their guts?

Gary Klein: It depends on what you mean by “trust.” If you mean, “My gut feeling is telling me this; therefore I can act on it and I don’t have to worry,” we say you should never trust your gut.

You need to take your gut feeling as an important data point, but then you have to consciously and deliberately evaluate it, to see if it makes sense in this context. You need strategies that help rule things out. That’s the opposite of saying, “This is what my gut is telling me; let me gather information to confirm it.”

Daniel Kahneman: There are some conditions where you have to trust your intuition. When you are under time pressure for a decision, you need to follow intuition. My general view, though, would be that you should not take your intuitions at face value. Overconfidence is a powerful source of illusions, primarily determined by the quality and coherence of the story that you can construct, not by its validity. If people can construct a simple and coherent story, they will feel confident regardless of how well grounded it is in reality.

The Quarterly: Is intuition more reliable under certain conditions?

Gary Klein: We identified two. First, there needs to be a certain structure to a situation, a certain predictability that allows you to have a basis for the intuition. If a situation is very, very turbulent, we say it has low validity, and there’s no basis for intuition. For example, you shouldn’t trust the judgments of stockbrokers picking individual stocks.

The second factor is whether decision makers have a chance to get feedback on their judgments, so that they can strengthen them and gain expertise. If those criteria aren’t met, then intuitions aren’t going to be trustworthy. Most corporate decisions aren’t going to meet the test of high validity. But they’re going to be way above the low-validity situations that we worry about. Many business intuitions and expertise are going to be valuable; they are telling you something useful, and you want to take advantage of them.

(Gary Klein is a cognitive psychologist and senior scientist at MacroCognition. He is the author of Sources of Power: How People Make Decisions, The Power of Intuition, and Streetlights and Shadows: Searching for the Keys to Adaptive Decision Making.)

Daniel Kahneman: This is an area of difference between Gary and me. I would be wary of experts’ intuition, except when they deal with something that they have dealt with a lot in the past. Surgeons, for example, do many operations of a given kind, and they learn what problems they’re going to encounter. But when problems are unique, or fairly unique, then I would be less trusting of intuition than Gary is. One of the problems with expertise is that people have it in some domains and not in others. So experts don’t know exactly where the boundaries of their expertise are.

The Quarterly: Many executives would argue that major strategic decisions, such as market entry, M&A, or R&D investments, take place in environments where their experience counts—what you might call high-validity environments. Are they right?

Gary Klein: None of those really involve high-validity environments, but there’s enough structure for executives to listen to their intuitions. I’d like to see a mental simulation that involves looking at ways each of the options could play out or imagining ways that they could go sour, as well as discovering why people are excited about them.

Daniel Kahneman: In strategic decisions, I’d be really concerned about overconfidence.

There are often entire aspects of the problem that you can’t see—for example, am I ignoring what competitors
