How many times can a man turn his head,
and pretend that he just doesn’t see?
BOB DYLAN, “BLOWIN’ IN THE WIND”
During the first few nights after 9/11, I awoke abruptly with an image of the second airplane veering into the second tower. Even given the magnitude of the tragedy, this was strange for me. The stress of life seldom disrupts my rest; I normally sleep well and rarely remember my dreams. Now I was waking up with the same frightening image many nights in a row, and I couldn’t fall back to sleep—again, rare for me. So I gave up, headed to my home office in the wee hours, and thought about what social scientists like me know about what had just happened to the United States. After a few nights of this pattern, I had a vague notion that 9/11 should have been anticipated—and prevented. Here are the core pieces of evidence I jotted down during those early mornings:
• The U.S. government knew that terrorists were willing to become martyrs for their cause and that their hatred toward the United States was increasing.
• In 1993 terrorists had bombed the World Trade Center.
• In 1994 terrorists had hijacked an Air France plane and made an aborted attempt to turn the plane into a missile aimed at the Eiffel Tower.
• Also in 1994 terrorists had attempted to simultaneously hijack twelve U.S. commercial airplanes in Asia.
• Airline passengers know how easy it is to board an airplane with items, such as small knives, that can be used as weapons.
Soon after collecting these thoughts, I was having coffee with Michael Watkins, then my Harvard Business School colleague, and mentioned my analysis of 9/11. Michael asked me to follow him into his office, where he pulled out a file labeled “Predictable Surprises,” which became the title of our 2003 book. This work focused on how individuals and organizations can learn to recognize, prioritize, and mobilize action to avoid serious predictable surprises. In the chapter of our book that analyzed 9/11 as a predictable surprise, we anticipated the eventual conclusion of the 9/11 Commission: “The 9/11 attacks were a shock, but they should not have come as a surprise.”1
When Michael and I wrote Predictable Surprises, I was a well-known scholar and teacher in the decision-making field. I had written its leading textbook and generally thought that I made pretty good decisions in life. In 2013 I was chosen to be codirector with David Gergen of the Harvard Kennedy School Center for Public Leadership. A strong case can be made that, at its heart, leadership arises from effective decision making by individuals, teams, and organizations. That connection had long been on my mind but became all the more acute when writing about predictable surprises. I was beginning to realize that there was a serious gap in my understanding of human decision-making failures, a gap that also existed in the scientific and managerial literature on decision making. It was becoming increasingly clear to me that terrible things happen when our leaders fail to think about data that are outside their typical focus.
Two other episodes from my life drove home the truisms that all of us are prone to miss essential facts and that the benefits of widening our area of focus can be profound. First, in 2003 I attended a talk by another Harvard colleague, Mahzarin Banaji, where she showed a video—which you may have seen—made by psychologist Ulric Neisser in the 1970s. Before starting the eighteen-second video, Mahzarin told the audience that they would see two visually superimposed groups of three players passing a basketball. One trio wore white shirts, and the other trio wore dark shirts. Our task was to count the number of passes made by the trio wearing white shirts. The dual video, as well as the grainy nature of the film, made the task moderately complex. Before reading on, feel free to try to accurately count the passes among players wearing the white shirts in the Neisser video at http://www.people.hbs.edu/mbazerman/blindspots-ethics/neisser.html.
I counted the passes among the players with the white shirts, feeling confident. I am pretty good at focusing. When Mahzarin confirmed that the number of passes was eleven, the same number I had counted, I felt proud, mentally patting myself on the back. Then she asked the audience of a few hundred if they had seen anything unusual in the video. One woman in the back of the room mentioned “a woman with an umbrella,” who she claimed had walked in front of the players. The comment seemed truly bizarre, and I was even more surprised when a few others confirmed the woman’s account.
Mahzarin then replayed the video. Sure enough, there was a woman who clearly walked through the group of basketball players carrying an open umbrella. She is very easy to spot if you aren’t preoccupied with counting passes. (If you watched the video and don’t believe she was there, look again.) There are many variations of this video (in the most famous version, a person in a gorilla suit replaces the woman with the umbrella), and psychologists Chris Chabris and Dan Simons have even written a book entitled The Invisible Gorilla that features their fine work on the gorilla version of this task.
My failure to see the woman with the umbrella was common (somewhere between 79 and 97 percent of audience members do not see her) and now easily explained by the psychological literature, yet I still found it amazing. When I show this video in classrooms, my students, like me, focus on counting and generally miss this very obvious information in their visual world. Years after I saw the video for the first time, I remain obsessed by my failure to see the woman with the umbrella, and this obsession has organized my research and teaching over the past decade.
Of course, my success in life does not depend on seeing women with umbrellas in trick problems. A carefully developed ability to focus is more useful than not. Yet I wondered, is there a price to this focus? Beyond the realm of visual tricks, does focusing inhibit our ability to notice critical information? After we have learned to spot the umbrella or gorilla, isn’t there something more to be learned, namely the habit of spotting all (or at least more) of the metaphorical umbrellas and gorillas?
These questions lead to the third episode in my life that crystallized my thinking about noticing. In 2005 a Fortune 20 company hired me to create a course on decision making and negotiation in diplomatic contexts for the firm’s top seventy-five executives. The class was run in small groups, about fifteen executives per session. We built it around case studies of specific challenges my client had faced involving complex negotiations in the recent past. In the hour before the start of the first session, I was introduced to three distinguished-looking individuals who were referred to as my “special advisers”; I was told that they had expertise I could draw on during the class. I was confused, so I asked one of the senior staff members who had been involved in creating the course with me to explain what was going on. I learned that two of the three advisers were former ambassadors who had served in the country where the corporation was located and in the countries represented in the case studies that we would be analyzing. The third was an extremely high-level former intelligence official. I remember thinking that this would have been good information to know before the class was about to begin.
Making matters more complex, during class the three diplomats seemed to feel quite free to interrupt me on a regular basis. Even worse, their comments didn’t have much to do with where the class was headed, at least according to my plan. To be frank, my initial reaction was irritation. But as the first half-day of the program progressed, I began to develop a deep appreciation of their comments. They did make sense, I realized, and they offered unique insights. What made their comments unique was that they tended to lie not only outside the focus of the corporate executives but also outside my focus. These diplomats thought outside the box, systematically removing the blinders that confronted the rest of us. Consistently the executives and I were thinking one step ahead of the problem at hand and doing a fine job of working through the data that we defined as relevant. Meanwhile the diplomats were thinking three or four steps ahead and, in the process, including more diverse data for consideration and developing interesting and important insights. They tended to think intuitively about how the results of negotiations with one country would affect the decisions and behavior of neighboring countries.
Recalling my failure to see the woman with the umbrella, I realized that I was very good at working with the data in front of me, but not so good at noticing additional information that would allow me to better achieve my real objectives at work and in other spheres of my life. I finally came to realize that the diplomats were capable of expanding their awareness beyond common bounds—a skill that might benefit all of us, particularly those charged with leading others to decisions and actions. In the process of teaching this firm’s executives, I developed an appreciation of a new and different question for my research: Are we capable of developing skills that can overcome the natural bounds of human awareness? The answer, which I explain in this book, is yes.
As these episodes suggest, this book is rooted in my own experience. It is about the failure to notice: a failure that leads to poor personal decisions, organizational crises, and societal disasters. The Power of Noticing details each of these, highlighting recent research developments in our awareness of information that people commonly ignore. Generalizing from my own experience and my research over the past dozen years, I have created a blueprint that can help all of us notice critical information that we otherwise too easily ignore.
In his best-selling book from 2011, Thinking, Fast and Slow, Nobel laureate Daniel Kahneman discusses Stanovich and West’s distinction between System 1 and System 2 thinking.2 System 1 is our intuitive system: it is quick, automatic, effortless, implicit, and emotional. Most of our decisions occur in System 1. By contrast, System 2 thinking is slower and more conscious, effortful, explicit, and logical. My colleague Dolly Chugh of New York University notes that the frantic pace of managerial life requires that executives typically rely on System 1 thinking. Readers of this book doubtless are busy people who depend on System 1 when making many decisions. Unfortunately we are generally more affected by biases that restrict our awareness when we rely on System 1 thinking than when we use System 2 thinking.
Noticing important information in contexts where many people do not is generally a System 2 process. Similarly the nature of the logic that game theory encourages is System 2 logic. It requires that we step back and analyze the situation, think one or more steps ahead, and imagine how others will respond to our decisions—processes that our System 1 intuition typically fails to do adequately. Thus System 2 thinking and game theory are broadly compatible with noticing. The Power of Noticing will help you rely more often on System 2 thinking when making important judgments and decisions. When you do so, you will find yourself noticing more pertinent information from your environment than you would have otherwise. Noticing what is not immediately in front of you is often counterintuitive and the province of System 2. Here, then, is the purpose and promise of this book: your broadened perspective as a result of System 2 thinking will guide you toward more effective decisions and fewer disappointments.
THE BROADER ARGUMENT: OUR FAILURE TO NOTICE
The role of noticing is deeply rooted in the rapidly evolving field of behavioral decision research, now popularized through such acclaimed books as Nudge; Thinking, Fast and Slow; Predictably Irrational; and others. It has diffused to a number of other fields, including behavioral economics, behavioral finance, behavioral marketing, negotiation, and behavioral law. The field is rooted in Herbert Simon’s concept of “bounded rationality” and in Daniel Kahneman and Amos Tversky’s work on the systematic and predictable biases that affect even the best and brightest human beings. (Simon’s work helped earn him the 1978 Nobel Prize in Economics; Kahneman received the same prize in 2002 and would have shared it with his research partner had Tversky lived.) Essentially Kahneman and Tversky created a revolution against the standard economic model, which historically assumed that humans were perfectly rational.
This literature is the foundation upon which I have based my own work over the past thirty years. I have taught decision-making courses at the Kellogg Graduate School of Management at Northwestern University and the Harvard Business School, and I am partially responsible for bringing the perspective of behavioral decision research to negotiation and to the area of behavioral ethics. Yet the concept of bounded rationality and the influential field of behavioral economics have largely defined problems according to how we misuse information that is right in front of us. By contrast, noticing concerns our bounded awareness, or the systematic and predictable ways we fail to see or seek out critical information readily available in our environment.
In Thinking, Fast and Slow, Kahneman does touch on the issue of noticing, explaining that people jump to conclusions based on limited information. He introduces the acronym WYSIATI to describe decision making that is based on the faulty assumption that “what you see is all there is.” The Power of Noticing addresses this limitation in human thinking, identifies what information we do not see or notice, and describes how we can use this knowledge to seek the information that will be most useful for making great decisions. While I agree with Kahneman’s description of how humans act, I want leaders to realize that “what you see is not all there is” (WYSINATI) and to identify when and how to obtain the missing information.
The need to overcome this limitation is everywhere evident. A slew of recent crises occurred not because people misused information but because everyone, and most crucially the leaders charged with solving or preventing problems, missed often readily available information:
• Many people failed to notice that obvious data suggested it was too cold to safely launch the Challenger space shuttle.
• Many overlooked the fact that Enron’s financial reports were fraudulent.
• Many did not recognize that Bernard Madoff’s claimed investment returns were impossible.
• Many at Penn State University turned a blind eye to the abuse suffered by children under their watch.
• Few foresaw that the U.S. housing market could trigger a global financial crisis.
These crises can be explained by the common failure of even very smart people to notice important information.
The Power of Noticing explains many failures in contemporary society that cause us to wonder, How could that have happened? and Why didn’t I see that coming? I document a decade of research showing that even successful people fail to notice critical and readily available information in their environment due to the human tendency to wear blinders that focus us on a limited set of information. Seeing this additional information is essential to our success. In the future it will prove a defining quality of leadership. Moreover we don’t need to give up the benefits of focusing to notice additional critical information. This book will help you recognize when to seek more useful information and apply it to your decisions. It will provide you with the tools you need to open your eyes and truly notice for the first time—and for the rest of your life.
The Power of Noticing
What the Best Leaders See
Imagine your advantage in negotiations, decision making, and leadership if you could teach yourself to see, and evaluate, information that others overlook. The Power of Noticing provides the blueprint for accomplishing precisely that. Max Bazerman, an expert in the field of applied behavioral psychology, draws on three decades of research and his experience instructing Harvard Business School MBAs and corporate executives to teach you how to notice and act on information that may not be immediately obvious.
Drawing on a wealth of real-world examples, from the Challenger space shuttle disaster to Bernie Madoff’s Ponzi scheme, Bazerman diagnoses what information was ignored in these situations, and why. Using many of the same case studies and thought experiments designed for his executive MBA classes, he challenges readers to explore their cognitive blind spots, identify the salient details they are programmed to miss, and then take steps to ensure they won’t miss them again. While many bestselling business books have explained how susceptible to manipulation our irrational cognitive blind spots make us, Bazerman helps you avoid the habits that lead to poor decisions and ineffective leadership in the first place. His book provides a step-by-step guide to breaking bad habits and spotting the hidden details that will change your decision-making and leadership skills for the better, teaching you to: pay attention to what didn’t happen; acknowledge self-interest; invent the third choice; and realize that what you see is not all there is.
With The Power of Noticing at your side, you can learn how to notice what others miss, make better decisions, and lead more successfully.