When trying to change minds, we often want big change right away. We want a big raise now, we want detractors to immediately become supporters. We think that if we just give people more information, they'll come around. If we just share more evidence, list more reasons, or put together the right PowerPoint deck, people will switch. But just as often, this blows up in our faces. Rather than shifting their perspective, people dig in their heels. Rather than changing, they become even more convinced they're right. To understand why, we have to understand something called the confirmation bias.

A number of decades ago, researchers performed an interesting study around a football game. After watching a game between Princeton and Dartmouth, students from each school were asked several follow-up questions. It was a rough game: there were many penalties called on both teams, Princeton's star tailback ended up with a broken nose and a mild concussion, and Dartmouth's quarterback had a broken leg after being tackled in the backfield. Princeton won, tempers flared on both sides, and there were heated discussions about who was at fault. Yet when students were asked after the game how they saw it, their answers depended entirely on which side they supported. Princeton students thought it was Dartmouth's fault: Dartmouth started the rough play and committed twice as many penalties. Dartmouth students, on the other hand, thought that both sides had been rough and that Princeton was the one who'd caused the penalties. Exact same game, two very different perspectives, based entirely on the lens people saw that game through.

These biases even shape seemingly objective things like scientific research. Professors gave people information about two studies examining the efficacy of the death penalty. One study suggested that the death penalty worked as a deterrent. It compared murder rates from the year before and the year after the adoption of capital punishment in 14 states and found that in most of those states, murder rates were lower after the penalty was adopted. The other study showed roughly the opposite; its findings suggested the death penalty wasn't much of a deterrent. It compared murder rates in 10 pairs of neighboring states and found that in most of the pairs, murder rates were higher in the states that used capital punishment. So they gave these two studies, one supporting capital punishment and one against it, to a variety of participants, along with information about how the research was conducted, procedural details about the methods, and so on. Then those people were asked: which one is right? You've got two studies in front of you, how convincing do you find each one, and how good do you think the quality of the research is? Was each study well done or poorly conducted? While it makes sense that your team loyalties, which team you support, might affect how you see a game, we'd hope that responses to scientific research would be different. We'd hope they'd be more objective, particularly in such an important domain as the death penalty; when lives are on the line, people really should see the facts. But it turned out that how people perceived this seemingly objective scientific research depended entirely on their position. People who supported the death penalty thought the study that supported the death penalty was more convincing, and people who thought the death penalty was bad thought the exact opposite.
The same held for how they thought about the studies themselves. People who supported the death penalty thought the study suggesting it worked was well thought out and had gathered its data properly. Opponents thought the exact opposite; they thought the evidence was relatively meaningless without data comparing how crime rates changed overall across those years. People picked at different facts and interpreted them to fit their own way of seeing the world. The decision to accept the evidence or look for flaws in it depended on their existing beliefs. No wonder, then, that one person's truth is another's fake news. Whether information seems true or false depends a lot on our position, where we start from. So rather than uniting opposing sides, exposure to evidence sometimes just widens the gap. Taken together, this tendency to look for and process information in a way that confirms what we already think has been called the confirmation bias, and no one is immune. The bias shapes the treatments that doctors prescribe and the decisions that jurors make. It impacts the strategies that investors follow and the actions that leaders take. Even the directions research scientists pursue and what employees internalize are shaped by existing beliefs. As one famous psychologist noted, when we examine evidence relevant to a given belief, we're inclined to see it the way we want to see it. When it supports our conclusion, we go along with it. But when it doesn't, we ask ourselves, "Must I believe this?"

So it turns out that people have a region, or range, of beliefs they'll consider. They won't only consider where they are already; they'll consider some things around it. A good way to think about this is a football field. Imagine a football field broken up by hash marks along its length, with end zones on opposing sides. In politics, for example, one end zone might be very liberal and the other very conservative, and along the way are different degrees of being liberal or conservative. Extremely liberal people sit at one end, extremely conservative people at the other, and at midfield are people who are on the fence. A little to each side are people who are moderately, but not extremely, one way or the other. People's political views put them at a spot on the field, but around that spot are different zones. The range around the belief they already hold, the beliefs they're willing to consider, is called the zone of acceptance. It includes the viewpoint people agree with most, along with the range of other viewpoints they could see themselves potentially supporting. Beyond that safe area, though, is something called the region of rejection: the perspectives that people strongly disagree with or actively reject as wrong. Take someone whose views put them right at midfield, a moderate. That might be their current opinion, but they might be willing to consider things 10 yards in one direction or the other. That's their zone of acceptance, their zone of consideration; beyond that is the region of rejection. Someone else might be on the 20-yard line and be willing to consider everything from their own end zone all the way up to midfield. So people have different positions on the field, but also different zones of acceptance and regions of rejection. These different zones determine whether incoming information succeeds or fails.
Incoming information isn't just compared with one's existing view; whether it works depends on whether it falls in that zone or not. Even if it's not exactly where people already are, if it's close enough, within that zone of acceptance, the information works as intended. People change their minds and move a little in the desired direction. But if the information is too far away, in that region of rejection, it fails. Not only does it fail to persuade, it often backfires: people change their minds in the opposite direction and become even more certain of their initial views and beliefs. People are willing to consider different perspectives up to a certain point, but beyond that, things get ignored. These biases make changing minds all the more difficult. Not only do people have to be willing to change, they have to be willing to listen to information that might open them up to that possibility. So the question becomes: how do we change folks' minds? Given this challenge, how do we combat these biases? How can we avoid the region of rejection and encourage people to actually consider what we have to say? Well, we'll talk about three ways to mitigate distance: finding the movable middle, asking for less, and switching the field to find an unsticking point. Each of these ideas relies on the same core notion. When we ask for too much, when we start too far from where people are, they won't listen. But if we ask for something close enough, not too far away, they'll be more willing to move, and we can use that to move them multiple times.