So far we've talked about how to reduce reactance, ease endowment, shrink distance, and alleviate uncertainty. Now we'll talk about the last barrier we have to mitigate, and that's corroborating evidence. How much do you like the word juvalamu? What about the word chakaka? You might like juvalamu more, most people do, or you might prefer the second one. But more importantly, you probably don't care that much about either. And opinions toward these nonsense words are examples of what are called weak attitudes: preferences or opinions that people don't find very important, that haven't received much thought, and that are relatively easy to change. If I told you that juvalamu, that word I gave you earlier, was the name of a dictator who murdered his political enemies, you probably wouldn't like that word anymore. That one piece of information would be enough to change your view. How do you feel about pine trees, prime numbers, serif versus sans serif fonts? For most people, these are examples of weakly held attitudes. You have an opinion, but it's not that important to you, and it's relatively easy to change. Contrast that with how you feel about different political parties, your favorite sports team, your favorite brand of beer, or abortion. These are examples of strong attitudes, high-involvement issues: topics or preferences you've thought a lot about and hold with great moral conviction. Not surprisingly, strong attitudes are much more resistant to change. Imagine an article suggests your favorite celebrity said something racist. What's your first reaction? Probably one of disbelief or denial. There's no way that person could be racist, right? Unlike hearing that juvalamu was a dictator, our anti-persuasion radar rushes to protect our strong beliefs. Rather than giving up or changing our mind, we discount information that goes against our existing views, picking it apart rather than revising our perspective.
Just like a really bad headache needs stronger medicine, some issues, products, and behaviors need more before people will change; more proof or more evidence is required. One recommendation is enough to get you to check out a new website. But what about putting solar panels on your house or ordering groceries online? Probably not. For stronger attitudes, there's a higher threshold for changing minds. More is needed: more information, more texture, more certainty, more proof before people will switch. Changing minds is a little bit like trying to lift something on the other end of a seesaw. How much weight, or proof, you need depends on how heavy the thing is you're trying to move. If you're trying to lift a pebble, you don't need that much. A little bit of evidence is enough to move it right away, and change happens. But if you're trying to move a boulder, much more effort is needed; more proof is required before people will change. The question then becomes, well, how can we provide more proof? When faced with a boulder, the most common response is to turn up the juice, right? Try a little harder, just push people a little more to convince them that a course of action is the right way to go. As the proverb says, if at first you don't succeed, try, try again. Spouse not interested in the more expensive vacation package? Try a different appeal. Client still wavering on whether to make an order? Call again in a week. And indeed, sometimes following up works. Sometimes you pitch people again and they change their mind. You give them a bit of different information and they change their perspective. But in most cases it fails. In most cases a variation on the pitch doesn't work, because people know that you're trying to change their mind. But there's also another reason that trying again doesn't work, another reason why saying something different a second time doesn't change minds, and that's what's called the translation problem.
Imagine someone comes into the office Monday morning and tells you they watched an amazing show over the weekend. The dialogue is sharp, the plot is gripping, and the acting is superb. They just love that show and they think you'll like it too. In a sense, they've just added some weight to that other end of your seesaw. And depending on the threshold for change, or how strongly you feel about television shows, that evidence is either enough or not enough to get you to change. But imagine it's not enough. Imagine your preferences are more like a boulder; nothing changes. Now Thursday rolls around, they've watched another episode, and they continue to be enthusiastic. They tell you how great that second episode is. In some sense, them telling you again has added a little bit more proof, but in a different sense, knowing they like the second episode actually doesn't provide that much additional information. Because when someone endorses or recommends something, there's always what's called a translation problem, a puzzle. It could mean that the show is really great, but it could also just mean that they like lots of shows. Does the fact that they said something say something about them, or does it say something about the thing being recommended? And even more importantly, they might recommend it, but does that mean I'll like it? How informative is their reaction about my own? Now if they were another you, we wouldn't have this problem. If another you liked a television show, you'd probably like it too. If another you liked a certain service, you'd probably like it as well. If there were a perfect doppelganger who was exactly like you, knowing that they liked something would give you a lot of evidence that you would like it as well. But in the absence of such a perfect doppelganger, we have to make inferences. If that person liked it, how much information does that provide about whether I would like it or not?
And so if one person suggests or does something, it's hard to translate. It's hard to know if their opinion is diagnostic of ours, what their reaction means for our own. But if multiple sources say or do the same thing, it's harder not to listen, because now there's corroborating evidence, reinforcement; multiple sources concur. They have the same view, response, or preference, and this consistency means it's much more likely that you'll feel the same way. It's easy to discount one person, but it's hard to discount a chorus. One doctor prescribing a new treatment? Well, maybe a sales rep came by, or maybe they have a certain type of patient. But multiple doctors prescribing exactly the same thing suggests that, if you're a doctor, it might be worth taking a look. Because if multiple people are saying or doing the same thing, it's hard to argue that they're wrong. It's hard to argue that the thing they're suggesting isn't any good. It adds credibility and legitimacy. One person, maybe they have quirky tastes. But two people, five people, ten people? The more sources that are speaking in concert, the more corroborating evidence it provides. As the old adage goes, if one person says you have a tail, you laugh and think they're crazy. But if three people say it, if five people say it, well, then you turn around and take a look. More sources, more people saying or doing the same thing, can provide more proof. But who those sources are and when they share their perspective plays an important role. In particular, when finding corroborating evidence it's important to consider who, when, and how: who else to involve, or which sources are most impactful; when to space corroborating evidence out over time; and how to best deploy scarce resources when trying to change minds on a larger scale.