Picture this. You’re at a restaurant that has two particularly popular dishes on the menu: spaghetti carbonara and seared scallops. You talk over your choices with the waiter before finally deciding that you’re really in the mood for some seafood and order the scallops. Fifteen minutes pass, and the waiter brings you your meal. Yum, carbonara! You were really looking forward to that pancetta. Wait, what happened? That kind of mind game might be more possible than it seems. Let us introduce you to choice blindness.
To be fair, we’re guessing that you would notice if a waiter brought you the wrong dish (although maybe you’d be too polite to mention it). But in a series of studies led by Peter Johansson from Lund University, researchers found that it was surprisingly easy to convince people that they had made a different decision than the one they had actually made. Ready to have your mind blown? Here’s how the first study worked.
Participants were shown two different faces and asked to decide which of the two was more attractive. Then they were shown their choices — but unbeknownst to them, the researchers would sometimes perform a sleight-of-hand card trick and swap the face they had actually chosen for the one they had rejected. You might think that would be an easy thing to notice, but participants commented on the switch less than a third of the time. Even more astonishingly, the switch influenced their subsequent choices. As the experiment went on, the participants were shown some of the same faces again and asked to compare them with other options. Those who had experienced the old switcharoo actually chose the face that had been swapped in more often than not, suggesting that they had genuinely been convinced that the face they didn’t choose was more attractive after all.
Judging a beauty contest is one thing, but surely this phenomenon can’t be replicated with deeply held political or moral beliefs, right? You probably see where this is going. In a 2012 study, the researchers recruited people they encountered randomly in a park and had them fill out a survey ranking how strongly they agreed or disagreed with certain moral claims, such as “It is more important for a society to promote the welfare of the citizens than to protect their personal integrity.” Moments later, the researchers asked them to review, discuss, and justify their position — even though, unbeknownst to the participants, the researchers had swapped their original positions for the opposites (in this case, “It is more important for a society to promote the personal integrity of the citizens than to protect their welfare”).
Once again, more than two-thirds of the participants failed to notice that such a switch had been made. Furthermore, the researchers made a point of asking each recruit questions like “So you don’t agree that [statement]?” or “So you do agree that [statement]?” to make sure they fully understood the new claim they were putting forward. It didn’t matter: People who were ambushed with a swapped-out set of political or moral beliefs were perfectly capable of justifying their response, despite the fact that moments earlier they had professed to believe something completely different. Go easy on yourself — your beliefs aren’t as iron-clad as you think they are.
Check out my related post: Do you have the long view?