When a courtroom judge instructs jury members to ignore potentially relevant testimony as they deliberate, can they? Chicago Booth’s Berkeley J. Dietvorst and Ramon Llull University’s Uri Simonsohn find that people can often ignore information—if they choose to.
This argument challenges a host of psychological research suggesting that people can’t ignore information after they have been exposed to it. The idea of hindsight bias, for example, suggests that once people learn the answer to a question, they tend to think that they knew it all along—and they predict that it is also more obvious to others.
However, this body of past work doesn’t make clear whether people are unable to disregard what they’ve learned or simply choose not to. In prior experimental setups, researchers presented participants with information the researchers considered irrelevant, assuming that the participants would likewise recognize it as irrelevant and should therefore ignore it. If participants nevertheless used the information, the researchers inferred that they had been unwillingly affected by it.
Dietvorst and Simonsohn returned to some of those setups to reevaluate that premise. In their first experiment, they had participants complete a version of the seminal study on hindsight bias first conducted by Carnegie Mellon’s Baruch Fischhoff, in which people were given the answer to a multiple-choice question about a historical event, a conflict in Nepal between the Gurkhas and the British. (Gurkhas are Nepalese soldiers in the British or Indian armies.)
The researchers then asked participants to estimate how knowing the answer would affect their ability to predict how others who lacked this information would answer the same question. One might think that knowing the answer shouldn’t affect participants’ predictions at all—and that was a presumption made in the original research. However, the majority of participants, 73 percent, told Dietvorst and Simonsohn that knowing the answer would improve their accuracy, which suggests that people may attend to the information intentionally, violating the assumption of past work.
The next experiment built on this finding by asking participants how well two fictional people (Person A and Person B) would do at gauging other people’s accuracy in projecting a company’s earnings, as in the seminal work on the curse of knowledge. Person A had been given more information about the company than Person B had, and once again the original researchers had assumed the additional information to be irrelevant to the prediction at hand. Yet participants in Dietvorst and Simonsohn’s study overwhelmingly said that Person A would be better at estimating other people’s accuracy. The results again call into question a core premise of these earlier studies, according to the researchers.
In a third experiment, they examined how common the desire to use additional information is, finding that hindsight bias is largely driven by the subset of people who want to use that information. Participants were asked whether they would use a correct answer if given it. Some were then randomly assigned to learn the answer—and only those who had said they would use it displayed hindsight bias.
“People’s use of information may be a lot more intentional than past research made it out to be,” says Dietvorst.
But he and Simonsohn also find that while some people intentionally use information they’re asked to ignore, it’s possible to convince them to do otherwise. The researchers devised a mock trial, telling participants, who acted as jurors, that a piece of evidence had been ruled inadmissible because it was illegally obtained. The mock jurors were more likely to ignore the evidence when the reasoning behind its inadmissibility was explained—and the more detailed the explanation, the more likely they were to disregard it. People can often ignore learned information, the researchers argue, but do so only if they really want to.