In today’s fast-paced digital age, information becomes outdated rapidly, and people must constantly update their memories. But revising our understanding of the news we hear or the products we use isn’t always easy, even when holding onto falsehoods can have serious consequences.
A pharmaceutical company, for example, may present a testimonial about a consumer’s positive experience with a new medication, along with details about potential side effects and interactions with other drugs. If the company later announces that the drug is less effective than previously reported, many people will cling to the belief that the drug works, according to results from a new study. The findings are available online in the Journal of Consumer Psychology.
When people read or hear stories, they build mental models of events linked together in a cause-and-effect chain, and these models become embedded in their memories, says study author Anne Hamby, an assistant professor in the department of marketing at Boise State University in Idaho. Even if people later discover that one link in the chain is incorrect, it’s difficult for them to revise their memory of the story, because doing so would leave a gap in the chain. This is known as the continued influence effect.
Hamby and her colleagues tested whether the continued influence effect depended on whether a story included an explanation for its outcome or left that detail out. In one experiment, participants read about a man who was diagnosed with a disease and took a prescribed medication at night with a glass of lemonade. The drug did not work, and he returned to the doctor. Half of the participants read that the doctor explained why the drug had failed: the man needed to take the medication in the morning, because hormones released at night block the drug’s effectiveness. The other participants received no explanation for why the drug was ineffective.
At the end of the story, all of the participants were presented with another piece of information: citrus-based foods and drinks interfere with the absorption of the drug. Later, they learned that this information was false. The results showed that participants who had not received an explanation for the drug’s ineffectiveness had difficulty rejecting the false drug-citrus interaction. “This group had used the citrus interaction to explain why the drug didn’t work in the story, while the other group already had an explanation in mind,” Hamby says. “Once the first group inserted causal information into a mental model of the story, it was harder to remove it.”
Though it’s difficult to change an existing version of events, the researchers discovered that people are more willing to update their memories if something bad has happened to a character, such as a death or serious illness. “People are more motivated to do the mental work of updating the story if the change leads to a better outcome because the character’s well-being could be related to their own well-being,” she says.
Hamby hopes the findings will inform how companies and news organizations retract misinformation. “It may not work to simply send out a press release or make a public service announcement saying that information is incorrect,” she says. “In order to effectively change beliefs, we need to give consumers an alternative cause-and-effect explanation.” For example, rather than simply stating that previous studies linking autism to vaccines are false, it may be wiser to offer an alternative explanation of what causes autism.
She also hopes that the findings will promote tolerance of others. “We tend to assume that if something is factual, people should accept it,” Hamby says. “We are not always aware of how information gets lodged in our memories, and perhaps this understanding could reduce some of the partisan animosity in today’s political climate.”