As a married person with kids, I often find myself in a position of trying to change someone's mind about something that they couldn't be any more wrong about. Taste in music, television shows, whether or not bowties should exist outside of the 19th century in any context other than the necks of Ph.D.s ... you get the idea. It's my cross to bear, as they say.
Maria Konnikova has written a piece at The New Yorker about the research of several academics who have set out to understand what it takes to change people's minds when they are empirically wrong about something.
What they're finding is that, surprise, surprise, facts and evidence are of little use in changing people's minds. Evidence to the contrary can even help to strengthen incorrect opinions:
They had followed a group of almost two thousand parents, all of whom had at least one child under the age of seventeen, to test a simple relationship: Could various pro-vaccination campaigns change parental attitudes toward vaccines? Each household received one of four messages: a leaflet from the Centers for Disease Control and Prevention stating that there had been no evidence linking the measles, mumps, and rubella (M.M.R.) vaccine and autism; a leaflet from the Vaccine Information Statement on the dangers of the diseases that the M.M.R. vaccine prevents; photographs of children who had suffered from the diseases; and a dramatic story from the Centers for Disease Control and Prevention about an infant who almost died of measles. A control group did not receive any information at all. The goal was to test whether facts, science, emotions, or stories could make people change their minds.
The result was dramatic: a whole lot of nothing. None of the interventions worked. The first leaflet—focussed on a lack of evidence connecting vaccines and autism—seemed to reduce misperceptions about the link, but it did nothing to affect intentions to vaccinate. It even decreased intent among parents who held the most negative attitudes toward vaccines, a phenomenon known as the backfire effect. The other two interventions fared even worse: the images of sick children increased the belief that vaccines cause autism, while the dramatic narrative somehow managed to increase beliefs about the dangers of vaccines. “It’s depressing,” Nyhan said. “We were definitely depressed,” he repeated, after a pause.
The researchers now believe that beliefs take root in our self-perception. They help us define ourselves. And just as we have a natural tendency to rationalize away people's criticisms of who we are as people, we are good at deflecting criticisms of our beliefs. So the more ideological a belief becomes, the harder it is to change:
The campaign against smoking is one of the most successful public-interest fact-checking operations in history. But, if smoking were just for Republicans or Democrats, change would have been far more unlikely. It’s only after ideology is put to the side that a message itself can change, so that it becomes decoupled from notions of self-perception.