How to Persuade Stubborn People

As anyone who has tried to educate people with facts knows, it is very difficult to persuade stubborn people.

New research sheds some light on this frustrating dynamic.

As NPR noted in July:

New research suggests that misinformed people rarely change their minds when presented with the facts — and often become even more attached to their beliefs.

***

A new body of research out of the University of Michigan suggests that's not what happens, that we base our opinions on beliefs and when presented with contradictory facts, we adhere to our original belief even more strongly.

The phenomenon is called backfire, and it plays an especially important role in how we shape and solidify our beliefs on immigration, the president's place of birth, welfare and other highly partisan issues.

***

It's threatening to us to admit that things we believe are wrong. And all of us, liberals and conservatives, you know, have some beliefs that aren't true, and when we find that out, you know, it's threatening to our beliefs and ourselves.

***

This isn't a question of education, necessarily, or sophistication. It's really about, it's really about preserving that belief that we initially held.

As I pointed out in March:

Psychologists and sociologists show us that people will rationalize what their leaders are doing, even when it makes no sense ....

Sociologists from four major research institutions investigated why so many Americans believed that Saddam Hussein was behind 9/11, years after it became obvious that Iraq had nothing to do with 9/11.

The researchers found, as described in an article in the journal Sociological Inquiry (and reprinted by Newsweek):

  • Many Americans felt an urgent need to seek justification for a war already in progress.
  • Rather than search rationally for information that either confirms or disconfirms a particular belief, people actually seek out information that confirms what they already believe.
  • "For the most part people completely ignore contrary information."
  • "The study demonstrates voters' ability to develop elaborate rationalizations based on faulty information."
  • People get deeply attached to their beliefs, and form emotional attachments that get wrapped up in their personal identity and sense of morality, irrespective of the facts of the matter.
  • "We refer to this as 'inferred justification,' because for these voters, the sheer fact that we were engaged in war led to a post-hoc search for a justification for that war."
  • "People were basically making up justifications for the fact that we were at war."
  • "They wanted to believe in the link [between 9/11 and Iraq] because it helped them make sense of a current reality. So voters' ability to develop elaborate rationalizations based on faulty information, whether we think that is good or bad for democratic practice, does at least demonstrate an impressive form of creativity."

An article yesterday in Alternet discussing the Sociological Inquiry study helps us understand that fear is the key to people's active participation in searching for excuses for the actions of the big boys:

Subjects were presented during one-on-one interviews with a newspaper clip of this Bush quote: "This administration never said that the 9/11 attacks were orchestrated between Saddam and al-Qaeda."

The Sept. 11 Commission, too, found no such link, the subjects were told.

"Well, I bet they say that the commission didn't have any proof of it," one subject responded, "but I guess we still can have our opinions and feel that way even though they say that."

Reasoned another: "Saddam, I can't judge if he did what he's being accused of, but if Bush thinks he did it, then he did it."

Others declined to engage the information at all. Most curious to the researchers were the respondents who reasoned that Saddam must have been connected to Sept. 11, because why else would the Bush Administration have gone to war in Iraq?

The desire to believe this was more powerful, according to the researchers, than any active campaign to plant the idea.

Such a campaign did exist in the run-up to the war...

He won't credit [politicians spouting misinformation] alone for the phenomenon, though.

"That kind of puts the idea out there, but what people then do with the idea ... " he said. "Our argument is that people aren't just empty vessels. You don't just sort of open up their brains and dump false information in and they regurgitate it. They're actually active processing cognitive agents"...

The alternate explanation raises queasy questions for the rest of society.

"I think we'd all like to believe that when people come across disconfirming evidence, what they tend to do is to update their opinions," said Andrew Perrin, an associate professor at UNC and another author of the study...

"The implications for how democracy works are quite profound, there's no question in my mind about that," Perrin said. "What it means is that we have to think about the emotional states in which citizens find themselves that then lead them to reason and deliberate in particular ways."

Evidence suggests people are more likely to pay attention to facts within certain emotional states and social situations. Some may never change their minds. For others, policy-makers could better identify those states, for example minimizing the fear that often clouds a person's ability to assess facts ...

The Alternet article links to a must-read interview with psychology professor Sheldon Solomon, who explains:

A large body of evidence shows that momentarily [raising fear of death], typically by asking people to think about themselves dying, intensifies people's strivings to protect and bolster aspects of their worldviews, and to bolster their self-esteem. The most common finding is that [fear of death] increases positive reactions to those who share cherished aspects of one's cultural worldview, and negative reactions toward those who violate cherished cultural values or are merely different.

Conservative Frank Luntz and liberal George Lakoff have used the principles of neuroscience to show that "framing" is more important than facts in persuading many people. This is an important subject to learn about if you want to become a more effective communicator.

In the meantime, however, there may be an easier shortcut for persuading stubborn people.

Specifically, start by asking the following question:

Do you want to defend your feelings and beliefs or do you want to know the truth?

Most people will respond by saying, "I want to know the truth, of course."

They will say that because they don't want to appear irrational, even if they usually are.

You can then start conveying facts, but stay sensitive throughout to their feelings of resistance to the challenging facts you're presenting, by saying things like:

"I found this hard to believe when I heard it, too"

"It's hard for everyone to change our minds"

"I know this is contrary to what we've been taught"

"I know it would be [painful or scary or infuriating or other adjective conveying a negative emotion] to believe that [the thing they don't want to hear about]"

And if they are resisting hearing the facts, gently remind them that they said they wanted to know the truth.

If you don't use these techniques, the stubborn person's automatic and unconscious processes will ensure that he or she clings to the old belief system no matter what you say. Remember, while you may be able to think logically, many people make most of their decisions based on emotions and faulty belief systems. Assuming that everyone uses the same decision-making process you do is the main impediment to moving beyond preaching to the choir and actually persuading others.

Some percentage of people will never believe the facts, no matter how you present them. Sometimes it is best just to drop it. But the techniques described above may work on a large percentage of stubborn people.