Weighing Evidence

A most interesting article has come to light about the unwillingness (or inability) of persons like you and me to weigh evidence fairly when it touches on an issue we feel strongly about. In fact, recent studies have shown that presenting a balanced perspective to people who hold strong feelings about such things as capital punishment simply makes them cling all the more strongly to their original point of view. Consider the following two paragraphs, which address the question of whether presenting a balanced argument to people deeply committed to a particular point of view will help them change their minds:

The remedy for easing such polarization, here and abroad, may seem straightforward: provide balanced information to people of all sides. Surely, we might speculate, such information will correct falsehoods and promote mutual understanding. This, of course, has been a hope of countless dedicated journalists and public officials.

Unfortunately, evidence suggests that balanced presentations — in which competing arguments or positions are laid out side by side — may not help. At least when people begin with firmly held convictions, such an approach is likely to increase polarization rather than reduce it.

This is disturbing. What it amounts to is “don’t confuse me with the facts, my mind is made up!” And I gather we all succumb to this intransigent position on most issues we hold dear.

What we do, apparently, is weigh the evidence that supports our own conviction more heavily than we do conflicting evidence, which we tend to dismiss. So much for John Stuart Mill’s notion that an intelligent person will attempt to see both sides of an issue before making up his or her mind. If we already lean in one direction or the other on an issue (and who does not?), we will simply find the evidence that supports our point of view compelling and the evidence on the other side weak and unconvincing, even if an outside observer might insist that what we regard as the weaker evidence is in fact the stronger.

As a person who has spent his life trying to help young people gain possession of their own minds, to become thinking human beings rather than performing robots, I find this article disturbing. But please note that my deeply held conviction that people can learn to be reasonable is being shaken by an argument I am not comfortable with, and yet I see the strength of that argument in spite of the fact that it calls into question everything I have taught for nearly 50 years. Isn’t this in itself an argument against the conclusions of the study examined in the piece for the New York Times? An interesting paradox!

In any event, the article goes on to tell us that the only way we can really change a person’s mind is to have someone they respect (say, someone they identify closely with, or someone whose opinions they have always revered) evidence a radical alteration of opinion. If, for example, I revere George Will and read that he has decided that the Republican party no longer stands for the ideals and values he holds close to his heart, and that he has chosen to become a Democrat and vote for Obama, then I am likely to change my mind as well. I was always told that this was an appeal to authority and that it is a fallacious way to reason. But apparently it works. This would mean, if it is true, that reason is a slave to the passions, as David Hume told us more than two hundred years ago. And he had no psychological tests to resort to. He just found it to be the case.

But then there’s that nagging fact hanging out there: I find the study summarized in the above article convincing even though its conclusions conflict with my most deeply held beliefs. I am not aware of anyone I admire who has changed his mind about this question, yet I find myself increasingly inclined toward a disturbing point of view. If evidence alone can move me in this way, without the push of a revered authority, then perhaps the conclusions of this study are a bit less disturbing than they first appear.