Unbiased ‘truth’ largely fiction

A question for my Republican friends: Would you still love George W. Bush if he were Bill Clinton?

Seriously. If it were Clinton who had invaded Iraq based on erroneous intelligence, Clinton whose decisions had led to the deaths of more than 1,200 troops, would you be foursquare behind him the way you are Bush? Would you still support him even as he is vilified by half the country and much of the world?

And for my Democratic friends: Would you have felt the same about Clinton had he been Bush?

Had it been Bush who had an affair with a White House intern, Bush who looked the nation in the eye and lied about it, would you have been so willing to forgive? Would you still have opposed removing him from office?

In other words, are you guilty of double standards and outright bias?

I’ll save you the trouble: Yes.

That’s not what our self-image says, of course. We like to see ourselves as principled types who sift the facts before forming an opinion. But for most of us, this is pure poppycock. We are perfectly willing to ignore any fact that contradicts what we believe.

Maybe you already knew this intuitively. Now you can know it to a scientific certainty.

For which you can thank Drew Westen. He’s a professor of psychology at Emory University and author of a new, still-unpublished study testing whether people make decisions based on bias or fact. Bias won hands down.

In a key scenario, respondents were led to believe a soldier was accused of torturing people at Abu Ghraib prison in Iraq. The fictional soldier claimed to have been following orders from superiors who told him the Geneva Convention had been suspended. He supposedly wanted to subpoena President Bush and Defense Secretary Donald Rumsfeld to prove his case. Respondents were asked if he should have that right.

Some were presented with strong “evidence” corroborating the soldier’s story. Others had only his word to go on.

But the strength or weakness of the evidence turned out to be immaterial. Researchers were able to predict people’s opinion over 80 percent of the time based simply on their opinions of the Bush administration, the GOP, the military and human rights groups. Those who had less affection for the president sided with the soldier even when the evidence was weak. And fans of the president tended to side with him even when the evidence was overwhelming.

We believe what we want, facts be damned.

“The scary thing,” says Westen, “is the extent to which you can imagine this influencing jury decisions, boardroom decisions, political decisions …”

I’m reminded of a colleague of mine who says we Americans increasingly seem to embrace separate “truths,” reflecting not objective reality, but political orientation. Some of us even get our news exclusively from those sources that affirm our truths. He calls it living in alternate realities.

It’s because of that separateness that there often seems to be no moral center or intellectual coherence to much of what passes for public discourse these days. Our principles are situational, our willingness to marshal critical thought goes off and on like a light switch. We’ll believe — or not believe — whatever it takes to win the argument. Winning it is all that matters.

And never mind that it’s perfectly possible to win the argument and still be wrong.

Westen laughed when I told him I found his study depressing. If there’s a silver lining, it’s the 15 or so percent of respondents who actually bothered to consider the evidence when forming their opinions. We’d all be well advised to follow their example.

Forget what you want to believe. Seek the truth and have the courage to believe that.