We like to think that we are reasonable and base our beliefs on the evidence. When we come across information which demonstrates that we should change our minds, we are open to doing so. Or are we? Matthew Syed gives us some food for thought on this in his highly acclaimed book, Black Box Thinking.
He directs us to consider an experiment conducted at Stanford University by Charles Lord. Two groups of volunteers were recruited with differing views on capital punishment. One group was adamantly in favour, seeing it as a genuine deterrent to crime. They felt strongly on the issue and were public supporters of the policy. The second group was firmly against it, horrified by the brutality of what they saw as “state-sanctioned murder”.
These groups were then shown two dossiers. Each was highly impressive and included well-researched evidence. The first dossier contained a collation of evidence making the case in favour of capital punishment. The second collated only evidence against it. You could be forgiven for thinking that this contradictory evidence might lead the two groups to conclude that capital punishment was a complex subject with strong arguments on both sides. You might have expected them, while not changing their minds on the issue, to have a little more sympathy with the views of those on the other side of the argument. In fact, the opposite happened: they became more polarised.
When later asked about their attitudes, those in favour of capital punishment said they were impressed with the dossier citing evidence in line with their views. The opposite conclusions were drawn by those against capital punishment. Ironically, from reading precisely the same material, they became even more entrenched in their positions.
What this experiment (and many others like it) demonstrates is the way we filter information when it challenges our strongly held views or convictions. We use a series of manoeuvres to reframe whatever is inconvenient to our original position. We question the truth of the evidence, or the credentials of the people who discovered it, or their motives, or find some other reason to discredit them. As more information emerges to challenge our standpoint, we search for new justifications in ever more creative ways, becoming still more entrenched in our prior view. This is a tendency called “cognitive dissonance”.
Syed cites a famous and worrying example of cognitive dissonance that took place in the lead-up to the Iraq War. On September 24, 2002, before the conflict, Tony Blair made a speech in which he emphatically stated: “Saddam Hussein’s Weapons of Mass Destruction [WMD] programme is active, detailed and growing … he has existing plans for the use of weapons, which could be activated in 45 minutes.”
This article appears in the Catholic Herald magazine.