Lead Facilitator Richard Brown reviews “Being Wrong: Adventures in the Margin of Error” by Kathryn Schulz:
Being “right” is one of our cheapest satisfactions, but this book is about what happens when our beliefs and theories, including our most fundamental ones, fail us.
“I told you so” – these words are either extremely satisfying or horribly frustrating, depending on whether you are saying them or hearing them!
Why do we go through life believing we are right – about whether there’s enough milk back home in the fridge, the whereabouts of the car keys, or even WMDs in Iraq?
Changing one’s mind or direction is seen as a sign of weakness in a politician or government (remember “this lady’s not for turning”?), even when it is obvious that the times have changed or the tide has turned. A common phrase we hear these days from politicians is “just plain wrong”.
Through a mixture of true stories, lessons from history, witty observation, philosophical discourse and psychological reference, Kathryn Schulz makes her book entertaining but extremely thought provoking. Students and practitioners of NLP will find lots of resonance with its language.
One story tells of a journalist covering a 1972 conference on environmental issues, who reported that a key speaker with grim predictions of the future still had enough hope to be pregnant – and put the story on his front page, only to find she wasn’t pregnant. Surely that is only a bigger clanger than some of the things we have all done.
Kathryn Schulz describes three assumptions that help us deal with challenges to our belief systems:
- Ignorance Assumption (our belief is based on facts, so others just are ignorant of the right facts)
- Idiocy Assumption (OK, others have facts, but are too stupid to understand them)
- Evil Assumption (those who disagree are not ignorant of the truth, but have turned their backs on it) – I really sat up at this point because it sounds so much like Climate Change Denial!
She has some entertaining shorthand for the reasons we give – the “Wrong, But” strategy, the “’Cos It’s True” constraint – and some quite astonishing statistics. In the US, the number of deaths from medical mistakes is equivalent to the fatal crash of a full 747 every 3 days – the 8th most prevalent cause of death in the country!
Due to the prevalent litigious culture, the concealment of these errors is practised and even rewarded. However, she documents one hospital’s efforts to eliminate them, including publicly publishing its medical error data, to ensure that everyone in the hospital learned from the mistakes.
Imagine if we could change our thinking so that those four words – “I told you so” – were never uttered again. We excel at making “models” of our perceived world, but we are not so good at realising when we have done so. Then we weigh in with “confirmation bias” – giving more weight to evidence that confirms our beliefs than to evidence that challenges them.
In all areas of life, from that memory of the contents of the fridge, through basic beliefs and perceptions about our world around us, to global, macro-economic decisions, we stumble along making mistakes and usually not learning from them.
I found my thinking processes changed subtly whilst reading this book – not through an implementation of Murphy’s Law (“anything that can go wrong, will”), nor some sudden over-cautious approach to life’s uncertainties, but something more subtle.
Listen to the words you use, and hear used around you. Can you head off an “I told you so” moment?
For more information on the above or any related area of interest in organisation development and learning & development in Edinburgh, Scotland and across the UK, please contact Sandy Smith on 0131 333 0066 and firstname.lastname@example.org.
Alternatively, make an enquiry.