
At the time of writing, the UK is on the cusp of a historic moment as the people of Scotland decide whether Scotland should be an independent country.

As can be expected when opposing viewpoints square off, there is a lot of discussion.  Debate.  Arguments.  It reminds me of the tribalism in the nutrition world: paleo vs. vegan.  Clean eating vs. moderation.  Quality vs. quantity.  What to do?

Surely the goal is to do what’s best for us (or our community) in a given context.  Surely the goal is, therefore, to weigh up the information and available evidence and make a rational, objective decision?

The problem is, we humans aren’t as rational as we tend to believe.  We pick sides, join tribes, and let our emotions cloud our judgement.  We’re also prone to making some basic errors in our thinking.  So, while it’s not for me to tell you what to think, there is value in considering the quality of your thought.

To do that, let’s consider some common biases and flaws in our arguments.


A better way to argue



Confirmation bias

Confirmation bias is when we interpret information in a way that backs up our pre-existing beliefs.  We pick sides and then, if conflicting information comes our way, we’ll likely swat it away without consideration.  …That’s if we even see it.  ‘Theory-induced blindness’ suggests that “once you have accepted a theory and used it as a tool in your thinking, it is extraordinarily difficult to notice its flaws.”[i]

So rather than question our theory, we quickly reject the information or observation that competes with it.

Example: A ‘red meat is bad’ study comes out.  The vegetarians embrace it unquestioningly and go share-crazy.  The paleo crowd scrutinise it to death.


Cherry picking

Cherry picking is similar to confirmation bias and is rife in health and fitness (and probably in political debates too).  It’s where we select only the studies or information that back up our belief and ignore any evidence to the contrary.

Example: Focusing on a study that suggests low-carb diets are best for weight loss, while ignoring the several other studies that suggest otherwise.


In-group bias

When we surround ourselves with people who all share the same belief, it creates an illusion of consensus that shelters us from different ideas.

Example: When I followed the paleo diet, I read paleo books, visited paleo websites and listened to paleo podcasts.  This gave the impression that all the experts supported the paleo diet.  In reality, it was an in-group bias at work: plenty of people believe differently.


Ad hominem

This is when a claim or argument is rejected based upon some irrelevant fact about the person making the claim.  Rather than contest the argument, we simply attack the person.

Example: Someone presents evidence to suggest genetically modified foods are safe.  Rather than objectively analyse the evidence, attackers accuse the presenter of being a shill for the food tech companies.


Straw man

A ‘straw man’ is when we misrepresent a person’s argument to make it easier for us to tear down.

Example: Me: “Sugar is not inherently fattening.”  Debater: “So what you’re saying is I can eat sweets all day long and not get fat?  That’s absurd!”


Overconfidence & the hindsight bias

Because of the hindsight bias, unpredictable events seem like they were inevitable – we knew it all along.  In reality, the world isn’t as predictable as we like to think.  And we fail to recognise that there are limits to the expertise of those who make predictions – despite their confidence.

In Thinking, Fast and Slow, Nobel Prize-winning psychologist Daniel Kahneman argues that expert judgement is notoriously unreliable.  Simply put, “Overconfidence arises because people are often blind to their own blindness.”

Example: My nutrition beliefs have changed considerably over time.  Yet, at any given moment, I always felt highly confident that my philosophy at that time was the ‘right’ philosophy.  I had no doubt it would get me where I wanted to go.  Looking back, my mistakes seem obvious, even though I was blind to them at the time.


A point on debating with others

When you see these logical fallacies creep into an argument, by all means expose them.  This can help people make more rational, objective decisions.

However, forcing an opinion on someone is unlikely to get them to change their position.  Theory-induced blindness suggests it’s more likely to encourage them to dig in their heels.  This is where the bickering starts and debate stops being helpful.

“Respect is like air.  As long as it’s present, nobody thinks about it.  But if you take it away, it’s all that people can think about.  The instant people perceive disrespect in a conversation, the interaction is no longer about the original purpose – it is now about defending dignity.”[ii]

So, rather than attempting to see things for what they are, we let tribalism take over and it becomes about trying to ‘win’.

…All the while, we forget what’s really important.


BIG IDEA: We’re not as rational as we like to think.  How we see the world and the decisions we make are heavily influenced by cognitive biases, illusions and errors of thought.

TAKEAWAY: Simply trying to ‘win’ an argument will probably cloud your judgement and strain your relationships.  Although they can be hard to recognise in ourselves, an appreciation of these logical fallacies may help you make better judgements in debates with yourself, and with others.

ACTION STEP: For further reading, check out the books listed in the references below.

Always Keep Reaching!

Mike



References

[i] Daniel Kahneman, Thinking, Fast and Slow

[ii] Kerry Patterson et al., Crucial Conversations