And Why It Sometimes Doesn't

Introduction to the Getting it Wrong theme

Nothing works perfectly, not even evidence-based science. In fact, anything humans do is bound to be infused with imperfection. This theme will look at all the ways we get it wrong, a sort of comedy of errors, from cheating to cognitive bias, and from failing to observe the rules of evidence to science denial.

Cognitive bias

A cognitive bias is when our reasoning breaks down, not necessarily because we lack the ability to reason effectively, but because the way we process and evaluate information has natural flaws, for example in our perception. There are a surprisingly large number of these: Wikipedia lists nearly 200, although many are pretty obscure or controversial. The graphic below is difficult to read unless you can zoom in, but it gives an idea of just how big a subject this is. It should keep me busy for years.

The Cognitive Bias Codex. Source Wikipedia - Design JM3, John Manoogian III - Concept, Buster Benson

Two of these have already been published here. The first is on the causality bias. I've given it the Latin name 'Post Hoc Ergo Propter Hoc', which means 'After This, Therefore Because of This'. The second is on confirmation bias: how we seek evidence that supports a preconception and ignore or undermine evidence that doesn't.

The scientific method shows how scientific enquiry is a constantly refining and self-challenging process. Source - Wikimedia, Whatiguana

Violating the Rules of Evidence

There have been many examples of scientists ending up with egg on their faces because they didn't follow all the rules of evidence. The examples I will use illustrate the importance of particular rules by describing what happens when they are not followed. The list is rather long, but there are three I intend to write about first to set the series going.

In a study by the West Virginia University Real-world NOx emissions were found to exceed the US-EPA Tier2-Bin5 (at full useful life) standard by a factor of 15 to 35 for the VW Jetta and by a factor of 5 to 20 for the VW Passat. Source - Wikipedia, Jzh2074

Naughty Scientists

There are all sorts of reasons why some scientists cheat by fabricating or falsifying experimental results, planting false evidence or plagiarising other people's work. Examples include the Piltdown Man, Hwang Woo-Suk and his falsely claimed cloned human embryos, the faked Loch Ness Monster photograph, and the claimed link between the MMR vaccine and autism.

More recently there is the infamous fabrication of diesel engine emissions data. Some of these cases still cause controversy and will be well worth a closer look.

Science Denial

There are many reasons why people might find a scientific description of the world inconvenient to an acquired belief, and wish to find ways to undermine and then reject it. Science denial includes creationism, conspiracy theories, climate change denial, the anti-vaccine movement, and prejudices against groups of people with a common characteristic, like skin colour, sexuality or religion.

These are difficult subjects to tackle without upsetting people, but it's hard to talk about how knowledge works without referring to the things that act as barriers to its acceptance. An underlying feature that will crop up many times is the way of reasoning that separates evidence-based thinking from ideological thinking. Evidence-based thinking starts with the evidence, and the description of the world that emerges is what it is, not necessarily what we would like it to be. Conversely, ideological thinking starts with the idea and then seeks evidence to support it. A political philosophy, for example, would gather supporting evidence for its position while ignoring the evidence that refutes it.

The Matilda Effect was coined by science historian Margaret W. Rossiter in 1993. Source - Wikipedia, IlluScientia

Giving Credit Where Credit's Due

The most disgraceful example of denial from within science is the extent to which women scientists have been mistreated and denied recognition for their work, in many cases the recognition going to a male colleague. I have already written about Henrietta Swan Leavitt, who discovered the important characteristic of variable stars that launched our senses beyond our own galaxy. See 'Cosmic Staircase: Cepheid Variables' on this site. There are three more illustrated left. The phenomenon is called the Matilda Effect and I will be writing a series of articles about it in the near future.

And so on

This is just a starter list of categories for how things go wrong in the pursuit of knowledge. I'm sure there will be others.

Roger Mould