“If you wish to strive for peace of mind and happiness, then believe; if you wish to be a disciple of truth, then inquire.” – Friedrich Nietzsche
A few years ago, for around $20, golfers in the US could buy a wand-like device called ‘The Gopher’, which, it was purported, could detect a golf ball lost in the rough.
After a while, a re-labelled version of the device appeared (‘The Quadro Tracker’); this time, however, it was being sold as a device capable of detecting drugs and explosives. That is, until the FBI declared it a fraud in 1996.
Despite that, in 2001, the device re-surfaced in the UK as ‘The Mole’, where it was marketed to government agencies who, after testing, concluded that the claims made for it were completely misleading and potentially very dangerous.
You’d like to think that would have been the end of it. You’d be wrong.
The device was again repackaged and renamed, this time marketed overseas for around £5,000 a time as the ‘ADE-651’ by Somerset-based businessman, James McCormick.
Iraq is said to have spent more than £53m on the devices between 2007 and 2010.
The device was also marketed internationally as the ‘Alpha 6’ by Samuel and Joan Tree, as well as the ‘GT200’ by Gary Bolton, who was said to have sold one device for £500,000.
These bogus devices were all completely useless pieces of equipment – essentially empty plastic cases with an aerial that would move back and forth according to the user’s unconscious hand movements, a phenomenon known as the ideomotor response.
Thankfully, the global scam eventually came to light and five people were sentenced to prison time by 2014. However, the story doesn’t end there.
Almost unbelievably, after more than 250 people were tragically killed by a car bomb in central Baghdad last week, the Iraqi Prime Minister ordered security services to stop using the fake bomb detectors at checkpoints.
It turns out that for the last nine years, the Iraqi forces have continued to use the devices to try to stop car bombs, despite numerous warnings that they are totally useless.
Indeed, according to The Guardian, since 2007, at least 4,000 people have been killed or maimed by bombs driven past police or soldiers using the device.
Even after this week’s order to stop using the device, The Guardian article reports that some security officials were still using them, suggesting:
“The reluctance to acknowledge them as useless was in part centred in having to acknowledge that there are few alternatives to keeping bombers away from Iraq’s towns and cities.”
With a senior ministry official even admitting:
“Sometimes it is better to pretend…To say that these don’t work says that we don’t have anything better.”
This is an example of wilful blindness on an industrial scale.
In her excellent book, Wilful Blindness, Margaret Heffernan describes being wilfully blind as when we (consciously or subconsciously) choose to ignore what we “could know, and should know, but don’t know because it makes us feel better not to know.”
This got me wondering. Whilst clearly nowhere near as tragic or consequential as the events in Iraq, in coaching and education, what might we be doing that we know doesn’t work, or that we have little to no evidence for, but continue to do anyway?
For example, we know talent ID at a young age is little more than a stab in the dark, yet it’s done anyway. We know learning styles have no evidence base, yet they’re included in lesson planning anyway. We know 10,000 hours of practice is not the key to becoming an elite athlete, yet it’s prescribed anyway. There is little evidence that homework benefits primary school kids, yet it gets assigned anyway.
This reminds me of Neo’s quandary in the classic sci-fi film, The Matrix.
Ultimately, is it just easier (even human nature) to take the blue pill, remain ignorant and crack on with current practice at the expense of some potentially harsh truths, hard thinking and acceptance of mistakes? Or, do we need to take the red pill and work much harder, at all levels of society, at being more like Bullshit Man?
Thought provokers and points for reflection:
- Are you doing anything you know (or might suspect) doesn’t actually work? Why?
- Can you think of any other examples of practice in your field we know doesn’t work, but is still common?
- Are we too focussed on trying to get the wrong things right?