The Covid-19 pandemic has made it brutally clear that politicians struggle to understand simple statistics. But they are not alone – too many businesspeople are also ignorant of basic statistical methods, or accept the integrity of sources unquestioningly.
Such misunderstandings can easily damage profits or growth prospects in your company, given the increasing reliance on complex data analysis in everyday business. But they are easy to address.
I will explain how – but, first, let’s look at the problem with Covid statistics. Some people – including respected scientists – now believe politicians could be jeopardising our economic future because of a failure to understand the straightforward statistical problem of false positives in medical testing.
The UK government has increased the number of Covid tests to over 240,000 a day, according to some reports.
Researchers in Geneva have found that the false positive rate for coronavirus tests in current use is 4%. Many people will think: ‘96% doesn’t sound too bad, we can live with that accuracy.’
But a 4% false-positive rate does not mean 4% of positive tests are false. At the current testing rate, it would actually mean around 9,600 people a day (4% of 240,000) being wrongly identified as having Covid. It cannot be that high currently, given that the number of daily new cases is under 5,000 at the time of writing.
But, even at, say, 2%, the false positive problem would mean a large majority of those new cases would be identified incorrectly. Indeed, lockdown sceptics believe around 90% of new cases are false alarms.
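The sceptics' back-of-envelope arithmetic can be sketched in a few lines of Python. The 240,000 daily tests come from the figures above; the 2% false-positive rate and the assumption that almost everyone tested is uninfected are the sceptics' working assumptions, not established facts:

```python
# Sceptics' back-of-envelope arithmetic (illustrative assumptions, not official data)
tests_per_day = 240_000        # reported daily testing volume
reported_cases = 5_000         # approximate daily 'new cases' at the time
false_positive_rate = 0.02     # the sceptics' assumed rate

# If almost everyone tested is uninfected, false positives are roughly
# the false-positive rate applied to all tests.
false_positives = false_positive_rate * tests_per_day
share_false = false_positives / reported_cases

print(f"Expected false positives per day: {false_positives:,.0f}")  # 4,800
print(f"As a share of reported cases: {share_false:.0%}")           # 96%
```

On these assumptions, almost all reported positives could be false alarms – but the conclusion is only as strong as the assumed false-positive rate.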
Others take a different view – here are some of the latest counterarguments. The government has said it thinks the false positive rate is less than 1.6% and even the World Health Organisation says countries should ‘test, test, test’.
But whether you go along with the sceptics or not, the uncertainty around this figure of ‘new cases’ is clear. Yet it is one of the prime metrics driving government policy towards closing parts of the economy again, with potentially severe economic consequences.
A famous experiment showing how doctors misunderstood false positives in cancer screening suggests that policymakers should take this problem seriously.
David Eddy studied routine screening for breast cancer among women aged 40. The cancer had an actual prevalence in that population of 1%, but the mammogram screening test had a false positive rate of 10% and a false negative rate of 20%. Eddy asked doctors: what is the probability that a woman with a positive test actually has cancer?
95% of doctors estimated between 70% and 80%, but the correct answer is only 7.5%. They failed to see that if 10% of the cancer-free majority of 40-year-old women test positive incorrectly, those false alarms vastly outnumber the genuine cases.
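Eddy's result follows directly from Bayes' theorem. This short Python sketch works through the figures quoted above:

```python
# Bayes' theorem applied to Eddy's mammogram example
prevalence = 0.01           # 1% of 40-year-old women have the cancer
sensitivity = 0.80          # a 20% false-negative rate means 80% of cancers are caught
false_positive_rate = 0.10  # 10% of cancer-free women test positive

true_positives = prevalence * sensitivity                 # 0.008 of the population
false_positives = (1 - prevalence) * false_positive_rate  # 0.099 of the population

p_cancer_given_positive = true_positives / (true_positives + false_positives)
print(f"{p_cancer_given_positive:.1%}")  # 7.5%
```

The false positives (9.9% of the population) swamp the true positives (0.8%), which is why a positive result is still far more likely to be a false alarm than a diagnosis.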
Many people have difficulty understanding much simpler statistical methods than this as well. John Allen Paulos demonstrated this eloquently in his book 'Innumeracy: Mathematical Illiteracy and its Consequences'.
One simple example in the book is the black-red card game. Imagine a man with three cards – one black on both sides; one red on both sides; and one black on one side and red on the other. He asks you to pick one, but only to look at one side. It’s red.
The man then offers you an even-money bet that yours is the red-red card – if it is, he wins. Since only two of the three cards have any red on them, the bet might seem fair: surely it is 50:50 which of the two you hold. But conditional probability says otherwise. Those two cards carry three red sides between them, each equally likely to be the one you saw, and two of the three belong to the red-red card – so he wins two times out of three.
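The two-in-three claim is easy to check by simulation. This sketch deals the three cards many times, looks at a random side, and counts how often the hidden side is also red whenever the visible side is red:

```python
import random

# The three cards: each tuple is (side 1, side 2)
cards = [("black", "black"), ("red", "red"), ("black", "red")]

rng = random.Random(0)  # fixed seed so the run is reproducible
red_seen = red_red = 0

for _ in range(100_000):
    card = rng.choice(cards)      # pick a card at random
    side = rng.randrange(2)       # look at one side at random
    if card[side] == "red":
        red_seen += 1
        if card[1 - side] == "red":
            red_red += 1

print(red_red / red_seen)  # close to 2/3
```

The ratio settles near 0.667, confirming that conditioning on the side you saw – not just the card you might hold – is what determines the odds.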
Worse than chimps
Another problem with statistics is that people often ignore them completely – preferring to rely on preconceptions and gut instinct – sometimes with dire consequences.
In his brilliant book Factfulness, Hans Rosling showed repeatedly that educated people, when asked questions such as how many girls in low-income countries finish primary school, performed worse than ‘chimps’ answering randomly with zero understanding.
The participants preferred to rely on their preconceptions that everything in the West is better and conditions in low-income countries are bad and will never catch up, though this is demonstrably untrue.
Capital City Training’s financial course, ‘Data statistics in business’, shows you how to avoid many of these errors and how to interpret, apply and present data effectively. It gives you the knowledge, guidance, tools and techniques to turn your data into compelling arguments; and enables you to spot arguments based on cognitive biases, or thin or flawed evidence.
Since big data has become such a huge part of business life, learning how to use and interpret numbers is increasingly important – yet everywhere, people are misinterpreting it. Our data statistics course looks at how to avoid mistakes, including in data sourcing, collection and integrity. It also looks at the potential impact of cognitive biases on business results, and shows you how to spot and avoid them.
These biases include:
- survivorship – concentrating only on data from people or things that have survived and ignoring failures
- confirmation – filtering or interpreting evidence to confirm a pre-existing belief
- anchoring – focusing too much on one initial piece of evidence
- bandwagon – believing things because others do
- availability – overestimating the importance of events based on their availability in your memory.
Our online training programme also helps delegates understand:
- how to calculate key statistical measures
- sampling, interpretation and inference techniques to help you argue confidently
- a framework to question data sources and interpretation
- how to present and communicate data in a compelling way, using visualisation.
Such effective treatment of data is relevant to all management functions and areas of business activity – including marketing, finance, HR, operations, logistics, accounting, information systems and technology. This online statistics course is the ideal way to get ahead of the chimps in the way you understand, interpret and present information.
In our next blog, we will look in detail at how to present statistics to create a compelling argument – something US President Donald Trump struggled to do in a recent interview.