People Are Bad at Evaluating Future Risk

People are bad at evaluating a risk that sits in the future. We seek certainty and struggle with shifting probabilities or uncertain error margins. This can lead to polarised thinking, where we either grossly underestimate a risk or grossly overestimate it.

Most people are fine at understanding relative numbers within two orders of magnitude. That is, we can easily comprehend up to 10 of a thing (first order of magnitude), and frequently up to 100 of a thing (second order of magnitude), but we fail when numbers grow beyond that.

For example, if I ask you to indicate how much area we need for 200 people, you will indicate a certain amount of area. If I had instead asked you to estimate the area we need for 400 people, your answer would be only marginally larger. If I ask you to estimate one and then the other, you will double the area of the first – but that second area wouldn’t have been the area you picked if I had asked you for 400 first. Because the number is greater than 100, we struggle to comprehend it. In nature, when we were roaming the fields looking for food, the difference between 100 animals and 1,000 animals, or between a small tree full of berries and a large one, was insignificant.

The reverse is also true: we struggle just as badly with fractions, like one hundredth or one thousandth of a thing.

Our mathematics has become seriously sophisticated in the last few hundred years. We can do some really fancy stuff. Logarithmic maths, for instance, is really handy for dealing with numbers on vastly different scales of magnitude. However, most people don’t really understand what that means. For example, the Voyager probes (1 and 2) had just under 64 KB of memory each. My mobile phone has an extended memory of 64 GB. We recognise that a GB is bigger than a KB, but by how much? If a byte is an ice cube’s worth of water, then 64 KB (kilobytes) is half of a bathtub. Now pause and try to figure out what 64 GB (gigabytes) is going to look like.

What do you picture?

The volume of 64 GB (gigabytes) of ice cubes is a million times that half-bathtub – on the same rough assumptions, around 60 Olympic swimming pools’ worth – which still defies comprehension, except that we get the idea that this is a lot of water. I’m fairly confident that you didn’t think your phone’s storage was that big when compared to the memory in the Voyager probes.
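To make the arithmetic concrete, here is a rough sketch in Python. The ice cube, bathtub and pool volumes are assumptions I’ve picked so that 64 KB matches the half-bathtub image; the exact sizes matter far less than the factor of a million between KB and GB.

```python
from math import log10

# Assumed volumes – chosen so that 64 KB matches the half-bathtub image.
ICE_CUBE_L = 0.0023        # litres per ice cube (~2.3 mL)
BATHTUB_L = 300            # litres in a full bathtub
POOL_L = 2_500_000         # litres in an Olympic pool (50 m x 25 m x 2 m)

voyager_bytes = 64 * 1000      # 64 KB
phone_bytes = 64 * 1000**3     # 64 GB

print(voyager_bytes * ICE_CUBE_L / BATHTUB_L)  # ~0.49 bathtubs
print(phone_bytes * ICE_CUBE_L / POOL_L)       # ~59 Olympic pools
print(log10(phone_bytes / voyager_bytes))      # 6.0 – six orders of magnitude apart
```

Notice the last line: on a logarithmic scale the two numbers are only six steps apart, which is exactly why logarithms are handy – and exactly why our intuition fails.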

Still, these are fixed numbers – so even if we can’t really comprehend their scale, we can work with them better than with a moving number. We struggle when numbers shift over time. We want a definite thing to plan for and against – ironically, so that we can change that definite thing.

We saw this with the millennium bug back before 2000. Our computer scientists warned us that the mechanism our computers used to calculate the date was going to fail: to save memory, years had been stored as two digits, so when ’99 ticked over, dates would roll back to 1900. This design shortcut, created for efficiency, was not supposed to still be in use by 1995. The problem was that people weren’t upgrading as they should have been, and we really needed them to. What is the worst that could happen? Well, if enough computers fail, then modern civilisation halts. Planes could literally fall out of the sky, power stations could stop, water could stop, and databases could be irreversibly corrupted (health databases, banks and government registers). Just think about all the things you currently rely on electricity to do.
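Here is a toy sketch of that shortcut in Python – illustrative only, not code from any real system:

```python
# A toy illustration of the flaw – not code from any real system.
# To save memory, many old programs stored the year as two digits.

def tick_over(two_digit_year: int) -> int:
    """Advance a two-digit year by one, the way the old systems did."""
    return (two_digit_year + 1) % 100   # '99 rolls over to '00

current_year = tick_over(99)      # midnight, 31 December 1999...
birth_year = 70                   # someone born in 1970

print(current_year)               # 0 – read back as the year 1900
print(current_year - birth_year)  # -70 – a negative age, and downstream chaos
```

Any calculation built on that rolled-over year – ages, interest, schedules, expiry dates – comes out as nonsense.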

“Of course”, said the scientists, “we could just fix it”. The computer scientists estimated the worldwide cost of upgrading the key systems that keep civilisation running, and were ignored for another few years – which magnified the cost roughly 100 times. Not 100 percent (which would merely have doubled it), but 100 times. The eventual worldwide cost of the fix was estimated at up to 1.6 trillion dollars.

Here’s the irony. Because it was fixed, there was no disaster.

People felt justified in calling it a hoax, even though over 30,000 failures – the systems that weren’t fixed – were reported to the relevant computing bodies around the world. It was known that not every system could be fixed in time, given the long delay in acting, but if enough systems were fixed, the infrastructure would be robust enough to survive the few that failed.

This is kind of like getting upset that your car didn’t crash the way your mechanic warned it would if you didn’t get your brakes repaired. You heed the mechanic’s warning, you repair your brakes, and then… lo and behold, your car doesn’t crash. Whoa… spooky.

Climate change is a similar problem. Our world scientists have warned us of a problem. The science is solid: this is a real problem, we face dire consequences if we fail to act, and the longer we wait, the worse it will get and the more it will cost to fix.

Much like the millennium bug, people are not taking it seriously because it is hard to comprehend. How bad will it be? Scientists can’t tell you for certain, because it is a moving target. But they all agree it will be bad.

Going back to the earlier analogy: if you don’t change the brakes in your car, how bad will it be? Well, you will crash when they fail. But how bad will it be? That depends on the crash. You could just crumple a fender, or you could kill a busload of children, or anything in between.

The denier will say “if you can’t tell me how bad the crash is going to be, I don’t believe you know what you are talking about” and refuse to change their brakes.

Here is the irony, though. If we take the worldwide action needed to stave off climate change, the denier will call it a hoax. If we fail to act, human life on Earth might end (the worst-case busload-of-children scenario, and rather hard to come back from), or we might “just” lose two thirds of the population – as in 5 billion people. If we do act, we can stave off the worst of that. We are going to lose people; our goal is to make that number as small as possible.

Recently there has been an outbreak of a new coronavirus, causing the disease named COVID-19. Extrapolating from the early numbers in Wuhan, China, indicated that this coronavirus had the potential to be really bad. It might also have turned out mild. Extrapolating from a small data set usually has problems.

Consider a random sample of 10 people, 3 of whom have blue eyes. Does this mean that 30% of the world’s population has blue eyes, or that the only blue-eyed people in the world were in that sample of 10? We just don’t know. So we increase the sample size to 100, and it turns out that 20 people in this group have blue eyes. That tells us there are more than 3 people in the world with blue eyes. We adjust our prediction to 20%, but it is still possible that there are only 20 blue-eyed people in the world. Now we look at 1,000 people, and 100 of them have blue eyes. What does this mean when we try to extrapolate a trend? Does it mean that only 10% of the world has blue eyes, or that the region we are sampling from has fewer blue-eyed people than usual, or more?
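Statisticians quantify this kind of uncertainty with confidence intervals. Here is a rough sketch in Python using the Wilson score interval (one standard formula among several) on the blue-eye numbers above:

```python
from math import sqrt

def wilson_interval(successes: int, n: int, z: float = 1.96):
    """95% Wilson score confidence interval for a sample proportion."""
    p = successes / n
    centre = (p + z * z / (2 * n)) / (1 + z * z / n)
    half = (z / (1 + z * z / n)) * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return centre - half, centre + half

for blue_eyed, sample_size in [(3, 10), (20, 100), (100, 1000)]:
    low, high = wilson_interval(blue_eyed, sample_size)
    print(f"{blue_eyed}/{sample_size}: plausibly {low:.0%} to {high:.0%} of the population")
```

With 10 people the honest answer is “somewhere between roughly 11% and 60%”; with 1,000 it narrows to roughly 8–12%. The estimate doesn’t just change as the sample grows – it sharpens.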

This is the kind of difficulty involved in trying to predict how deadly a virus is. As the virus spreads to more people, more accurate predictions can be made. Yet deadliness is a deceptive term. A virus so deadly that it kills its host before it can spread is 100% deadly, but poses no real risk to humanity (except to the single host). For a virus to be a risk, it has to be able to spread. This is captured by the reproduction number, R0 (pronounced “R naught”) – roughly, the average number of people each infected person goes on to infect. It is a combination of how good the virus is at infecting others and how much people can mitigate that ability. I might normally be the fastest runner at school, but if you break my legs, I’m not going to win the foot race: my ability to win has been mitigated. Similarly, falling on an object that punctures my lung hundreds of kilometres from a hospital increases my risk of dying from that wound compared to puncturing my lung in a hospital. The punctured lung is inherently bad, but the risk of death goes up the further I am from a hospital.
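To see why this number dominates everything else, here is a toy Python sketch of spread over ten “generations” of infection. The R values are illustrative, not actual COVID-19 estimates:

```python
# Toy model: every infected person infects R others per "generation".
# The R values below are illustrative, not actual COVID-19 estimates.

def total_cases(r: float, generations: int, seed: int = 1) -> int:
    total = current = seed
    for _ in range(generations):
        current *= r          # each generation multiplies the new cases by R
        total += current
    return round(total)

print(total_cases(r=2.5, generations=10))  # ~16,000 cases – unchecked spread
print(total_cases(r=1.2, generations=10))  # ~32 cases – partial mitigation
print(total_cases(r=0.8, generations=10))  # ~5 cases – below 1, it dies out
```

Push R below 1 and the outbreak fizzles; let it sit at 2.5 and you have thousands of cases in short order. That is what mitigation is for.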

Risk that shifts like this is hard for people to grasp.

In the case of COVID-19, in the best circumstances it appears to infect some people asymptomatically. That is, they experience no ill effects and don’t even know they have it. Humans frequently carry infections that spread through the population without anyone knowing. If COVID-19 only did this, we mostly wouldn’t care.

But it doesn’t.

There are people experiencing serious to deadly outcomes, and potentially 4% of the people infected have died. Or have they? We have not tested every person in the world to get proper numbers, so we can only extrapolate from the information we have – see the blue-eye problem above.
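That 4% is deaths divided by confirmed cases, and it is only as good as our ability to find cases. A quick sketch – all numbers made up for illustration:

```python
# All numbers here are made up for illustration.
deaths = 400
confirmed_cases = 10_000

print(deaths / confirmed_cases)  # 0.04 – the headline "4%"

# But the fatality rate depends on how many infections we actually detect.
for detected_fraction in (1.0, 0.5, 0.25):
    true_infections = confirmed_cases / detected_fraction
    print(f"if we detect {detected_fraction:.0%} of infections: "
          f"fatality rate ~{deaths / true_infections:.1%}")
```

If we are only detecting a quarter of infections, the real fatality rate is a quarter of the headline figure. Same deaths, different denominator.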

Factors that change the inherent deadliness of the virus for the individual include age, pre-existing conditions, luck, how good the hospital near you is, how overwhelmed that hospital is, when you seek treatment, and so on. There is no fixed number to define this.

Which means that most people don’t comprehend how dangerous the virus actually is.

Another two factors are news coverage and misinformation.

People who report the news have profit as their primary goal. Their job is to sell advertisements, which means capturing your attention so you will see those advertisements. Which means hyping everything up. News outlets generally either exaggerate the risk, or complain that the risk has been exaggerated. This leads to a false polarisation of information: either it is really deadly, so be afraid; or everyone is lying about how deadly it is, aren’t they silly? Seem familiar? Gone is the myth that news is reported impartially.

Then we have special interest groups who are pushing an agenda above and beyond selling adverts. They are pushing misinformation. Fox media, owned by Rupert Murdoch, shows a very strong bias towards Murdoch’s interests. Gina Rinehart is a mining billionaire in Australia who pays special interest groups to confuse the public about the risks of mining and fossil fuels. The West Australian newspaper keeps publishing “opinion pieces” (which require no fact-checking) in preference to scientific evidence (which does), and strangely enough this coincides with both the Murdoch agenda and the Rinehart agenda.

A quick way to tell the difference between real news and special interest news is this: If it is a prediction, real news will say “it depends on these factors” with a recommendation to make those factors more favourable; while special interest news will either attack the authority of specialists or give you a definite fixed risk assessment – a confident lie.

We look for something solid to base our next plans on, so we tend to fall for the confident lie rather than taking the time to understand that the outcome is not fixed, and that we need to grasp the factors involved to navigate the threat.

So take that time. If it is a concern for you, take the time to learn the basic science behind it and ask questions. People who give you confident certainty should be trusted less than people who say “it depends on these factors”. Learn some basic science, learn critical thinking skills, look to international agreement among world scientists (they are usually right), and ask “where is this information from, and what is their agenda?” For example, given a strangely named source that mostly cites itself versus the Australian Government’s CSIRO, I’d trust the CSIRO. Educate yourself, but be careful not to fall down a conspiracy-theory hole. Most conspiracy theories are wrong.