Monday, 9 November 2020

The Certainty of Uncertainty


Posted by Keith Tidman
 

We favour certainty over uncertainty. That’s understandable. Our subscribing to certainty reassures us that perhaps we do indeed live in a world of absolute truths, and that all we have to do is stay the course in our quest to stitch the pieces of objective reality together.

 

We imagine the pursuit of truths as comprising a lengthening string of eureka moments, as we put a check mark next to each section in our tapestry of reality. But might that reassurance about absolute truths prove illusory? Might it be, instead, ‘uncertainty’ that wins the tussle?

 

Uncertainty taunts us. The pursuit of certainty, on the other hand, gets us closer and closer to reality, that is, closer to believing that there’s actually an external world. But absolute reality remains tantalisingly just beyond our fingertips, perhaps forever.

 

And yet it is uncertainty, not certainty, that incites us to continue conducting the intellectual searches that inform us and our behaviours, even if imperfectly, as we seek a fuller understanding of the world. Even if the reality we think we have glimpsed is one characterised by enough ambiguity to keep surprising and sobering us.

 

The real danger lies in an overly hasty, blinkered turn to certainty. This trust stems from a cognitive bias — the one that causes us to overvalue our knowledge and aptitudes. Psychologists call it the Dunning-Kruger effect.

 

What’s that about then? Well, this effect precludes us from spotting the fallacies in what we think we know, and discerning problems with the conclusions, decisions, predictions, and policies growing out of these presumptions. We fail to recognise our limitations in deconstructing and judging the truth of the narratives we have created, limits that additional research and critical scrutiny so often unmask. 

 

The Achilles’ heel of certainty is our habitual resort to inductive reasoning. Induction occurs when we conclude from many observations that something is universally true: that the past will predict the future. Or, as the Scottish philosopher, David Hume, put it in the eighteenth century, our inferring ‘that instances of which we have had no experience resemble those of which we have had experience’. 

 

A much-cited example of such reasoning consists of someone concluding that, because they have only ever observed white swans, all swans are therefore white — shifting from the specific to the general. Indeed, Aristotle uses the white swan as an example of a logically necessary relationship. Yet, someone spotting just one black swan disproves the generalisation. 

 

Bertrand Russell once set out the issue in this colourful way:

 

‘Domestic animals expect food when they see the person who usually feeds them. We know that all these rather crude expectations of uniformity are liable to be misleading. The man who has fed the chicken every day throughout its life at last wrings its neck instead, showing that more refined views as to uniformity of nature would have been useful to the chicken’.

 

The person’s theory that all swans are white — or the chicken’s theory that the man will continue to feed it — can be falsified, which sits at the core of the ‘falsification’ principle developed by the philosopher of science Karl Popper. The heart of this principle is that a scientific hypothesis, theory, or proposition must be falsifiable, that is, capable of being shown wrong; in other words, testable against evidence. For Popper, a claim that is untestable is not scientific. 

 

However, a testable hypothesis that is proven through experience to be wrong (falsified) can be revised, or perhaps discarded and replaced by a wholly new proposition or paradigm. This happens in science all the time, of course. But here’s the rub: humanity can’t let uncertainty paralyse progress. As Russell also said: 

 

‘One ought to be able to act vigorously in spite of the doubt. . . . One has in practical life to act upon probabilities’.

 

So, in practice, whether implicitly or explicitly, we accept uncertainty as a condition in all fields — throughout the humanities, social sciences, formal sciences, and natural sciences — especially if we judge the prevailing uncertainty to be tiny enough to live with. Here’s a concrete example, from science.

 

In the 1960s, the British theoretical physicist Peter Higgs mathematically predicted the existence of a specific subatomic particle, the last missing piece in the Standard Model of particle physics. But no one had yet seen it, so the elusive particle remained a hypothesis. Only decades later, in 2012, did CERN’s Large Hadron Collider reveal the particle, whose associated field is claimed to give other fundamental particles their mass. (Earning Higgs, and the Belgian physicist François Englert, the Nobel Prize in Physics.)

 

The CERN scientists’ announcement said that their confirmation bore ‘five-sigma’ certainty. That is, there was only about 1 chance in 3.5 million that random fluctuations, rather than the then-named Higgs boson, would have produced so strong a signal. A level of certainty (or of uncertainty, if you will) that physicists could very comfortably live with. Though as Kyle Cranmer, one of the scientists on the team that discovered the particle, appropriately stresses, there remains an element of uncertainty: 

 

“People want to hear declarative statements, like ‘The probability that there’s a Higgs is 99.9 percent,’ but the real statement has an ‘if’ in there. There’s a conditional. There’s no way to remove the conditional.”
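To make the ‘five-sigma’ figure concrete, here is a minimal calculation, using only Python’s standard library, of the one-sided tail probability of a five-standard-deviation excess under a normal distribution — the convention particle physicists use for discovery claims (the function name is illustrative):

```python
from math import erfc, sqrt

def one_sided_p_value(n_sigma: float) -> float:
    """One-sided tail probability of an n-sigma excess
    under a standard normal distribution."""
    return 0.5 * erfc(n_sigma / sqrt(2))

p = one_sided_p_value(5)
print(f"p-value: {p:.2e}")         # about 2.9e-07
print(f"roughly 1 in {1/p:,.0f}")  # about 1 in 3.5 million
```

Note that this number is the probability of seeing so strong a signal *if* there were no Higgs boson — which is precisely the conditional, the ‘if’, that Cranmer insists cannot be removed.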

 

Of course, in few instances in everyday life do we have to calculate the probability of reality. But we might, through either reasoning or subconscious means, come to conclusions about the likelihood that what we choose to act on is right, or safely right enough. The stakes of being wrong matter — sometimes a little, other times consequentially. Peter Higgs got it right; Bertrand Russell’s chicken got it wrong.

  

The takeaway from all this is that we cannot know things with absolute epistemic certainty. Theories are provisional. Scepticism is essential. Even wrong theories kindle progress. The so-called ‘theory of everything’ will remain elusive. Yet, we’re aware we know some things with greater certainty than other things. We use that awareness to advantage, informing theory, understanding, and policy, ranging from the esoteric to the everyday.

 

6 comments:

Thomas O. Scarborough said...

On the one hand, CERN scientists have five-sigma certainty, or 99.99994% confidence. However, this is within a closed system. What is their five-sigma certainty doing within a global system? To obtain their five-sigma certainty, they emit around 200,000 tonnes of CO2e per annum. This is the story of science. Enormous certainty within closed systems, yet what amounts to a loss of control in global systems.

Keith said...

You make a crucial point, Thomas.

I wonder, however, if your question about ‘certainty’ within ‘global systems’ (as opposed to ‘closed systems’) has more to do with the push-pull of politics than with science. To exaggerate my point, politicians aren’t about to heatedly debate whether CERN’s confirmation of the Higgs boson is of high probability, or five-sigma certainty. They have no political philosophy or ideological leanings or economic rewards or lobbyists’ interests at stake.

Whereas politicians do see such stakes, like industrial and other economic interests, in some of the global systems, such as those related to, say, CO2 increases since the Industrial Revolution and resultant climate change — especially (but not exclusively) temperature increases and consequential spikes in destructive weather extremes.

To the scientists’ credit, for the most part as a community, they push back against such head-in-the-sand politicians, asserting that their scientific research points to a high probability (if I’m not mistaken, the UN Panel on Climate Change reports some 95 percent probability) of human activity being the main cause. At the risk of oversimplifying, I suspect that politics are more likely to bump up against the science of global-systems issues than of closed-systems issues.

Martin Cohen said...

I thought this was a very nicely written post, Keith, thank you. And the point you make is also to me very important. "Uncertainty taunts us. The pursuit of certainty, on the other hand, gets us closer and closer to reality, that is, closer to believing that there’s actually an external world."

However, I actually disagree with the second sentence. I would say the scientist you quote, Kyle Cranmer, is nearer "the truth" when he says that while we like to say things are certain, there are always assumptions behind them. The history of science is the tale of things demonstrated beyond doubt - which are then found to be completely wrong. The Higgs boson seems to me to be more about physicists justifying expensive research than actually understanding the universe. Yes, you can call me old fashioned! Or is it more of a sixties hippy…

Keith said...

Thank you, Martin. I agree with your emphasis on the argument about uncertainty that Cranmer makes. To that point, in fact, I begin the last paragraph of my post with this sentence: ‘The takeaway from all this is that we cannot know things with absolute epistemic certainty’. I think that’s pretty unequivocal in terms of where I stand on our being hacked by unremitting uncertainty.

But where I do differ is in this statement: ‘The Higgs boson seems to be more about physicists justifying expensive research than actually understanding the universe’. That point strikes me as cynical, and calls into question the sincerity of scientists’ motives. The typical physicist, I suspect, is indeed interested in projects that increase our knowledge and understanding of the universe in ways that matter, and are not engaged in avaricious money grabs.

And the fact that science shifts — sometimes incrementally, sometimes in revolutionary leaps — shouldn’t come as a surprise. That accretion of insight and knowledge — to include, yes, the notion of ‘paradigm shifts’ that philosopher and physicist Thomas Kuhn explored in his 1962 book ‘The Structure of Scientific Revolutions’ — is the natural order of all fields of inquiry, from the sciences to the humanities. In all cases, knowledge ceaselessly builds. (I wonder what that says, if anything, about our understanding of ‘reality’, beyond temporary moments in time until the next shift and the next.)

Thomas O. Scarborough said...

I find it interesting, Keith, that you use the word 'justify'. How much will we do to justify the science? One has the Higgs boson, yet one needed to emit massive amounts of CO2e to do it -- apart from the expense. Now if we think more widely than the Higgs boson, how many of our activities in general do we justify, which are harmful?

Keith said...

‘Now if we think more widely … how many of our activities in general do we justify, which are harmful?’ No doubt my fault, Thomas, but I find a little cryptic what you’re getting at in your question. But meantime I’ll take a stab at a short answer, ‘Lots.’ Not a pretty picture, but I suspect it’s part of the dark side of human nature that at least some among the world’s population will always ‘justify’, and act on, things like racism, misogyny, xenophobia, inequality, genocide, totalitarianism … and much more of that ilk. To which some people seem perniciously predisposed, and dispiritingly ready to ‘justify’.
