Why don't we listen?
By Stephen Fitzgerald
The facts don't change our minds
Collectively, what is wrong with us as a species? As pointed out in the previous articles, greed is one driving force that can destroy us, but so can ignorance and our apparent lack of reasoning. To be in with a fighting chance at survival, we need to understand where we come from.
Reasonable-seeming people are often totally irrational. Rarely has this insight seemed more relevant than it does today, in a world of far-reaching instant communication, weighted media and political vested interests. Still, an essential puzzle remains: how did we come to be this way?
Humans’ biggest advantage over other species is our ability to cooperate. Cooperation is difficult to establish and almost as difficult to sustain. For most individuals, freeloading is always the easiest course of action. Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data. Rather, it developed to resolve the problems posed by living in collaborative groups.
Consider what’s become known as “confirmation bias,” the tendency people have to embrace information that supports their beliefs and reject information that contradicts them. If reason is designed to generate sound judgments, then it’s hard to conceive of a more serious design flaw than confirmation bias.
Confirmation bias leads people to dismiss evidence of new or under-appreciated threats. Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. Almost invariably, the positions we are blind about are our own.
The task that reason evolved to perform is to prevent us from getting screwed by the other members of our society or group. Going back to our evolution, living in small bands of hunter-gatherers, our ancestors were primarily concerned with their social standing, and with making sure that they weren’t the ones risking their lives on the hunt while others loafed around in the cave.
There was little advantage in reasoning clearly, while much was to be gained from winning arguments. Nor did they have to contend with fake news or social media. It’s no wonder, then, that today reason often seems to fail us. As Mercier and Sperber write, “This is one of many cases in which the environment changed too quickly for natural selection to catch up.” The result is positive uncertainty.
People believe that they know way more than they actually do. And what allows us to persist in this belief is other people. We’ve been relying on one another’s expertise ever since we figured out how to hunt together, which was probably a key development in our evolutionary history. So well do we collaborate that we can hardly tell where our own understanding ends and others’ begins.
One implication of the naturalness with which we divide cognitive labor is that there’s no sharp boundary between one person’s ideas and knowledge and those of other members of the group. As people invented new tools for new ways of living, they simultaneously created new realms of ignorance.
If everyone had insisted on, say, mastering the principles of metalworking before picking up a knife, the Bronze Age wouldn’t have amounted to much. When it comes to new technologies, incomplete understanding is empowering. Where it gets us into trouble is in the political domain.
It’s one thing for me to drive a car without knowing how it operates, and another for me to favor (or oppose) action on climate change, or fossil fuels over renewable energy, without knowing what I’m talking about.
As a rule, strong feelings about issues do not emerge from deep understanding, and here our dependence on other minds reinforces the problem. If your position on, say, climate change is baseless and I rely on it, then my opinion is also baseless.
When I talk to Tom and he decides he agrees with me, his opinion is baseless. But now that the three of us concur, we feel that much more smug about our views.
If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration. This is how a community of knowledge, or of self-serving biased opinion, can become dangerous.
If we, or our friends, or the pundits at 2GB, spent less time pontificating and more time trying to work through the implications of policy proposals, we’d realise how clueless we are and moderate our views. This may be the only form of thinking that will shatter the illusion of explanatory depth and change people’s attitudes. First, of course, we need to pull back the veil and expose the dark heart of vested interest, but that’s another story.
One way to look at science is as a system that corrects for people’s natural inclinations. In a well-run laboratory, there’s no room for my-side bias; the results have to be reproducible in other laboratories, by researchers who have no motive to confirm them. And this, it could be argued, is why the system has proved so successful.
At any given moment, a field may be dominated by squabbles, but in the end the methodology prevails. Science moves forward, even as we remain stuck in place. In “Denying to the Grave: Why We Ignore the Facts That Will Save Us” (Oxford), Jack Gorman, a psychiatrist, and his daughter, Sara Gorman, a public-health specialist, probe the gap between what science tells us and what we tell ourselves.
Their concern is with those persistent beliefs which are not just demonstrably false but also potentially deadly, such as the denial of catastrophic climate change and of coastal cities being lost to inundation from rising oceans.
Ways of thinking that now seem self-destructive must at some point have been adaptive. The Gormans, too, dedicate many pages to confirmation bias, which, they claim, has a physiological component. They cite research suggesting that people experience genuine pleasure, a rush of dopamine, when processing information that supports their beliefs. “It feels good to ‘stick to our guns’ even if we are wrong,” they observe.
There must be some way to convince people to look at the science, and to address the tendencies that lead to false scientific belief. “The Enigma of Reason,” “The Knowledge Illusion,” and “Denying to the Grave” were all written before the recent federal elections, yet they anticipate the rise of alternative facts and false information.
It would be nice to think that rational people would be able to think their way to a solution that puts facts ahead of assumption, reckoning, or blindly believing what we are told. But unless you are truly open-minded and impartial, the literature is not reassuring, deep-rooted primal human nature being what it is.
Some people will not change their minds until the facts bite them on the arse and draw blood. So you can understand why we struggle with action on climate change when we also include the self-serving agenda of fossil fuel conglomerates driving a false narrative and turning “save the planet” or “green” into dirty words.
In addition to vested interest, there are those who like to keep things as they are, so they don’t need to go through the uncomfortable task of analysing facts and outcomes that may go against what suits them. Others, meanwhile, like to analyse the facts and look at the projected outcomes without bias.
These latter people, our thinkers and scientists, need to be embraced by the greater population and those in power if we have any chance of protecting the natural world and advancing civilisation into a new sustainable-energy era.
Basically, up until now, we believe what we want, or we believe what we are told, or we believe what we need to believe, and that’s not going to save us. To make it through the impending human-driven environmental crisis, we need to evolve very quickly.
Go to the next article (PDF file): Atmospheric CO2 Management - CCS