A bit of analysis and thought can go a long way in helping us figure out what's real and what's fake, and, as history shows, it's a skill essential to the survival of humanity, an Irish physicist says.
Dr David Robert Grimes is an Irish physicist, cancer researcher and science writer.
His first book, The Irrational Ape, contains case studies showing how easy it is to draw false conclusions and make mistakes, and it suggests ways to avoid them.
He tells Lynn Freeman the key to not falling for false information is critical thinking.
“There’s a few definitions that are commonly used, but I try to give an intuitive definition for people; it is to sceptically analyse things that you are told, check there is enough evidence for what you are accepting and also to employ an analytical mode of thought rather than the reflexive one we often fall into – the reactive mode, if you will.”
Dr Grimes says we often believe critical thinking is intuitive, but it’s more a learned and practised response.
“None of us are born critical thinkers, but we all have the capacity to be. To actually get there requires training and to be encouraged, often from an early age.”
He says we can’t turn everybody into rocket scientists, but we can get people to more critically analyse the news they hear and the information they read online.
“I think the explosion in misinformation that we hear online has exposed a flaw that’s always been there. We’ve always been collectively susceptible to falsehood, and there are so many historical examples.”
With the explosion of fake news, it’s even more important we learn to stop these reflexive thought patterns, he says.
“Information has never been easier to obtain, but it also gives falsehoods and really dubious lies the ability to perpetuate further and faster than ever before. And we really are reaping the consequences of that. It’s really exposed how much of a gap in our thinking there was.”
The book begins with a great example of critical thinking by Stanislav Petrov in 1983.
Petrov was duty commander in a bunker just outside Moscow, home to the Soviet Union’s early warning system that kept an eye on American activity.
“This was a really fraught time when tensions between the USSR and the USA were astronomical. Reagan had just denounced them as an evil empire and they had shot down a civilian airliner with a US congressman on board. Everyone was quite worried about the prospect of nuclear war.”
One September morning in 1983, Petrov got the signal that five American missiles were inbound to the USSR.
“His job was to pick up the phone, call High Command and say war has begun. The inevitable response would have been for the USSR to launch everything they had. The vast majority of life on earth could have been wiped out.”
However, instead of ringing High Command, Petrov made a different phone call. He said the detector was faulty and needed repairing.
“Everyone at the time thought he was crazy, but the reason he came to that conclusion was because he thought about it for a few minutes.”
Petrov was first made sceptical by the fact that the missiles weren’t corroborated by ground radar. He also reasoned that if the Americans had launched an attack, five missiles was a measly number – they would have gone at the USSR with everything they had.
“On balance of probability, he decided it was far more likely an error and he was correct – it was actually reflections of low cloud. That almost ended the world, so to speak, and only that Petrov had the level-headedness, while everyone else panicked, to say no, I don’t think this is right.”
Petrov was, in fact, punished for making that decision. But Dr Grimes says humanity owes him a lot.
Dr Grimes says a recent Stanford experiment looked at so-called digital natives to see how good they were at detecting falsehoods when given misinformation. The researchers described their results as ‘bleak’ and ‘a threat to democracy’.
“The reason why is that these very able students were totally unable to distinguish between reputable and reprehensible sources. They were very easily fooled, and that extrapolates to all of us. We think we’re savvy, we think we’re very sharp, but we’re often not. We’re often very easily swayed and not as together as we like to think we are.”
One of the reasons for this is the law of ‘bullshit asymmetry’ – sometimes called Brandolini’s law – which dictates that a lie is much easier to spread and perpetuate than it is to debunk.
“If I made something up about you, it puts you on the back foot. It takes you a lot more energy to refute it. And this is why we should never accept a claim unless someone has proffered enough evidence because, otherwise, it creates a massive workload for everyone.
“But that’s often what happens online. Someone makes up a falsehood with no evidence whatsoever and then we spend a huge amount of our energy debunking it.”
He says this can be seen particularly with vaccination misinformation, where people are still working hard to debunk myths and convince parents it’s safe to have their children vaccinated.
“One of the ways I break this down is: you’re not just looking for two people with different opinions, because that’s easy to get – you can get two ‘experts’ who disagree. What you’re trying to look for is what the overwhelming weight of the evidence tells you.”
For instance, while there are some people with PhDs who deny climate change or claim vaccinations are harmful, they are going against the weight of scientific evidence, he says.
“If I go off-piste and start making stuff up, I’m no longer speaking as an expert, I’m actually engaging in charlatan-like behaviour. People have to realise that authority only comes from accurately reflecting the evidence. If an expert is not doing that, which is often the case, then they are not acting as an expert. Unfortunately, that is very hard for people to grasp.”
Dr Grimes says we can’t simply change people’s minds for them; we can only give them the tools to examine things themselves and potentially change their own minds.
“This is why I think discussion, rather than debate, is a much more conducive way of positively influencing situations.”
However, that only applies when people are arguing and discussing in good faith. If they’re actively making things up, they should be called on it, he says.
One of the defining features of our time is the overload of information available to us, much of which is fake or misleading. He says this can cause us to become burned out and apathetic to news and information, which is the worst possible response.
“Once you’re at a point where you’re no longer going to engage because you’re too cynical, then it’s very easy to be manipulated. You might think it makes us stronger, it doesn’t. Apathy is the enemy here.
“We have to realise that, yes, there’s a lot of noise, but the solution to a lot of noise isn’t just to give up. Because, if we do, it allows people with very ulterior motives to have the upper hand. And we really can’t let tyrants win in that fashion.”