How to Protect Yourself from Misinformation
Throughout history, many smart people have fallen for misinformation. Protecting yourself from it is often hard, and as a result, few actually try.
However, while it is often hard to protect yourself from misinformation, it is not impossible. In fact, by developing a few basic skills, you can go a long way toward fighting “Fake News.”
Everyone uses information. It is our nature to do so.
Unfortunately, because of the way the brain works, our thinking is biased, limited in knowledge, and subject to all sorts of day-by-day, minute-by-minute influences, from the food we eat to the weather and the social events happening around us.
Yet our success, our ability to thrive, depends heavily on the quality of the information we use to make decisions.
Poor information is costly, in any number of ways. Fortunately, “High-Quality Information” can be obtained through a few simple skills.
Just as you learn to ride a bike or use the Internet, you can easily learn to improve the information you process.
And the great thing about learning to improve the information you use is that you can do it throughout your life.
I can continue to learn to improve the information I use until the day I die.
Everything I am trying to learn right now centers around my belief that I can continually improve the information I use.
I live in a constant state of believing that the next thing I learn could change my thinking for the better.
Skill #1 – Develop a “consistent” Thinking Process
The actual “process” you use doesn’t matter as much as whether it’s applied consistently.
Let’s say your goal is to maintain a weight of 170 pounds. It doesn’t matter that your home scale might be off by a pound. What matters is that it is always off by that same pound.
If you are just trying to be healthy, and your goal is to maintain a healthy weight, then the “accuracy” of the scale doesn’t matter, as long as the scale measures the same way every time you step on it.
The key metric you are looking for here is “change,” not “accuracy.”
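The scale example can be sketched in a few lines of code. This is purely illustrative; the 2-pound offset is a made-up value, not anything stated above.

```python
# Illustrative sketch: a scale with a constant bias still measures
# change correctly, because the bias cancels out when you subtract.

BIAS = 2.0  # hypothetical: this scale always reads 2 lb high


def scale_reading(true_weight):
    """What the (consistently biased) scale displays."""
    return true_weight + BIAS


week1 = scale_reading(170.0)  # displays 172.0
week2 = scale_reading(168.5)  # displays 170.5

change = week2 - week1  # -1.5: exactly the true change in weight
print(change)  # -1.5
```

The displayed numbers are wrong in absolute terms, but the week-to-week difference is exact, which is the point: a consistent process can track change reliably even when it is inaccurate.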
I use the “TIDAL” approach to decision-making. (Click here to go to that section)
Applying this process consistently actually helps others as well as me.
If you apply your decision process consistently, then those who may want to help you can have a better idea of the areas where they could help.
For example, their first question might be, “What is your Intention?” “What are you trying to achieve?” “What are your goals?”
Or they might ask, “What have you learned?”
Of course, I am constantly asking myself those same questions. And it would be great if everyone in the conversation understood those things.
Even as I write this, I am asking myself: where am I in this process?
- Do I know what I am trying to say as I write this (Target)?
- Do I have the information I need to write this (Information)?
- Have I decided what to write (Decision)?
- Am I actually writing this (Action)?
- What have I learned from writing this (Learning)?
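The five-question checklist above can be sketched as a tiny data structure. This is a purely illustrative sketch: the stage names come from the list above, while the function name and the example answers are hypothetical.

```python
# Illustrative sketch of the TIDAL checklist from the list above.
# The (stage, question) pairs mirror the five bullets; everything
# else here is hypothetical scaffolding.

TIDAL_STAGES = [
    ("Target", "Do I know what I am trying to say?"),
    ("Information", "Do I have the information I need?"),
    ("Decision", "Have I decided what to write?"),
    ("Action", "Am I actually writing this?"),
    ("Learning", "What have I learned from writing this?"),
]


def open_stages(answers):
    """Return the stages whose questions are not yet answered 'yes'."""
    return [stage for stage, _ in TIDAL_STAGES if not answers.get(stage)]


# Example: I know my target and have my information, nothing else yet.
answers = {"Target": True, "Information": True}
print(open_stages(answers))  # ['Decision', 'Action', 'Learning']
```

Walking the stages in order makes the “where am I in this process?” question mechanical: whatever stage comes back first is where you are.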
At this point, I would pause and ask myself: did I achieve the “target”? And what did I learn from this process?
My target was to write a brief overview of the TIDAL process; I’ve just re-read this and made a few changes.
And, since my goal is always to learn, I think I learned a lot just writing these sentences. (Now I need a break to think about this.)
Skill #2 – Develop a way to “Triangulate on the Truth.”
Rather than try to find “unbiased” information, what I do is acknowledge the bias and look for information coming from different biases.
Then, I can compare and contrast the differences and similarities to see if I can get closer to the “truth.”
I call this, “Triangulating on the Truth.”
One way I do this is using a Media Literacy test (Click here to go to that section).
Here is a high-level overview of the 5 Questions.
Five Core Concepts
Five Key Questions
Other Ways to Protect Yourself from Misinformation.
Part of the problem arises from the nature of the messages themselves.
We are bombarded with information all day, every day, and we therefore often rely on our intuition to decide whether something is accurate. As BBC Future has described in the past, purveyors of fake news can make their message feel “truthy” through a few simple tricks, which discourages us from applying our critical thinking skills – such as checking the veracity of its source. As the authors of one paper put it: “When thoughts flow smoothly, people nod along.”
Eryn Newman at Australian National University, for instance, has shown that the simple presence of an image alongside a statement increases our trust in its accuracy – even if it is only tangentially related to the claim. A generic image of a virus accompanying some claim about a new treatment, say, may offer no proof of the statement itself, but it helps us visualize the general scenario. We take that “processing fluency” as a sign that the claim is true.
For similar reasons, misinformation will include descriptive language or vivid personal stories. It will also feature just enough familiar facts or figures – such as mentioning the name of a recognized medical body – to make the lie within feel convincing, allowing it to tether itself to our previous knowledge.
Even the simple repetition of a statement – whether within the same text or across multiple messages – can increase its “truthiness” by increasing feelings of familiarity, which we mistake for factual accuracy. So, the more often we see something in our news feed, the more likely we are to think that it’s true – even if we were originally skeptical.
Sharing before thinking
These tricks have long been known by propagandists and peddlers of misinformation, but today’s social media may exaggerate our gullible tendencies. Recent evidence shows that many people reflexively share content without even thinking about its accuracy.
Gordon Pennycook, a leading researcher into the psychology of misinformation at the University of Regina, Canada, asked participants to consider a mixture of true and false headlines about the coronavirus outbreak. When they were specifically asked to judge the accuracy of the statements, the participants said the fake news was true about 25% of the time. When they were simply asked whether they would share the headline, however, around 35% said they would pass on the fake news – 10% more.
“It suggests people were sharing material that they could have known was false, if they had thought about it more directly,” Pennycook says. (Like much of the cutting-edge research on Covid-19, this research has not yet been peer-reviewed, but a pre-print has been uploaded to the Psyarxiv website.)
Perhaps their brains were engaged in wondering whether a statement would get likes and retweets rather than considering its accuracy. “Social media doesn’t incentivize truth,” Pennycook says. “What it incentivizes is engagement.”
Or perhaps they thought they could shift responsibility on to others to judge: many people have been sharing false information with a sort of disclaimer at the top, saying something like “I don’t know if this is true, but…”. They may think that if there’s any truth to the information, it could be helpful to friends and followers, and if it isn’t true, it’s harmless – so the impetus is to share it, not realizing that sharing causes harm too.
Whether it’s promises of a homemade remedy or claims about some kind of dark government cover-up, the promise of eliciting a strong response in their followers distracts people from the obvious question.
This question should be, of course: is it true?
Classic psychological research shows that some people are naturally better at overriding their reflexive responses than others. This finding may help us understand why some people are more susceptible to fake news than others.
Researchers like Pennycook use a tool called the “cognitive reflection test” or CRT to measure this tendency. To understand how it works, consider the following question:
- Emily’s father has three daughters. The first two are named April and May. What is the third daughter’s name?
Did you answer June? That’s the intuitive answer that many people give – but the correct answer is, of course, Emily.
To come to that solution, you need to pause and override that initial gut response. For this reason, CRT questions are not so much a test of raw intelligence as a test of someone’s tendency to employ their intelligence by thinking things through in a deliberative, analytical fashion, rather than going with their initial intuitions. The people who don’t do this are often called “cognitive misers” by psychologists, since they may be in possession of substantial mental reserves, but they don’t “spend” them.
Cognitive miserliness renders us susceptible to many cognitive biases, and it also seems to change the way we consume information (and misinformation).
When it came to the coronavirus statements, for instance, Pennycook found that people who scored badly on the CRT were less discerning in the statements that they believed and were willing to share.
Matthew Stanley, at Duke University in Durham, North Carolina, has reported a similar pattern in people’s susceptibility to the coronavirus hoax theories. Remember that around 13% of US citizens believed this theory, which could potentially discourage hygiene and social distancing. “Thirteen percent seems like plenty to make this [virus] go around very quickly,” Stanley says.
Testing participants soon after the original YouGov/Economist poll was conducted, he found that people who scored worse on the CRT were significantly more susceptible to these flawed arguments.
These cognitive misers were also less likely to report having changed their behavior to stop the disease from spreading – such as handwashing and social distancing.
Stop the spread
Knowing that many people – even the intelligent and educated – have these “miserly” tendencies to accept misinformation at face value might help us to stop the spread of misinformation.
Given the work on truthiness – the idea that we “nod along when thoughts flow smoothly” – organizations attempting to debunk a myth should avoid being overly complex.
Instead, they should present the facts as simply as possible – preferably with aids like images and graphs that make the ideas easier to visualize. As Stanley puts it: “We need more communications and strategy work to target those folks who are not as willing to be reflective and deliberative.” It’s simply not good enough to present a sound argument and hope that it sticks.
If they can, these campaigns should avoid repeating the myths themselves. The repetition makes the idea feel more familiar, which could increase perceptions of truthiness. That’s not always possible, of course. But campaigns can at least try to make the true facts more prominent and more memorable than the myths, so they are more likely to stick in people’s minds. (It is for this reason that I’ve given as little information as possible about the hoax theories in this article.)
When it comes to our own online behavior, we might try to disengage from the emotion of the content and think a bit more about its factual basis before passing it on. Is it based on hearsay or hard scientific evidence? Can you trace it back to the original source? How does it compare to the existing data? And is the author relying on the common logical fallacies to make their case?
These are the questions that we should be asking – rather than whether or not the post is going to start amassing likes, or whether it “could” be helpful to others. And there is some evidence that we can all get better at this kind of thinking with practice.
Pennycook suggests that social media networks could nudge their users to be more discerning with relatively straightforward interventions. In his experiments, he found that asking participants to rate the factual accuracy of a single claim primed participants to start thinking more critically about other statements, so that they were more than twice as discerning about the information they shared.
In practice, it might be as simple as a social media platform providing the occasional automated reminder to think twice before sharing, though careful testing could help the companies to find the most reliable strategy, he says.
There is no panacea. Like our attempts to contain the virus itself, we are going to need a multi-pronged approach to fight the dissemination of dangerous and potentially life-threatening misinformation.
And as the crisis deepens, it will be everyone’s responsibility to stem that spread.