MIT Technology Review’s How To series helps you get things done.
Someone I know became a conspiracy theorist seemingly overnight.
It was during the pandemic, and out of nowhere, they suddenly started posting daily on Facebook about the dangers of covid vaccines and masks, warning of an attempt to control us and keep us in our places. The government had planned it all; it was part of a wider plot by a group of shadowy pedophile elites who ran the world. The World Economic Forum was involved in some way, and Bill Gates, natch. The claims seemed to get wilder by the day. I didn’t always follow.
As a science and technology journalist, I felt that my duty was to respond. So I did, occasionally posting long debunking responses to their posts. I thought facts alone (uncertain as they were at the time) would help me win the argument. But all I got was derision. I was so naive, apparently. I eventually blocked this person for the sake of my own mental health.
This story is part of MIT Technology Review’s series “The New Conspiracy Age,” on how the present boom in conspiracy theories is reshaping science and technology.
Over the years since, I’ve often wondered: Could I have helped more? Are there things I could have done differently to talk them back down and help them see sense?
I should have spoken to Sander van der Linden, professor of social psychology in society at the University of Cambridge. He is the author of Foolproof, a book about misinformation and how we make ourselves less susceptible to it.
As part of MIT Technology Review’s package on conspiracies, I gave him a call to ask: What would he advise if one of our family members or friends showed signs of having fallen down the rabbit hole?
Step 1:
Start with “pre-bunking”
The best way to avoid the conspiracy theory vortex is, of course, not to set foot in it in the first place. That’s the idea behind “pre-bunking,” an approach to dealing with conspiracies that works a lot like vaccination (the irony) against disease. By getting “inoculated” with knowledge about how conspiracy theories work, we become better prepared to spot the real thing when we come across it.
The concept stems from work in the 1960s by the social psychologist William McGuire, who was looking for ways to protect US soldiers from being indoctrinated by enemies. He came up with the idea of a “vaccine for brainwashing.”
“Conspiracy theorists tend to negatively react to debunking and fact-checking … they become more aggressive and sort of double down in their beliefs,” says van der Linden. “But with the pre-bunking approach, they seem to be open to entertaining it.”
One of the most effective means of pre-bunking is to refrain from arguing about the facts of the matter and, instead, simply show people how they might be manipulated. This works best as part of a wider media literacy campaign, if you can reach people before they’re exposed to misinformation and conspiracy theories, but he says pre-bunking can also work as a therapy for people who are already partly radicalized. (As with an infection, it’s always better to avoid catching it in the first place than to treat the symptoms later, the thinking goes.)
The idea is to help people understand what rhetorical techniques have been used on them. It gives them the chance to think about how they might have been tricked. Maybe they fell for emotional storytelling (using emotional cues to reduce someone’s inclination to critically assess the core claims) or false dichotomies (making it appear there are only two sides to a topic, and you have to choose one). “One of the things we found is that conspiracy theorists hate manipulation, and they hate the idea of being manipulated,” van der Linden says.
“I kind of zoom out and deconstruct the manipulation techniques [and ask], Who’s benefiting from this? Who’s making money off of it? What are their incentives? And can you be duped by this?”
To scale this approach, he and his colleagues worked with Google Jigsaw (which focuses on projects aimed more or less at the public good) to produce pre-bunking videos that were posted on YouTube. They also created various online pre-bunking games that can expose common deceptions, including Bad Vaxx, launched this summer, which helps expose some misinformation techniques often used in the antivaccine community. In a study published in August, the game was shown to be highly effective at improving people’s ability to spot misinformation.
Step 2:
Validate some aspects of their worldview
The next approach might seem strange to some. Essentially, you have to agree with the conspiracy believer, at least a little bit.
“Generally, if you want to start a conversation with people, it goes better when you first validate [their] worldview before you raise a challenging argument or point,” he says.
The way you do this is to address the fact that, in some cases, conspiracies have proved to be real. Watergate was a real conspiracy. Pharmaceutical companies have been shown to conspire to defraud the public in the past. But that doesn’t mean every conspiracy theory is true.
“You’re first validating their viewpoint that bad people sometimes conspire. And then you say, okay, but not this one,” says van der Linden, who calls this a “gateway.”
By offering recognition that conspiracies exist, you let people know that you’re not rejecting everything they say—your issue is more with one specific belief.
“Look, financial fraud happens, right? And there’s forensic accountants and other people who detect and prosecute conspiracies like that,” he says. “But people in their basement googling, you know, satanic pedophile conspiracies are not going to arrive at real evidence. And so there’s a differentiation.”
Step 3:
Talk to them about where the scientific or social consensus lies
One of the problems with conspiracy theories in the age of social media is that it’s very easy to reaffirm your new beliefs, find communities that believe them too, and then interact only with those people. Very quickly, one can start to think a particular theory is more widely believed than is really the case.
It can be helpful to help conspiracy theorists understand that their view is a pretty far-out one, or at least not widely held among experts. If you can present the true scientific consensus on a topic (for example, the overwhelming majority of climate scientists believe that anthropogenic climate change is real and a present threat), that can have an effect on certain at-risk people.
“Most people don’t like to hold views that are extremist,” he says. “So when people realize that their views are far outside of the norm, they don’t like that.”
The approach has mixed success, but he says it can be particularly effective when discussing conspiracy theories around scientific issues, such as climate or vaccinations.
However, he emphasizes that this really works well only for people who are merely flirting with conspiracy theories but are not yet too far gone. For those who are fully committed to the theory, this kind of intervention might fall on deaf ears.
“It works less well for die-hard conspiracy theorists because they’re motivated by this need for uniqueness, like everyone else is the ‘sheeple’ and they want to be unique, and so being different from the norm is actually what gives them motivation,” he cautions.
Step 4:
Show them examples of others who have broken out of conspiracy thinking
In extreme cases, hearing from or about someone who was deeply radicalized but subsequently broke free can be extremely effective, says van der Linden.
In his work with conspiracy theorists, he often borrows quotes or stories from former believers or those who have been under the control of a cult.
For example, Brent Lee, a former 9/11 truther and someone who had fully bought into an array of conspiracy theories, now spends his time trying to help other conspiracy theorists see the problems with their beliefs, speaking at conferences and on podcasts about his time in that world.
Someone “who used to be in those groups,” says van der Linden, “is much more persuasive, sometimes, than any scientist or outsider.”
Step 5:
Let them know you care—and watch for isolation
Lastly, just being aware of changes in the behavior of your family and friends can be vital.
Warning signs include becoming noticeably close-minded about explanations for things that are happening around them. “When people start to sort of switch off from other explanations in the world,” says van der Linden, “that’s kind of the usual path to becoming more radical.”
Another major predictor is when people start showing low faith in official outlets, he says. “When people start losing trust in mainstream media, in official explanations, that pulls them toward alternative sources that usually spread conspiracy theories.”
It’s worth keeping an eye out in case loved ones are becoming isolated from others around them, something that is often a red flag. If you’re at risk of becoming radicalized online, you need people around you who are “constantly distracting you and kind of questioning this stuff and [who can] bring you back to reality,” says van der Linden.
“What I’ve learned is the best way to keep people from radicalizing is actually by staying in touch, because the main thing that happens is that they start isolating themselves because they have fringe beliefs, and then they become more extreme, and they lose more trust, and that makes them more vulnerable to radicalization.
“So actually, just getting people out away from their computer and doing social things and staying in touch with them regularly is one of the best defenses,” he says.
Finally, if you get a chance to sit down and talk to the family member or friend you’re trying to help, one approach can help break through: Let them know you care about their well-being, and that’s why you’re there. Show that while you don’t agree with this particular belief, that doesn’t change how you feel about them.
“Just to say, ‘Look, you’re my brother, you’re my sister, my family member. I love you. I care about you,’” says van der Linden. “You need some sort of validation.”