Misinformation has become one of the most pressing problems we face as a society. How can we find effective solutions when we can’t agree on what the facts are? Misinformation has crept into just about every domain of our lives, from our healthcare to our choices as voters. How do we tell true from false?
At least when it comes to politics, researchers have found that simply playing a video game can help. A new video game teaches players how to spot misinformation by letting them attempt to undermine democracy. In other words, the more people learn how to create political division in a game, the better they recognize those methods when they see them in real life.
The game is called Harmony Square, and it’s free to play. I thought I’d give it a whirl. At first it was amusing: I was hired as Chief Disinformation Officer and had to choose my code name. That’s when my new employers explained more about my job: “We hired you to sow discord and chaos on Harmony Square.”
After that, the game walked me through five key disinformation techniques, starting with posting an extreme opinion. Here’s what the game said about it: “See? Just by posting an extreme opinion, you got people to act on emotion. You made them think your opinions are representative of a broader group of Harmony Square residents. We call this ‘rage bait’.”
By the end of the game I’d learned how to play both sides and deploy bots. By the time I had destroyed the town of Harmony Square, it had stopped being fun. But I also had a better sense of how to recognize disinformation campaigns on social media.
“Pre-bunking” works better than debunking
And that’s the goal of the game. Created by psychologists at the University of Cambridge, the game comes with support from the US Department of State’s Global Engagement Center and the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA).
The idea is this: when players learn how to create political division with conspiracy theories, fake experts, bots and other techniques, they learn how to spot those same techniques in real life. It’s a kind of ‘psychological vaccine’ based on ‘inoculation theory’: the idea that people who are exposed to common fake news techniques are better able to notice and ignore misinformation when they see it.
It’s related to a method called ‘pre-bunking.’ “Trying to debunk misinformation after it has spread is like shutting the barn door after the horse has bolted. By pre-bunking, we aim to stop the spread of fake news in the first place,” said Dr Sander van der Linden, Director of the Cambridge Social Decision-Making Lab and senior author of the new study, in a press release.
Pre-bunking is being used more and more widely. For instance, during the U.S. presidential election Twitter started flagging tweets containing misinformation. But the researchers argue that flagging individual tweets only goes so far. Rather than rebutting each conspiracy theory one by one, they believe it’s better to build a ‘general inoculation’ by teaching people how misinformation techniques work. In other words, when people can see how they are being manipulated, there’s less need to rebut every single false conspiracy theory.
The study’s results
For the study, researchers asked 681 people to rate news and social media posts for reliability. Some were real, some were actual misinformation, and some were fake misinformation posts created just for the study. Then they had half the participants play Harmony Square and the other half play Tetris. After that, participants rated another series of posts.
The group that played Harmony Square’s four levels learned about five common manipulation techniques. According to the study, these were:
- “Trolling people, i.e., deliberately provoking people to react emotionally, thus evoking outrage.
- Exploiting emotional language, i.e., trying to make people afraid or angry about a particular topic.
- Artificially amplifying the reach and popularity of certain messages, for example through social media bots or by buying fake followers.
- Creating and spreading conspiracy theories, i.e., blaming a small, secretive and nefarious organization for events going on in the world.
- Polarizing audiences by deliberately emphasizing and magnifying inter-group differences.”
Study participants who played Harmony Square were 16% less likely to rate misinformation as reliable than they had been prior to playing. They were also 11% less likely to share fake news. And this was regardless of their personal political affiliation.
By comparing the group that played Tetris with the group that played Harmony Square, the researchers found an effect size of 0.54. “The effect size suggests that if the population was split equally like the study sample, 63% of the half that played the game would go on to find misinformation significantly less reliable, compared to just 37% of the half left to navigate online information without the inoculation of Harmony Square,” said Van der Linden.
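Those percentages track a standard way of translating an effect size into plain language: the binomial effect size display (Rosenthal and Rubin’s convention). Whether the authors used exactly this conversion is my assumption, but applying it to d = 0.54 reproduces the quoted 63%/37% split almost exactly. A minimal sketch in Python:

```python
import math

def besd_from_cohens_d(d: float) -> tuple[float, float]:
    """Convert Cohen's d to a Binomial Effect Size Display (BESD).

    BESD re-expresses an effect as the 'success rates' of two
    equally sized groups: 50% + r/2 for the treated group and
    50% - r/2 for the control group, where r is the correlation
    implied by d. (Assumes the study's 63/37 figures follow this
    convention; that is an inference, not stated in the article.)
    """
    r = d / math.sqrt(d**2 + 4)       # convert d to correlation r
    return 0.5 + r / 2, 0.5 - r / 2   # treated rate, control rate

treated, control = besd_from_cohens_d(0.54)
print(f"played the game: {treated:.0%}, control: {control:.0%}")
# -> played the game: 63%, control: 37%
```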
The project is part of a series by the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency to show people how ‘foreign influencers’ use disinformation. And it is not the only game like this. While Harmony Square targets political disinformation campaigns, Go Viral! is a five-minute game that “helps protect you against Covid-19 misinformation,” designed by the same team.
Harmony Square itself takes ten minutes to play and “is quick, easy and tongue-in-cheek, but the experiential learning that underpins it means that people are more likely to spot misinformation, and less likely to share it, next time they log on to Facebook or YouTube,” said Dr Jon Roozenbeek, lead author of the study.