Shortly after the Russian invasion, the lies began: Ukrainian refugees were taking jobs, committing crimes and abusing handouts. The disinformation spread quickly online across Eastern Europe, sometimes promoted by Moscow in an effort to destabilize its neighbors. It is the kind of rapid spread of falsehoods that has been blamed in many countries for increasing polarization and eroding trust in democratic institutions, journalism and science. Yet countering or stopping misinformation has proven elusive.

New findings from university researchers and Google, however, suggest that one of the most promising answers to misinformation may also be one of the simplest. In a paper published Wednesday in the journal Science Advances, researchers describe how short online videos that teach basic critical thinking skills can make people better able to resist misinformation.

The researchers created a series of videos, similar to public service announcements, focused on specific disinformation techniques: features that appear in many common false claims, such as emotionally charged language, personal attacks or false comparisons between unrelated things. The researchers then gave people a series of claims and found that those who had watched the videos were significantly better at distinguishing false information from accurate information.

The approach, called "prebunking," is based on years of research into an idea known as inoculation theory, which suggests that exposing people to how misinformation works, using innocuous, fictional examples, can strengthen their defenses against false claims.

With the findings in hand, Google plans to soon release a series of prebunking videos in Eastern Europe focusing on the scapegoating that appears in much of the misinformation about Ukrainian refugees. The focus was chosen by Jigsaw, a division of Google that works to find new ways to counter disinformation and extremism.
“We’ve spent a lot of time and energy studying the problem,” said Beth Goldberg, Jigsaw’s head of research and one of the paper’s authors. “We started thinking: How can we make users, people online, more resistant to misinformation?”

The two-minute clips show how these tactics can appear in headlines or social media posts to trick a person into believing something that isn’t true. They proved surprisingly effective: people who watched the videos were much better at distinguishing false claims from accurate information when tested by the researchers. The same positive results were seen when the experiment was repeated on YouTube, where nearly 1 million people viewed the videos. Researchers are now investigating how long the effects last and whether video “boosters” can help maintain the benefits.

Previous findings have suggested that online games or tutorials that teach critical thinking skills can also improve resilience to misinformation. But videos, which can be played alongside online ads, are likely to reach far more people, said Jon Roozenbeek, a professor at the University of Cambridge and one of the study’s authors. Other authors included researchers at the University of Bristol in the UK and the University of Western Australia.

Google’s effort will be one of the largest real-world tests of prebunking to date. The videos will be released on YouTube, Facebook and TikTok in Poland, the Czech Republic and Slovakia, three countries that have received large numbers of Ukrainian refugees and whose citizens may be vulnerable to misinformation about them.
Jigsaw CEO Yasmin Green said the prebunking work is meant to complement Google’s other efforts to reduce the spread of misinformation: “As the scourge of misinformation grows, there is much more we can do to provide people prompts and features that help them stay safe and informed online.”

While journalistic fact-checking can be effective at debunking a particular false claim, it takes time and effort. By focusing on the general characteristics of misinformation rather than on specific claims, prebunking videos can help an individual identify false claims across a wider variety of topics.

Another method, content moderation by social media companies, is often inconsistent. While platforms like Facebook and Twitter often remove misinformation that violates their rules, they have also been criticized for not doing more. Other platforms, like Telegram or Gab, take a largely hands-off approach to disinformation.

Content moderation and journalistic fact-checking also run the risk of alienating people who believe the misinformation, and they may simply be ignored by those who already distrust the legitimate media. “The very word fact-checking has become politicized,” Roozenbeek said.

The prebunking videos, however, do not target specific claims and make no judgments about what is true or not. Instead, they teach viewers how false claims generally work, whether the claim is about an election, NASA’s moon landings or the latest bird flu outbreak. That portability makes prebunking a particularly effective way to counter misinformation, according to John Cook, a research professor at Australia’s Monash University who has created online games that teach ways to spot misinformation.

“We’ve done enough research to know this can be effective,” Cook said. “What we need now is the resources to develop it at scale.”