Originally published on www.childrenshealthdefense.org by The Epoch Times
Tech giant Google is testing out "prebunking" strategies aimed at "inoculating people against manipulation" and misinformation online.
In a paper published Wednesday in the journal Science Advances, researchers from Google and Cambridge University in the United Kingdom teamed up to conduct experiments that involved five short videos aimed at "inoculating people against manipulation techniques commonly used in misinformation."
The study, titled "Psychological Inoculation Improves Resilience Against Misinformation on Social Media," involved nearly 30,000 participants. Other authors included researchers at the University of Bristol in the United Kingdom and the University of Western Australia.
Researchers say the manipulation techniques commonly used in misinformation are "emotionally manipulative language, incoherence, false dichotomies, scapegoating, and ad hominem attacks."
Specifically, the researchers showed 90-second videos aimed at familiarizing viewers with manipulation techniques such as scapegoating and deliberate incoherence.
The videos introduce concepts from the "misinformation playbook," according to the researchers, and explain to viewers in simple terms some of the most common manipulation techniques, using fictional characters rather than real political or media figures.
Researchers then gave people a "micro-dose" of misinformation in the form of relatable examples from film and TV such as Family Guy.
They found that the videos "improved manipulation technique recognition" and boosted viewers' confidence in spotting these techniques, while also "increasing people's ability to discern trustworthy from untrustworthy content." The videos also improved "the quality of their sharing decisions," the researchers said.
'Effective at improving misinformation resilience'
"These effects are robust across the political spectrum and a wide variety of covariates," they wrote. "We show that psychological inoculation campaigns on social media are effective at improving misinformation resilience at scale."
"Online misinformation continues to have adverse consequences for society," the study states.
"Inoculation theory has been put forward as a way to reduce susceptibility to misinformation by informing people about how they might be misinformed, but its scalability has been elusive both at a theoretical level and a practical level."
Among the "misinformation" cited by researchers in the study is that relating to the COVID-19 virus. Authors say such "misinformation" has "been linked to reduced willingness to get vaccinated against the disease and lower intentions to comply with public health measures."
Multiple studies have linked COVID-19 vaccines to two types of heart inflammation, myocarditis and pericarditis, and U.S. authorities have acknowledged a link between the Pfizer and Moderna vaccines and heart inflammation.
However, those authorities state that the benefits of the shots outweigh the risks.
The study's authors compared the videos to vaccines, saying that giving people a "micro-dose" of misinformation in advance helps prevent them from being susceptible to it in the future, or "inoculates" them, much as medical inoculations build resistance to pathogens.
'Works like a vaccine'
The idea is based on what social psychologists call "inoculation theory": building resistance to persuasion attempts via exposure to persuasive communications that can be easily refuted.
Google is already harnessing the findings and plans to roll out a "prebunking campaign" across several platforms in Poland, Slovakia and the Czech Republic in an effort to stem emerging disinformation relating to Ukrainian refugees.
The campaign is in partnership with local nongovernmental organizations, fact-checkers, academics and disinformation experts.
Lead author Dr. Jon Roozenbeek from Cambridge's Social Decision-Making Lab said in a press release:
"Our interventions make no claims about what is true or a fact, which is often disputed. They are effective for anyone who does not appreciate being manipulated.
"The inoculation effect was consistent across liberals and conservatives. It worked for people with different levels of education, and different personality types."
"YouTube has well over two billion active users worldwide. Our videos could easily be embedded within the ad space on YouTube to prebunk misinformation," said study co-author Prof. Sander van der Linden.