22.5 per cent of anti-vax profiles' vaccine tweets linked to YouTube videos
Kongens Lyngby [Denmark]: Recent research from the Technical University of Denmark (DTU) has shown how misinformation on social media contributes to vaccine distrust and creates a false picture of the benefits and disadvantages of COVID vaccines. The study was published in the journal PLOS ONE.
“Where vaccine supporters often refer to news media and science sites when sharing knowledge about vaccines on Twitter, we can see that profiles belonging to vaccine opponents far more often share links to YouTube videos and to sites that are known to spread fake news and conspiracy theories, which previous research has also shown,” said Bjarke Monsted, who holds a PhD from DTU Compute.
He continued, “Furthermore, vaccine opponents’ profiles often link to commercial sites that sell alternative health products. This is surprising given that vaccine hesitancy often stems from a fear of financial conflicts of interest, particularly because previous research has shown that 12 people globally are responsible for much of the vaccine misinformation, including people who earn a fortune from the sale of alternative health products.”
Along with Sune Lehmann from the Research Section for Cognitive Systems at DTU Compute, Monsted analysed some 60 billion tweets written before the pandemic to map where the discussion about vaccines takes place on Twitter and so better understand today’s vaccine hesitancy on social media. In doing so, they identified the users who consistently expressed strong views in favour of vaccines (provax) or against them (antivax), and the sources from which those profiles shared their vaccine information.
The researchers grouped the shared sources into five categories: sites known for sharing pseudoscience and conspiracy theories, news sites, social media, YouTube (which was given its own category due to the large number of links), and finally, commercial sites relating to medicine and health. Their work shows that 22.5 per cent of antivax profiles’ vaccine tweets link to YouTube videos.
The research also confirmed the echo chamber effect: because social media algorithms ensure that people mainly interact with others whose opinions align with their own, vaccine advocates and opponents rarely encounter each other’s views online.
“In fact, we discovered that the sources of information people are exposed to in their social networks depend heavily on their own attitudes towards vaccines. The more resistance to vaccines a user expressed, the further from the norm was the media picture they were exposed to from their circle of friends,” Monsted said.
“Research clearly shows that combating misinformation is a joint responsibility. It is important that media outlets do not create a false balance between views by giving equal, or maybe even more, airtime to anti-vaccine arguments that are not substantiated by the scientific literature. Media should not portray medical information and misinformation as equivalent views,” Monsted said.
Lehmann and Monsted applied artificial intelligence methods, deep learning and natural language processing, to identify which views on vaccines were expressed in a given tweet. The researchers hope the method can provide a greater understanding of the vaccine discussion during the pandemic and beyond.
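The article does not describe the authors’ actual model, which used deep learning. Purely as a rough illustration of the underlying idea, classifying a tweet as provax or antivax from its words, here is a minimal bag-of-words Naive Bayes sketch in Python; the class name, the toy tweets, and the labels are all invented for this example:

```python
from collections import Counter, defaultdict
import math


def tokenize(text):
    # Lowercase and split on whitespace; a real pipeline would do far more
    # (handling hashtags, mentions, punctuation, and so on).
    return text.lower().split()


class NaiveBayesStance:
    """Toy bag-of-words Naive Bayes stance classifier (illustrative only)."""

    def __init__(self):
        self.word_counts = defaultdict(Counter)  # per-label token counts
        self.label_counts = Counter()            # per-label document counts
        self.vocab = set()

    def fit(self, texts, labels):
        for text, label in zip(texts, labels):
            self.label_counts[label] += 1
            for tok in tokenize(text):
                self.word_counts[label][tok] += 1
                self.vocab.add(tok)

    def predict(self, text):
        total = sum(self.label_counts.values())
        best_label, best_score = None, float("-inf")
        for label in self.label_counts:
            # Log prior plus log likelihood with add-one smoothing.
            score = math.log(self.label_counts[label] / total)
            denom = sum(self.word_counts[label].values()) + len(self.vocab)
            for tok in tokenize(text):
                score += math.log((self.word_counts[label][tok] + 1) / denom)
            if score > best_score:
                best_label, best_score = label, score
        return best_label


clf = NaiveBayesStance()
clf.fit(
    ["vaccines save lives", "trust the science on vaccines",
     "vaccines cause harm", "do not trust vaccines"],
    ["provax", "provax", "antivax", "antivax"],
)
print(clf.predict("vaccines save so many lives"))  # prints "provax"
```

The study’s deep-learning approach would capture context that a word-count model like this cannot, but the sketch shows the basic task: scoring a tweet’s text against each stance and picking the more likely one.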
“(Vaccines have) gained a lot of attention and been debated in a completely new way in the last two years. They have gone from being a topic that was primarily discussed among particular population groups to becoming a markedly more mainstream topic,” Lehmann said. “The exciting challenge going forward will be to use our methodological innovations to understand whether–and how–this shift has changed the discussion and the various actors’ motives,” he concluded.
With inputs from ANI