
YouTube Cracks Down On Anti-vaccine Content

For decades, the subject of vaccines has been marked by controversy, with all sorts of misinformation and propaganda being channelled through social networks. In the wake of a global pandemic, however, there has been a surge in anti-vaccine influencers and crusaders spreading misinformation about vaccines. As a result, there is growing concern that social media tech giants have not been doing enough to regulate false health information disseminated on their platforms.

And by the way, vaccines are not the only subject that’s usually bombarded with misinformation on today’s social media platforms. In the entertainment sector, there’s usually the question of the fairness of online casino games, for example, and whether the games are even safe to play in the first place compared to their land-based counterparts. A platform like the online casino Vulkan Vegas, for instance, is 100% safe and secure as it has been certified for fairness. In addition, the gaming site holds an operating license from Curacao eGaming, one of the world’s leading online gambling regulators.

So, while there is genuine concern about scammers and illegitimate gaming sites, there are still plenty of legally regulated, fair and trustworthy online casinos. Essentially, you just have to confirm the legitimacy of a gaming site before you start playing on it. The same goes for vaccines: there is already a selection of medically approved vaccines that are safe for use. It is therefore unfair for anti-vaxxers to mislead online users about products designed to save the world from a global pandemic.

YouTube Doesn’t Want to Take Any Chances

Given the damage that misinformation about vaccines can cause, YouTube has gone a step further and announced that it will take down any videos spreading misinformation about any approved vaccine. Initially, the ban applied only to content targeting COVID-19 vaccines. However, with anti-vaxxers exploiting that loophole to attack COVID-19 vaccines indirectly, Google reconsidered its stance on the matter.

Anti-Vax Content Has Attracted Millions Of Views

The rapid spread of disinformation can be attributed to the effectiveness of big companies’ recommendation algorithms, which target users with specific content based on their web activity. The YouTube algorithm, for example, is designed to direct video content to users according to their watching habits, so followers and subscribers of anti-vax channels are served ever more anti-vax content. Channels that regularly churn out content are also recommended more than relatively dormant ones, encouraging activists and influencers to keep producing more videos and podcasts.

While anti-vaccine campaigns have been going on for years, there has been a colossal increase in protests against the Coronavirus vaccines in the past year. This includes the so-called ‘Disinformation Dozen,’ a group of 12 individuals who generated two-thirds of all anti-vaccine content widely circulated on Twitter and Facebook between 2020 and 2021. The Center for Countering Digital Hate (CCDH) identified the 12 most notorious anti-vax activists by analyzing anti-vaccine content on social media platforms and tracking its sources. Some of the most popular anti-vax influencers whose channels have been hit or shut down include Joseph Mercola, Robert F. Kennedy Jr., Sherri Tenpenny, Rashid Buttar and Ty & Charlene Bollinger.

Content That Will Be Removed from YouTube

According to YouTube’s current Medical Misinformation Policy, any content deemed to ‘pose a serious risk of egregious harm’ to the audience violates the Community Guidelines. YouTube also stated that any content containing false claims about vaccine safety, efficacy or ingredients will be flagged and removed from the platform.

This refers to content that contradicts reputable health authorities and organizations such as the WHO. Content alleging that approved and administered vaccines cause chronic side effects like cancer, diabetes or infertility, or that the vaccines do not reduce or prevent the risk of contracting the associated diseases, will be taken down. Videos by wellness influencers containing false allegations about the ingredients used in vaccines have also been blocked.

Does this mean that users cannot narrate and comment on their personal experiences with vaccines? No. Users are allowed to give testimonials regarding vaccines as long as the Community Guidelines are adhered to. However, channels that repeatedly post content promoting vaccine hesitancy will have their videos deleted by YouTube.

Final Thoughts

Social media and video content sites have become the go-to sources of information for most people today. However, it comes as no surprise that the freedom given to content creators to disseminate all kinds of information has created a breeding ground for conspiracy theories and misinformation.

Now more than ever, there is a need to regulate content surrounding sensitive subjects that could affect life-and-death decisions. The ban comes just as Pfizer awaits FDA approval of its Pfizer-BioNTech vaccine for children aged 5 to 11. YouTube’s ban on anti-vaccine misinformation is a first step toward nipping fake news in the bud, and other sites may well follow suit.
