YouTube expanded its medical misinformation policies today to include new guidelines that ban vaccine misinformation. The Google-owned video platform had previously banned over 1 million videos spreading dangerous COVID-19 misinformation. Now, YouTube says it will also remove content that spreads misinformation about vaccine safety, the efficacy of vaccines, and ingredients in vaccines. The platform previously banned misinformation specific to coronavirus vaccines, but its policies are now being updated to also block misinformation about routine immunizations, like those for measles and Hepatitis B, as well as general false statements about vaccines that are confirmed safe by local health authorities and the World Health Organization (WHO).

Some examples of content that violates YouTube's new guidelines include videos that claim vaccines cause chronic side effects like cancer or diabetes, videos that claim vaccines contain devices that can track those who are inoculated, or videos asserting that vaccines are part of a depopulation agenda. If a user posts content that violates these guidelines, YouTube will remove the content and let the uploader know why their video was removed. If it's a user's first time violating community guidelines, YouTube says they will likely get a warning with no penalty. If not, the user's channel will receive a strike; if a channel gets three strikes in 90 days, the channel is terminated. YouTube will also take down several channels associated with high-profile anti-vaccine figures like Joseph Mercola and Robert F. Kennedy Jr.

With these new guidelines, YouTube is following in the footsteps of Facebook, which expanded the criteria it uses to take down false vaccine information in February. Twitter also bans the spread of misleading COVID-19 information and labels potentially misleading tweets using a combination of AI and human efforts. Twitter even suspended Georgia Representative Marjorie Taylor Greene after she falsely claimed that vaccines and masks do not reduce the spread of COVID-19.

This change in policy comes as COVID-19 vaccination rates slow: in the U.S., about 55% of people are fully vaccinated, while countries like Canada and the United Kingdom have vaccinated 71% and 67% of people against COVID-19, respectively. President Biden has pointed to social media platforms as a place where vaccine misinformation spreads, and the White House has even enlisted rising superstars like Olivia Rodrigo to encourage Americans to get vaccinated.