
Covid-19: YouTube cracks down on anti-vaccine videos

San Francisco - Google-owned platform says misinformation concerns spread beyond the pandemic

Published: Wed 29 Sep 2021, 5:08 PM

Updated: Wed 29 Sep 2021, 5:21 PM

By AFP

YouTube said Wednesday it would remove videos that falsely claim approved vaccines are dangerous, as social networks seek to crack down on health misinformation around Covid-19 and other diseases.

Video-sharing giant YouTube has already banned posts that spread false claims about coronavirus treatments, including inaccurate claims about Covid-19 vaccines that have been shown to be safe.

But the Google-owned site said its concerns about the spread of medical conspiracy theories went beyond the pandemic.

“We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general,” YouTube said in a statement.

“We’re now at a point where it’s more important than ever to expand the work we started with Covid-19 to other vaccines.”

The expanded policy will apply to “currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the WHO (World Health Organization).”


It will see false claims about routine immunizations for diseases like measles and Hepatitis B removed from YouTube.

These would include cases where vloggers have claimed that approved vaccines do not work, or wrongly linked them to chronic health effects.

Content that “falsely says that approved vaccines cause autism, cancer or infertility, or that substances in vaccines can track those who receive them” will also be taken down.

“As with any significant update, it will take time for our systems to fully ramp up enforcement,” YouTube added.

It stressed there would be exceptions to the new guidelines, with personal testimonials of negative experiences with vaccines still allowed, so long as “the channel doesn’t show a pattern of promoting vaccine hesitancy.”

YouTube said it had removed more than 130,000 videos since last year for violating its Covid-19 vaccine policies.

On Tuesday, the company told German media that it had blocked the German-language channels of Russia’s state broadcaster RT for violating its Covid misinformation guidelines.

YouTube said it had issued a warning to RT before shutting the two channels down, but the move has prompted a threat from Moscow to block the video site.

It is not the only social media giant grappling with the spread of Covid-19 conspiracy theories and medical misinformation in general.

Facebook this month launched a renewed effort to tackle extremist and conspiracy groups, beginning by taking down a German network spreading Covid misinformation.
