YouTube removes 1 million COVID-19 misinformation videos since February 2020 – All you need to know

@the_news_21

Google-owned YouTube has removed 1 million videos containing dangerous COVID-19 misinformation, such as false cures or claims that the pandemic is a hoax, since February 2020.

According to YouTube’s Chief Product Officer Neal Mohan, focusing only on what the platform removes misses the massive amount of content that people actually see.

“Bad content represents only a tiny percentage of the billions of videos on YouTube (about 0.16-0.18 percent of total views turn out to be content that violates our policies),” he said in a blog post on Wednesday.

“Misinformation has moved from the marginal to the mainstream. No longer contained to the sealed-off worlds of Holocaust deniers or 9/11 truthers, it now stretches into every facet of society, sometimes tearing through communities with blistering speed,” he emphasised.

He added that YouTube removes almost 10 million videos each quarter, “the majority of which don’t even reach 10 views.”

“Speedy removals will always be important but we know they’re not nearly enough. Instead, it’s how we also treat all the content we’re leaving up on YouTube that gives us the best path forward,” he said.

Facebook had also recently made a similar argument about content on its platform when faced with criticism over its handling of COVID-19 and vaccine misinformation. The company argued that vaccine misinformation is not representative of the kind of content most users see.

YouTube says it is elevating information from trusted sources and reducing the spread of videos containing harmful misinformation.

“For COVID, we rely on expert consensus from health organisations like the CDC and WHO to track the science as it develops. In most other cases, misinformation is less clear-cut,” Mohan said.

Under YouTube’s rules, videos that violate its vaccine policy are those that contradict expert consensus on the vaccines from local health authorities or the WHO.

Other platforms, including Facebook and Twitter, have also rolled out policies to reduce the spread and reach of such content.

Both Facebook and YouTube have come under particular scrutiny for their policies around health misinformation during the pandemic.
