YouTube has removed 1 million videos containing serious misinformation about the coronavirus since February 2020, including videos promoting bogus treatments or claiming the pandemic is a hoax.
However, the platform has facilitated the spread of a great deal of misinformation about the coronavirus. Last May, for example, a controversial anti-vaccination video called Plandemic was viewed more than 7 million times before it was removed.
YouTube’s chief product officer, Neal Mohan, shared the statistic in a blog post explaining how the company handles misinformation across its platform.
Scale is the platform’s biggest challenge in this regard: with so many users active at any given moment, it is difficult to act quickly enough against every piece of harmful content. Even a short delay in removal can add millions of views and greatly amplify a video’s impact.
“Bad content is only a small percentage of the billions of videos on the platform,” Mohan wrote. He added that YouTube removes nearly 10 million videos every quarter, most of which never reach even 10 views.
Facebook recently made a similar argument about content on its platform. The social network published a report last week claiming that its most popular posts were memes and other non-political content.
Company executives likewise argued that bad content represents only a small fraction of what appears on the platform. Facing criticism over its handling of coronavirus and vaccine misinformation, Facebook has maintained that such material is not the kind of content most users actually see.
Both Facebook and YouTube have come under particular scrutiny for their misinformation policies during the pandemic. With over a billion users each, even a single piece of content on either platform can have far-reaching impact.
So far, however, neither platform has revealed details about how vaccine and health misinformation spreads or how many users see it.