YouTube has removed more than a million videos containing “dangerous misinformation about the coronavirus” since the start of the pandemic, the video platform of internet giant Google has announced.
“We remove nearly 10 million videos every quarter, the majority of which don’t even get ten views,” YouTube’s Neal Mohan said in a press release. But “if we only look at what we remove, we miss the mountains of content that people actually see,” he emphasized.
Overall, between 0.16 percent and 0.18 percent of videos contain content that violates YouTube’s rules. But, according to Mohan, when people search for news or information, they “now get the best results in terms of quality, not sensationalism.”
With the announcement, YouTube is seeking to explain its strategy around disinformation; social networks Twitter and Facebook have previously done the same. The issue of misinformation surrounding Covid-19 and vaccines has become so big that in July, US President Joe Biden said that Facebook and other platforms were “killing” people by circulating false information about vaccination against Covid. He later walked that statement back.
Neal Mohan also touched on another accusation that platforms like his often face, which revolves around the business model. “I’m sometimes asked whether we leave provocative content up because we benefit financially from it. Not only does this type of content not perform well on YouTube — especially compared to music or humour — but it erodes the trust of audiences and advertisers,” he said.
However, YouTube’s chief product officer acknowledged that detecting deceptive content isn’t always easy. “For information about Covid, we rely on the consensus of experts and health organizations (…). But in most other cases, disinformation is harder to assess,” Mohan said.