

YouTube Outlines Plans To Limit Spread Of Misinformation


YouTube has announced new plans for dealing with misinformation on its platform, including ways of handling 'borderline' content.

As new conspiracy theories have popped up during the Covid crisis, the company has struggled to keep pace.

"Increasingly, a completely new narrative can quickly crop up and gain views," says chief product officer Neal Mohan.

"Or, narratives can slide from one topic to another — for example, some general wellness content can lead to vaccine hesitancy. Each narrative can also look and propagate differently, and at times, even be hyperlocal."

The company will now, he says, ramp up its efforts to train its machine learning systems on a more tightly targeted mix of classifiers, adding keywords in additional languages and information from regional analysts, to try to catch misinformation that has so far been missed.

Meanwhile, it's considering making it harder to share borderline content.

"Even if we aren’t recommending a certain borderline video, it may still get views through other websites that link to or embed a YouTube video," says Mohan.

As a result, the company's looking at disabling the share button or breaking the link on videos that it's already limiting in recommendations, making it impossible to embed or link to a borderline video on another site.

"But we grapple with whether preventing shares may go too far in restricting a viewer’s freedoms," says Mohan.

"Another approach could be to surface an interstitial that appears before a viewer can watch a borderline embedded or linked video, letting them know the content may contain misinformation."

Misinformation is often a local affair; conspiracy theories circulated in Brazil during the Zika virus outbreak, for example. To handle this, the company says it will expand regional teams and consider partnerships with experts and non-governmental organizations around the world. It's also working on ways to update its models more often.

All these measures are, apparently, simply under consideration for now, with Mohan explaining: "We’ll continue to carefully explore different options to make sure we limit the spread of harmful misinformation across the internet."

YouTube has consistently been criticized for hosting misinformation, and for doing too little to combat its spread. Indeed, just last month a group of 80 fact-checking organizations signed an open letter claiming that "YouTube is allowing its platform to be weaponized by unscrupulous actors to manipulate and exploit others, and to organize and fundraise themselves."
