YouTube will start adding links to Wikipedia and other websites on videos that promote conspiracy theories.
The move comes after the site has come under fire for recommending divisive content and conspiracy videos.
Google-owned video platform YouTube is trying to combat the spread of misinformation on its platform by linking to Wikipedia pages and other "fact-based" websites from videos that promote conspiracy theories. YouTube CEO Susan Wojcicki made the announcement at the South by Southwest Festival in Austin, Texas, on Tuesday.
The move comes after YouTube's recommendation engine and autocomplete feature have come under fire this year for pushing users towards conspiracy theories and other divisive content.
In the coming months, Wojcicki said, conspiracy videos will start including text boxes, dubbed "information cues," that link to third-party sources that debunk the hoaxes in question.
Wojcicki said that the new feature will only be used on conspiracies causing "significant debate" on YouTube, like those about chemtrails or the moon landing. After the school shooting in Parkland, Florida, earlier this year, a video theorizing that one of the survivors was a crisis actor made it into YouTube's Trending section. Because that conspiracy took off in a matter of days, it isn't clear whether a Wikipedia page disputing such a theory would even be available in time.
Also, because Wikipedia pages are crowd-sourced, a page for a given event may not necessarily be accurate.
Wojcicki was also asked why YouTube can place an outright ban on hateful content, such as videos published by neo-Nazi groups, but not on videos that are untrue.
Wojcicki said that hatefulness is "more clear" than whether something is true or false, and that YouTube doesn't want to be an arbiter of truth.
YouTube, like other tech giants Facebook and Twitter, has long made the distinction that it is not a media organization, and thus bears less responsibility for the content on its platform.
A YouTube spokesperson told CNBC that the information cues initiative is part of broader efforts to tackle the proliferation of misinformation on the site.