YouTube intends to ramp up its efforts to combat conspiracy mongers, perhaps in response to the rash of conspiracy videos that trended following the school shooting in Parkland, Florida, last month.
Among other things, YouTube will supply links to relevant Wikipedia pages and other credible websites to provide viewers with a counternarrative, according to CEO Susan Wojcicki, who revealed the plans earlier this week during a panel discussion at SXSW.
YouTube plans to roll out additional features pointing to third-party information sources over the next few months.
The Wikimedia Foundation welcomed the move, but noted that it had not entered a formal partnership with YouTube and had not received any advance notice of the plan.
“We are always happy to see people, companies and organizations recognize Wikipedia’s value as a repository of free knowledge,” said Samantha Lien, spokesperson for the Wikimedia Foundation.
Wikipedia is freely licensed for reuse by anyone, and its mission is to facilitate sharing of that content, she told TechNewsWorld.
Wikipedia is built by hundreds of thousands of volunteer contributors, Lien added, and the foundation encourages those who reuse Wikipedia's content to give back.
Thousands of videos have been uploaded to YouTube by conspiracy theorists, noted John Paolillo, associate professor of informatics.
They share some common threads, he told TechNewsWorld, as many of them come from survivalists, gun rights activists, InfoWars, the Russian propaganda channel RT, and libertarian commentators.
“Conspiracy theory videos are posted and reposted and seem almost immune to disappearing,” Paolillo remarked. “There are thousands upon thousands of these, and reliably identifying them is not that simple.”
YouTube likely will face a severe backlash from certain users, who may see the crackdown on these videos as itself a conspiracy.
Like Facebook, YouTube faces an enormous challenge in trying to weed out fake news, conspiracy videos, hate speech and other misinformation, observed Rick Edmonds, media business analyst at Poynter.
While some Wikipedia entries may not be accurate, there is a method in place for policing information and promptly updating problematic posts, he told TechNewsWorld.
YouTube is one of the key enablers of “micro-propaganda,” noted Jonathan Albright, research director at the Tow Center for Digital Journalism, in a recent post on Medium.
YouTube was inundated with conspiracy theories following the Parkland shooting, suggesting that the incident had been faked and that survivors who spoke out after the shooting were so-called “crisis actors.”
A data set of more than 250 videos was returned from a search for “crisis actor,” Albright noted. In that data set, 20 percent of the videos related to mass shootings, false flags and crisis actors; the other 80 percent related to historical, religious or government conspiracies.
It was imperative that YouTube take additional steps, including optional filters and human monitors, to monitor its pages for this kind of disinformation, Albright wrote.
Conspiracy videos are part of a wider trend in social media. Fake news, hate speech, hoaxes and other misinformation have been proliferating on Facebook, Twitter, YouTube and Google.
After taking a beating over the proliferation of fake news during the 2016 presidential campaign, Facebook recently announced that it would de-emphasize news coverage in Trending Topics in favor of more posts from friends and family members.
CEO Mark Zuckerberg has come under severe criticism for being too slow to recognize Facebook’s growing role and responsibility as a digital publisher, while established media outlets have suffered mightily. The lion’s share of digital advertising has been gobbled up by Facebook and Google, causing serious economic distress for the journalism industry.