YouTube has almost completely suppressed certain conspiracy theories from being recommended to viewers on its platform, according to new research, but others are slipping past its filters.
Misleading videos on topics including the flat Earth conspiracy theory and the 9/11 terror attacks had previously been amplified by the company’s popularity-based recommendation algorithms.
The company announced efforts to change the recommendation algorithm last year in order to demote harmful conspiracy theories and prevent their producers from profiting from spreading them.
The move followed a Sky News investigation which found the service was directing people towards misinformation and conspiracy theories by placing these YouTube videos prominently in its search results.
Researchers at the University of California, Berkeley, tested whether these changes worked by developing an automated system that watched through a year's worth of YouTube videos as the algorithm recommended them.
Almost 70% of all views on YouTube come from its recommendations section. The recommendation algorithms promote videos based on a number of factors which measure user engagement.
Conspiracy theories are particularly likely to be selected because they feature "novel and provoking content" which, according to the researchers, "tends to yield higher than average engagement".
The researchers found that when it came to the specific types of conspiracy theory which YouTube had said it would tackle, the company had done so effectively.
Relatively few videos were found promoting flat Earth conspiracy theories or alleging that the US government was responsible for the 9/11 attacks, content which Sky News found was being heavily promoted, alongside theories regarding 5G.
Videos on other common conspiracy theories – such as those surrounding the assassination of President John F Kennedy and the Sandy Hook shooting – are also now rarely promoted.
But the researchers say that although “highly publicised topics fall under closer scrutiny… other conspiracies are still regularly recommended.”
This includes conspiracies surrounding climate change and aliens, the report said.
The impact of these recommendations cannot be overstated, the researchers argue.
“With two billion monthly active users on YouTube, the design of the recommendation algorithm has more impact on the flow of information than the editorial boards of traditional media,” the paper stated.
They noted that the role of its recommendation engine is "even more crucial" in light of three specific aspects of the platform.
Firstly, it is increasingly used as a primary source of information, especially for young people, meaning the impact it has on their education is more deserving of scrutiny now than ever before.
Secondly, it has a practical monopoly on the web video market. There are no competitors who could conceivably force YouTube to change by outperforming it.
And thirdly, there is what the team described as the "ever-growing weaponisation of YouTube to spread disinformation and partisan content around the world".
Despite the importance of YouTube to modern society – which far exceeds that of most traditional, regulated media organisations – "the decisions made by the recommendation engine are largely unsupervised and opaque to the public", the scientists add.