A new study shows that YouTube’s plan to stop recommending conspiracy videos in its regular video feed is working.
Why This Matters
Reducing the number of conspiracy videos that YouTube automatically promotes to, and monetizes for, its regular visitors can only help the fight against false information and extremist ideologies.
Some background: Following criticism over its promotion of conspiracy videos (miracle cures, flat-earth claims, etc.), YouTube announced in January 2019 that it would crack down on such “borderline content.”
Where we are now: The researchers, from the University of California, Berkeley and the Mozilla Foundation, developed a system to classify whether a video is “conspiratorial,” then emulated YouTube’s Watch-Next algorithm to analyze a year’s worth of the recommendations it would actively promote. Marc Faddoul, Guillaume Chaslot, and Hany Farid found that there is, in fact, a reduction in the number of conspiracy-labeled videos actively recommended.
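To make the general approach concrete, here is a minimal sketch of that kind of measurement pipeline: score each recommended video with a classifier, then track the fraction flagged as conspiratorial over time. This is not the authors’ actual code; the function names, the naive keyword scorer, and the data format below are all illustrative assumptions.

```python
# Illustrative sketch only (not the study's pipeline): flag recommended videos
# with a stand-in classifier, then compute the monthly rate of flagged items.
from collections import defaultdict
from datetime import date

def is_conspiratorial(title: str, description: str, threshold: float = 0.5) -> bool:
    """Stand-in for a trained classifier over video metadata.

    A naive keyword score is used purely for illustration; the real study
    trained a model on richer signals than titles and descriptions.
    """
    keywords = ("flat earth", "hoax", "miracle cure", "cover-up")
    text = f"{title} {description}".lower()
    score = sum(kw in text for kw in keywords) / len(keywords)
    return score >= threshold

def monthly_conspiracy_rate(recommendations):
    """recommendations: iterable of (date, title, description) tuples
    harvested from emulated Watch-Next results."""
    totals, flagged = defaultdict(int), defaultdict(int)
    for day, title, description in recommendations:
        month = (day.year, day.month)
        totals[month] += 1
        if is_conspiratorial(title, description):
            flagged[month] += 1
    # Fraction of recommendations flagged as conspiratorial, per month.
    return {m: flagged[m] / totals[m] for m in sorted(totals)}

# Example usage with toy data:
sample = [
    (date(2019, 1, 5), "The earth is flat: proof", "hidden cover-up revealed"),
    (date(2019, 6, 12), "How vaccines work", "an overview of immunology"),
]
print(monthly_conspiracy_rate(sample))
```

A declining curve in that monthly rate is essentially the trend the researchers report.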
This is not solved: While the researchers are cautiously optimistic, they note that radicalization through such videos is a larger problem. “Those with a history of watching conspiratorial content can certainly still experience YouTube as a filter-bubble,” they wrote, “reinforced by personalized recommendations and channel subscriptions.”
The overall reduction of conspiratorial recommendations is an encouraging trend.
The bottom line: The researchers also note that the design of YouTube’s algorithm has even more impact on the flow of information than, say, an editorial board at a more traditional media outlet. Such a powerful tool, argue the authors of this study, should be subject to more transparency and public testing now and into the future.
Via: Gizmodo