YouTube’s recommendation algorithm can lead you down some very weird rabbit holes, suggesting videos that feel strangely personal and off-target at the same time. Today, Mozilla is introducing a new browser extension, RegretsReporter, that aims to crowdsource research into users’ “regrettable recommendations” — helping users better understand how YouTube’s recommendation algorithm works and surfacing details on any patterns it detects.
Mozilla began collecting stories from users last year about the videos that YouTube recommended to them: one user searched for videos about Vikings and was recommended content about white supremacy; another searched for “fail” videos and was likewise served regrettable recommendations.
But there has not yet been a large-scale, independent effort to track YouTube’s recommendation algorithm and understand how it determines which videos to recommend, said Ashley Boyd, Mozilla’s vice president of advocacy and engagement.
“So much attention goes to Facebook – and deservedly – when it comes to misinformation,” Boyd said. “But there are other elements of the digital ecosystem that have not been monitored, and YouTube is one of them. We started looking at what YouTube said and how they curated content, and noticed that they responded to concerns about the algorithm and said they were making progress. But there was no way to confirm their claims.”
A YouTube spokesperson said in a statement to The Verge that the company is always interested in seeing research on its recommendation system. “That said, it’s hard to draw broad conclusions from anecdotal examples, and we are constantly updating our recommendation systems to improve the user experience,” the spokesperson said, adding that over the past year YouTube has launched “over 30 different changes to reduce recommendations of borderline content.”
The Google-owned video platform has promised on several occasions to fine-tune the algorithm, Boyd points out, even though company executives were aware that it recommended videos containing hate speech and conspiracy theories.
The browser extension sends data to Mozilla about how often you use YouTube, but without collecting information about what you are searching for or watching unless you specifically choose to share it. You can submit a report through the extension to provide more details about any “regrettable” video you encounter in your recommendations; the report lets Mozilla collect information about the video you are flagging and how you arrived at it.
Mozilla hopes the extension will make the “how” of YouTube’s recommendation algorithm more transparent: what types of recommended videos lead, for example, to racist, violent, or conspiratorial content, and whether there are patterns in how often harmful content is recommended.
“I would love for people to become more interested in how AI and in this case recommendation systems affect their lives,” Boyd said. “It does not have to be mysterious, and we can be clearer about how you can control it.”
Boyd stressed that users’ privacy is protected throughout the process. The data that Mozilla collects from the extension will be linked to a randomly generated user ID, not to a user’s YouTube account, and only Mozilla will have access to the raw data. The extension does not collect data in private browsing windows, and when Mozilla shares the results of its research, it will do so in a way that minimizes the risk of users being identified, Boyd said.
Mozilla does not have a formal arrangement with Google or YouTube for its research into the recommendation algorithm, but Boyd says they have been in communication with the company and plan to share their findings.
However, YouTube said the methodology Mozilla proposed seemed “questionable,” adding that, among other things, it could not properly vet how “regrettable” is defined.
Mozilla plans to spend six months collecting information from the extension, after which it will present its results to users and to YouTube. “We believe they are committed to this issue,” Boyd said of YouTube. “We would love it if they could learn something more from our research and make viable changes toward building more trustworthy content recommendation systems.”