YouTube's 'Dislike' and 'Not Interesting' buttons barely work, study finds

Even when users tell YouTube they're not interested in certain types of videos, similar recommendations keep coming in, a new study from Mozilla has found.

Using video recommendation data from more than 20,000 YouTube users, Mozilla researchers found that buttons such as "Not interested", "Dislike", "Stop recommending channel", and "Remove from viewing history" are largely ineffective at preventing similar content from being recommended. According to the report, even at their best these buttons still allow more than half of the recommendations that resemble videos a user said they weren't interested in. At worst, the buttons barely made a dent in blocking similar videos.

To collect real video and recommendation data from users, Mozilla researchers enlisted volunteers who used the foundation's RegretsReporter, a browser extension that overlays a generic "stop recommending" button on YouTube videos watched by participants. On the back end, users were randomly assigned to groups, so each click on Mozilla's button sent a different signal to YouTube depending on the group: dislike, not interested, don't recommend channel, or remove from history. A fifth control group sent no feedback to the platform at all.
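The article doesn't show the extension's internals, so the sketch below is only a guess at how such a randomized design might be wired up; the Arm names, assignArm, and sendFeedbackSignal are invented for illustration, not RegretsReporter's actual code.

```typescript
// Hypothetical sketch of randomized arm assignment for a feedback-button
// experiment; all names here are illustrative, not RegretsReporter's code.

type Arm =
  | "dislike"
  | "not_interested"
  | "dont_recommend_channel"
  | "remove_from_history"
  | "control";

const ARMS: Arm[] = [
  "dislike",
  "not_interested",
  "dont_recommend_channel",
  "remove_from_history",
  "control",
];

// Each participant is assigned one arm and keeps it, so every click from
// that user maps to the same feedback signal.
function assignArm(): Arm {
  return ARMS[Math.floor(Math.random() * ARMS.length)];
}

// Stand-in for triggering the matching YouTube UI control.
function sendFeedbackSignal(arm: Arm, videoId: string): void {
  console.log(`sending "${arm}" feedback for video ${videoId}`);
}

// Handler for the overlaid "stop recommending" button.
function onStopRecommendingClick(arm: Arm, videoId: string): void {
  if (arm === "control") return; // control arm: record the click, send nothing
  sendFeedbackSignal(arm, videoId);
}

onStopRecommendingClick(assignArm(), "abc123");
```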

Using data collected from more than 500 million recommended videos, research assistants created more than 44,000 pairs of videos: one "rejected" video plus a video subsequently recommended by YouTube. The researchers then assessed the pairs themselves or used machine learning to decide whether the recommendation was too similar to the video the user had rejected.
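The article doesn't specify the similarity model, so the sketch below substitutes a deliberately simple bag-of-words cosine over video titles to show the shape of the pairwise task; isBadRecommendation and the 0.5 threshold are invented for illustration.

```typescript
// Toy stand-in for the study's pairwise similarity check: score a
// (rejected video, later recommendation) pair by cosine similarity of
// bag-of-words vectors built from titles. Mozilla's pipeline used human
// raters and a trained model; this only illustrates the shape of the task.

function bagOfWords(text: string): Map<string, number> {
  const counts = new Map<string, number>();
  for (const token of text.toLowerCase().split(/\W+/).filter(Boolean)) {
    counts.set(token, (counts.get(token) ?? 0) + 1);
  }
  return counts;
}

function cosine(a: Map<string, number>, b: Map<string, number>): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (const [token, weight] of a) {
    dot += weight * (b.get(token) ?? 0);
    normA += weight * weight;
  }
  for (const weight of b.values()) {
    normB += weight * weight;
  }
  return normA && normB ? dot / Math.sqrt(normA * normB) : 0;
}

// A pair counts as a "bad recommendation" if the later recommendation is
// too similar to the rejected video. The 0.5 threshold is arbitrary.
function isBadRecommendation(rejectedTitle: string, recommendedTitle: string): boolean {
  return cosine(bagOfWords(rejectedTitle), bagOfWords(recommendedTitle)) > 0.5;
}

console.log(isBadRecommendation(
  "Daily low-carb diet tips",
  "Low-carb diet tips for beginners",
)); // true
```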

Sending "dislike" and "not interested" signals was only "marginally effective," preventing 12 percent and 11 percent of bad recommendations, respectively, compared to the baseline control group. The "don't recommend channel" and "remove from history" buttons were somewhat more effective, preventing 43 percent and 29 percent of bad recommendations, but researchers say the tools offered by the platform are still inadequate for steering users away from unwanted content.
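Reading "prevented X percent of bad recommendations compared to the baseline control group" as a relative reduction, the arithmetic would look like this; the formula and the example rates are our interpretation of the report's metric, not a quoted method:

```typescript
// Effectiveness as relative reduction versus control: if the control group
// sees bad recommendations at rate rControl and a treatment group at rate
// rTreatment, the button "prevented" 1 - rTreatment / rControl of them.
function relativeReduction(rControl: number, rTreatment: number): number {
  return 1 - rTreatment / rControl;
}

// Hypothetical example: a control rate of 20% and a treatment rate of
// 11.4% would be reported as preventing ~43% of bad recommendations.
console.log(relativeReduction(0.2, 0.114).toFixed(2)); // "0.43"
```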

"YouTube should respect the feedback users share about their experience, treating them as meaningful clues about how people want to spend their time on the platform," the researchers wrote.

YouTube spokeswoman Elena Hernandez says these behaviors are intentional because the platform does not attempt to block all content related to a topic. But Hernandez criticized the report, saying it doesn't take into account how YouTube's controls are designed.

“Importantly, our controls do not filter out entire topics or viewpoints, as this could have negative effects for viewers, like creating echo chambers,” Hernandez told The Verge. “We welcome academic research on our platform, which is why we recently expanded Data API access through our YouTube Researcher Program. Mozilla's report doesn't take into account how our systems actually work, and so it's difficult for us to glean many insights.”

Hernandez says Mozilla's definition of "similar" fails to consider how YouTube's recommendation system works. The "Not Interested" option removes a specific video, and the "Don't Recommend Channel" button prevents the channel from being recommended in the future, Hernandez says. The company says it does not aim to block all recommendations related to a topic, opinion, or speaker.

Apart from YouTube, other platforms like TikTok and Instagram have introduced more and more feedback tools to train algorithms to show relevant content to users. But users often complain that even after flagging something they don't want to see, similar recommendations remain. Mozilla researcher Becca Ricks says it's not always clear what the various controls actually do, and the platforms aren't transparent about how feedback is taken into account.

“I think in the case of YouTube, the platform is balancing user engagement with user satisfaction, which is ultimately a tradeoff between recommending content that drives people to spend more time on the site and content the algorithm thinks people will like,” Ricks told The Verge via email. “The platform has the power to change which of these signals receives the most weight in its algorithms, but our study shows that user feedback may not always be the most important.”
