YouTube keeps recommending the same type of video even to people who have marked such a video with a “dislike”. That is the conclusion Mozilla reached after researching the thumbs-down and YouTube’s other feedback buttons.
Large-scale research
For its research, Mozilla used data from more than 20,000 YouTube users to see whether buttons such as ‘not interested’, ‘dislike’, ‘stop recommending this channel’ and ‘remove from watch history’ actually make a difference. They hardly do: you will still constantly see videos of the same caliber. At best the number is halved, but even then plenty of such videos get through the algorithm. Mozilla worked with a huge amount of data: more than 500 million recommended videos.
Mozilla used its RegretsReporter extension to gather this data. The extension gives you a kind of stop button on YouTube and, behind the scenes, sends YouTube one of the feedback signals above. But none of those options rids you of those types of videos, although ‘stop recommending this channel’ and ‘remove from watch history’ did prove slightly more effective.
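The core of the analysis is a simple comparison: after a user presses a feedback button, what fraction of later recommendations is still “similar”? The sketch below illustrates that tally in Python with entirely made-up records; the field names, data, and the `similar_rate_per_button` helper are hypothetical and not part of Mozilla’s actual pipeline.

```python
from collections import defaultdict

# Hypothetical records: (feedback_button_used, recommendation_was_still_similar).
# Invented for illustration; the real study drew on donations from 20,000+ users
# covering 500M+ recommended videos.
events = [
    ("dislike", True), ("dislike", True), ("dislike", False),
    ("not_interested", True), ("not_interested", False),
    ("stop_recommending_channel", True), ("stop_recommending_channel", False),
    ("remove_from_history", False), ("remove_from_history", False),
]

def similar_rate_per_button(events):
    """Fraction of post-feedback recommendations that were still 'similar'."""
    totals = defaultdict(int)
    similar = defaultdict(int)
    for button, still_similar in events:
        totals[button] += 1
        if still_similar:
            similar[button] += 1
    return {button: similar[button] / totals[button] for button in totals}

# Lower is better: a rate of 0% would mean the button fully worked.
for button, rate in sorted(similar_rate_per_button(events).items(),
                           key=lambda kv: kv[1]):
    print(f"{button}: {rate:.0%} of later recommendations still similar")
```

With the toy data above, ‘remove from history’ comes out best and ‘dislike’ worst, loosely mirroring the ordering the article describes, though the actual percentages here mean nothing.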
No respect for its users
Mozilla concludes that YouTube has no respect for user feedback or for the time users spend on the platform. YouTube, for its part, says the research doesn’t take into account how YouTube works at all. A remarkable statement, since in principle that is beside the point: pressing such a button should help, and it hardly does. YouTube’s objection, however, is mainly about what Mozilla labels as a similar video: according to YouTube, Mozilla counts videos as similar that the platform itself does not consider similar at all.
Moreover, YouTube does not want to simply exclude an entire topic for you, a spokesperson said. “Importantly, our controls don’t filter out entire topics or points of view, as this could have negative effects for viewers, such as creating echo chambers. We welcome academic research to our platform, so we recently expanded Data API access through our YouTube Researcher Program, however, Mozilla’s report doesn’t take into account how our systems actually work, so it’s difficult for us to gain a lot of insights.”