YouTube Has A Massive Video Recommendation Problem?
A study shows that YouTube recommends videos that often violate the platform’s own rules on disturbing and hateful content, as well as misinformation.

YouTube’s content recommendation algorithms are under fire for, as it turns out, often violating the platform’s own rules on disturbing and hateful content, as well as misinformation. This comes as the Mozilla Foundation releases a study which also suggests that YouTube users in non-English-speaking countries are far more likely to be recommended videos that may otherwise be flagged as disturbing. In fact, Mozilla says that as many as 189 videos recommended by YouTube to users who took part in the study have since been removed from the platform for violating its policies. Unfortunately, those videos had clocked as many as 160 million views before they were removed. YouTube, in a statement issued to NBC News, said that more than 30 changes have been implemented over the past year to reduce recommendations of harmful videos.

The Mozilla report comes after a 10-month crowdsourced investigation in which volunteers reported data using an open-source browser extension called RegretsReporter. The researchers say this is the largest-ever crowdsourced investigation into YouTube’s algorithms. The videos reported back include misinformation and fearmongering about COVID-19, political misinformation, and animations categorized as children’s cartoons but wildly inappropriate for that audience. The Mozilla Foundation says that 71% of all the videos volunteers reported were recommended to them by YouTube. Non-English-speaking countries are hit harder, with the reported rate of regrettable videos as much as 60% higher in those regions.

“YouTube needs to admit their algorithm is designed in a way that harms and misinforms people,” says Brandi Geurkink, Mozilla’s Senior Manager of Advocacy. “Our research confirms that YouTube not only hosts, but actively recommends videos that violate its very own policies,” adds Geurkink. The examples and illustrations indicate there is a massive problem YouTube has to deal with. One user reported watching a video about the U.S. military, after which YouTube recommended a video titled “Man humiliates feminist in viral video”. Another person watched a video about software rights and was then recommended a video about gun rights. According to the data, the regret rate per 10,000 videos watched is highest in non-English-speaking countries; Brazil leads the way, while India figures in the top 10.
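To make the “regret rate per 10,000 videos” figure concrete, here is a minimal sketch of that normalization in Python. The function name and the sample counts are hypothetical illustrations; Mozilla’s report does not publish its raw per-country numbers here, only the roughly 60% gap.

# Minimal sketch of a "regret rate per 10,000 videos" normalization.
# All names and sample values below are hypothetical illustrations.

def regret_rate_per_10k(regrets_reported: int, videos_watched: int) -> float:
    """Regrettable videos reported per 10,000 videos watched."""
    if videos_watched <= 0:
        raise ValueError("videos_watched must be positive")
    return regrets_reported / videos_watched * 10_000

# Hypothetical cohorts: the gap mirrors the report's headline claim
# that regret rates were up to 60% higher in non-English-speaking regions.
english = regret_rate_per_10k(regrets_reported=70, videos_watched=100_000)      # 7.0
non_english = regret_rate_per_10k(regrets_reported=112, videos_watched=100_000) # 11.2
print(f"{non_english / english - 1:.0%} higher")  # -> 60% higher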

In 2019, YouTube had said that it was continuing to improve recommendations on the platform. “When recommendations are at their best, they help users find a new song to fall in love with, discover their next favorite creator, or learn that great paella recipe. That’s why we update our recommendations system all the time—we want to make sure we’re suggesting videos that people actually want to watch,” the company had said. Even for videos that were taken down after clocking more than 160 million views, as with the 189 videos Mozilla’s report describes, there often aren’t explanations as to why a video is no longer available. “In nearly 40% of the cases we analyzed, videos were simply labeled ‘video unavailable,’ without identifying the reason why they were taken down,” the researchers say. In other cases, there was mention of Community Guidelines violations, copyright infringement, hate speech laws, or the video having been deleted or made private by the uploader.
