YouTube announced an update to its recommendation system last week. Under the new policy, YouTube aims to reduce recommendations of videos that promote misinformation (e.g., conspiracy videos, false claims about historical events, flat-earth videos) and could harm users, in order to improve the experience on the platform.
YouTube states that the change will be gradual and will currently apply to less than 1% of videos on the platform. “To be clear, this will only affect recommendations of what videos to watch, not whether a video is available on YouTube. As always, people can still access all videos that comply with our Community Guidelines”, states the YouTube team.
YouTube is also working on reducing the spread of content that “comes close” to violating its Community Guidelines. The new approach combines machine learning with human evaluators and experts from across the United States, who help train the machine learning systems responsible for generating recommendations. Evaluators are trained using public guidelines and offer input on the quality of a video. For now, the change applies only to a small set of videos in the US, as the machine learning systems are not yet very accurate. The update will roll out to other countries once the systems become more reliable.
YouTube continually updates its systems to improve the user experience on its platform. For instance, YouTube has taken steps against clickbait content in the past, shifting its focus toward viewer satisfaction rather than raw view counts and recommending clickbait videos less often. The YouTube team also notes that the platform now presents recommendations from a wider set of topics (rather than many similar recommendations) and that hundreds of changes have been made to optimize the quality of recommendations for users.
“It’s just another step in an ongoing process, but it reflects our commitment and sense of responsibility to improve the recommendations experience on YouTube. We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users”, states the YouTube team.
Public reaction to this news has been mixed, with some calling YouTube’s move ‘censorship’ while others appreciate it:
Not okay at all. Who determines what to label as a conspiracy? This is censorship. Again. Of course, they have all right to do so. But if this continues, it might be time to look at alternatives.
— Purresnol (@Purresnol) January 26, 2019
…I get it, but who decides what is misinformation and how does that affect freedom of speech? And will it be different in different countries based on their respective ‘values’
— Tracy Green (@therealTTG) January 25, 2019
It's about time because I'm tired of all those Trump ads popping up on my screen.
— NomoBS (@nomo_BS) January 26, 2019
Why are conspiracy theories even allowed anywhere? That's literally fake news!
— Matthew Wheeler (@Mattlennial) January 26, 2019
Good! Social media platforms have a responsibility to keep misinformation from spreading rampantly
— SkyPolitic (@politic_sky) January 26, 2019