Is the YouTube algorithm’s promotion of #AlternativeFacts like Flat Earth having a real-world impact?


It has not been long since the Logan Paul controversy hit the internet, when people criticized YouTube’s algorithms and complained that they were still seeing recommendations for Logan Paul’s videos even after the offending video was taken down. Earlier this week, a “Flat Earth Conference” was held in Denver, Colorado, where some attendees talked about how YouTube had persuaded them to believe the flat earth theory. In fact, Logan Paul was also one of the conference’s keynote speakers, despite not believing that the Earth is flat.

The attendees were interviewed by The Daily Beast. At the conference, many participants said that they had come to believe in the Flat Earth theory based on YouTube videos. “It came on autoplay,” said Joshua Swift, a conference attendee. “So I didn’t actively search for Flat Earth. Even months before, I was listening to Alex Jones.”

Recently, NBA star Kyrie Irving also spoke about his obsession with the flat earth theory, blaming YouTube videos for it. Irving described having wandered deep down a “rabbit hole” on YouTube.

This has brought attention back to the recommendation system that YouTube uses. In a blog post, Guillaume Chaslot, an ex-Googler who helped build the YouTube algorithm, explains: “Flat Earth is not a ’small bug’. It reveals that there is a structural problem in Google’s AIs and they exploit weaknesses of the most vulnerable people, to make them believe the darnedest things.”


He also lists Flat Earth videos that were promoted on YouTube.


This raises the question: is the YouTube algorithm evil? The YouTube algorithm recommends videos based on watch time. More watch time means more revenue and more scope for targeted ads. What this changes is the fundamental concept of choice and the exercise of user discretion. The moment the algorithm treats watch time as the most important metric for recommendations, less weight goes to organic interactions on YouTube, such as liking, commenting, and subscribing to videos and channels.
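To make the point concrete, here is a minimal, purely illustrative sketch (not YouTube’s actual model; all field names and weights are hypothetical) of how ranking candidates solely by predicted watch time can produce a different winner than a score that also counts organic engagement:

```python
# Hypothetical candidate videos with a predicted watch time and
# per-view engagement rates (likes, comments). Illustrative numbers only.
candidates = [
    {"title": "conspiracy marathon", "expected_watch_minutes": 40,
     "like_rate": 0.01, "comment_rate": 0.005},
    {"title": "well-liked tutorial", "expected_watch_minutes": 12,
     "like_rate": 0.20, "comment_rate": 0.08},
]

def score_watch_time_only(video):
    """Rank purely by predicted watch time, the metric the article describes."""
    return video["expected_watch_minutes"]

def score_with_engagement(video, w_watch=0.2, w_like=0.5, w_comment=0.3):
    """Blend normalized watch time with organic signals (hypothetical weights)."""
    return (w_watch * video["expected_watch_minutes"] / 60
            + w_like * video["like_rate"]
            + w_comment * video["comment_rate"])

by_watch = max(candidates, key=score_watch_time_only)
by_blend = max(candidates, key=score_with_engagement)
print(by_watch["title"])  # the long-session video wins on watch time alone
print(by_blend["title"])  # the well-liked video wins once engagement counts
```

Under watch time alone, the long “rabbit hole” video wins every time; once liking and commenting are weighted in, the result can flip. The weights here are arbitrary, chosen only to show that the choice of metric, not the content, decides what gets recommended.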

Chaslot was fired by Google in 2013, officially over performance issues. He claims he had pushed to change the YouTube algorithm’s approach, to make it more aligned with democratic values rather than devoted solely to increasing watch time.
Chaslot has created Algotransparency, a site that scans and monitors YouTube recommendations daily.

Several Twitter users have also voiced support for Chaslot’s article.

Read Next

Is YouTube’s AI Algorithm evil?

YouTube has a $25 million plan to counter fake news and misinformation

YouTube went down, Twitter flooded with deep questions, YouTube back and everyone is back to watching cat videos

Content Marketing Editor at Packt Hub. I blog about new and upcoming tech trends ranging from Data science, Web development, Programming, Cloud & Networking, IoT, Security and Game development.