It has not been long since the Logan Paul controversy hit the internet, when people criticized YouTube's algorithms and complained that they were still seeing recommendations for Logan Paul's videos even after the video was taken down. Earlier this week, a “Flat Earth Conference” was held in Denver, Colorado, where some attendees talked about how YouTube persuaded them to believe the flat Earth theory. In fact, Logan Paul was one of the conference’s keynote speakers, despite not believing that the Earth is flat.
The attendees were interviewed by The Daily Beast. At the conference, many participants told The Daily Beast that they had come to believe in the flat Earth theory because of YouTube videos. “It came on autoplay,” said Joshua Swift, a conference attendee. “So I didn’t actively search for Flat Earth. Even months before, I was listening to Alex Jones.”
Recently, NBA star Kyrie Irving also spoke about his obsession with the flat Earth theory, blaming YouTube videos for it. Irving spoke of having wandered deep down a “rabbit hole” on YouTube.
This has brought the spotlight back onto the recommendation system that YouTube uses. In a blog post, Guillaume Chaslot, an ex-Googler who helped build the YouTube algorithm, explains: “Flat Earth is not a ’small bug’. It reveals that there is a structural problem in Google’s AIs and they exploit weaknesses of the most vulnerable people, to make them believe the darnedest things.”
He mentions a list of Flat Earth videos that were promoted on YouTube.
This raises the question: is the YouTube algorithm evil? The YouTube algorithm recommends videos based on watch time. More watch time means more revenue and more scope for targeted ads. What this changes is the fundamental concept of choice and the exercise of user discretion. The moment the YouTube algorithm treats watch time as the most important metric for recommending videos, less importance is given to organic interactions on YouTube, such as liking, commenting, and subscribing to videos and channels.
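To make that distinction concrete, here is a minimal, purely illustrative sketch of a ranker that scores candidates only by predicted watch time. Every name, field, and number in it is an assumption for demonstration, not YouTube's actual implementation.

```python
# Illustrative sketch of a watch-time-first ranker -- NOT YouTube's real code.
# All fields and weights here are assumptions for demonstration only.
from dataclasses import dataclass

@dataclass
class Candidate:
    video_id: str
    predicted_watch_seconds: float  # model's estimate of time-on-video
    likes: int
    comments: int

def rank_by_watch_time(candidates: list[Candidate]) -> list[Candidate]:
    # Watch time dominates; organic signals (likes, comments) are ignored.
    return sorted(candidates, key=lambda c: c.predicted_watch_seconds, reverse=True)

videos = [
    Candidate("science_explainer", predicted_watch_seconds=180, likes=9000, comments=1200),
    Candidate("flat_earth_marathon", predicted_watch_seconds=2400, likes=300, comments=80),
]

# The long, binge-prone video wins despite far weaker organic engagement.
print([c.video_id for c in rank_by_watch_time(videos)])
```

Under this (assumed) objective, a long video that keeps people watching outranks a shorter one with far stronger likes and comments, which is exactly the dynamic critics describe.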
Chaslot was fired by Google in 2013 over performance issues. He claims he wanted to change the approach of the YouTube algorithm to make it more aligned with democratic values, instead of devoting it solely to increasing watch time.
Chaslot has created AlgoTransparency, a site that scans and monitors YouTube recommendations daily.
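The general idea behind that kind of monitoring is to start from seed videos, repeatedly follow the “up next” recommendations, and count how often each video is surfaced. The sketch below is a hedged illustration of that crawl idea only; `get_recommendations` and the toy graph are hypothetical stand-ins, and no real YouTube endpoint or AlgoTransparency code is used here.

```python
# Conceptual sketch of a recommendation-graph crawl, similar in spirit to
# monitoring tools like AlgoTransparency. The graph below is made up.
from collections import Counter, deque

# Toy recommendation graph standing in for real scraped/API data.
FAKE_RECS = {
    "seed": ["conspiracy_a", "news_clip"],
    "news_clip": ["conspiracy_a", "sports"],
    "conspiracy_a": ["conspiracy_b", "conspiracy_b"],
    "conspiracy_b": ["conspiracy_a"],
}

def get_recommendations(video_id: str) -> list[str]:
    # Hypothetical stand-in: a real tool would scrape the watch page
    # or call an API here instead of reading a hard-coded dict.
    return FAKE_RECS.get(video_id, [])

def crawl(seeds: list[str], depth: int = 3) -> Counter:
    """Follow 'up next' recommendations breadth-first, counting how
    often each video is surfaced along the way."""
    counts: Counter = Counter()
    queue = deque((v, 0) for v in seeds)
    seen = set(seeds)
    while queue:
        vid, d = queue.popleft()
        if d >= depth:
            continue
        for rec in get_recommendations(vid):
            counts[rec] += 1  # every appearance counts as one recommendation
            if rec not in seen:
                seen.add(rec)
                queue.append((rec, d + 1))
    return counts

# Videos sitting in tight recommendation loops rack up the most counts.
print(crawl(["seed"]).most_common(3))
```

Running this toy crawl, the videos that recommend each other in a loop accumulate the highest counts, which is how such monitoring can surface what a recommender pushes hardest.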
Other Twitter users have also supported Chaslot’s article.
YouTube is tilting the world towards crazier and crazier beliefs in dozens if not hundreds of languages that their engineers don't even speak. Ex-YT recommendations engineer @gchaslot has endlessly raised awareness about it yet YT has barely taken steps to do anything about it. https://t.co/xcVirVZ4Wi
— Tristan Harris (@tristanharris) November 20, 2018
I've been obsessed with this subject for the last couple of months, I clicked on a video out of curiosity and my suggestions were flooded by flat earth content. It's a viral infection attacking science and reality, driven by recommendations and clicks
— Andres Guadamuz (@technollama) November 19, 2018
Let's not let @facebook off the hook, either. It's all part of the same ecosystem with the same toxic algorithmic amplification. Plus Groups!
— Siva Vaidhyanathan🗽🤘🏽 (@sivavaid) November 19, 2018