
On Wednesday, YouTube announced that it will remove thousands of videos and channels advocating bigoted ideologies such as white supremacy and neo-Nazism. The move comes under a new policy to tackle hate and extremist views on the platform, as YouTube faces intense scrutiny over its user standards. Under the policy, the company said in a blog post, it will ban “videos alleging that a group is superior in order to justify discrimination, segregation or exclusion.”

“It’s our responsibility to protect that, and prevent our platform from being used to incite hatred, harassment, discrimination and violence,” reads the blog.

The policy also covers videos pushing conspiracy theories alleging that violent incidents, like mass shootings, did not take place. For example, InfoWars host Alex Jones repeatedly pushed a conspiracy theory on YouTube that the mass shooting at Sandy Hook Elementary School in Connecticut was a hoax, and alleged that distraught families of the victims were crisis actors.

Last month, Facebook and Instagram banned about six particularly toxic extremist accounts and one conspiracy theorist organization, while Twitter permanently banned InfoWars and Jones in September last year for violating its harassment policies.

How YouTube is implementing the new anti-bigotry policy

“YouTube has always had rules of the road, including a longstanding policy against hate speech,” a company statement said.

“Today, we’re taking another step in our hate speech policy by specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status.”

“We will begin enforcing this updated policy today; however, it will take time for our systems to fully ramp up and we’ll be gradually expanding coverage over the next several months,” YouTube said.

In January, YouTube said it would stop recommending videos such as those claiming the earth is flat or promoting bogus theories about the September 11, 2001 terror attacks or the 2012 killings at the Sandy Hook elementary school in Connecticut. But it stopped short of banning such content.

YouTube said it would look for ways to preserve some of the violent content so that it remains available to researchers.

The latest move is likely to eliminate numerous “channels” that use the platform for monetization.

“We have long standing advertiser-friendly guidelines that prohibit ads from running on videos that include hateful content and we enforce these rigorously,” the statement said.

“Channels that repeatedly brush up against our hate speech policies will be suspended from the YouTube Partner program, meaning they can’t run ads on their channel or use other monetization features.”

The new policy may be a response to sustained calls from lawmakers around the world to curb online hate

The move comes after calls by world leaders to curb extremism online, following the live-streamed Christchurch attack in New Zealand in March this year. Last month, eight tech companies including Google, along with 18 governments, signed a non-binding agreement to curb online extremism.

The Southern Poverty Law Center, which tracks white supremacists and other extremist groups, says the ban will be positive only if YouTube enforces it.

“As with other outlets before it, YouTube’s decision to remove hateful content depends on its ability to enact and enforce policies and procedures that will prevent this content from becoming a global organizing tool for the radical right,” said the group’s intelligence director Heidi Beirich.

“Tech companies must proactively tackle the problem of hateful content that is easily found on their platforms before it leads to more hate-inspired violence.”

Such moves by social media companies have prompted criticism among right-wing activists in the United States, and President Donald Trump has claimed that online platforms are seeking to suppress conservative voices. Last month, Trump also launched a new tool that lets US citizens complain about social media bias and share their stories. It is also worth noting that the Trump administration did not sign the non-binding agreement to curb online extremism.

Fallout of an algorithm-driven content moderation policy

With the new policy rollout, YouTube tried to tackle hate by removing harmful content. But it seems to have also removed content that did not promote hate but studied it. The platform did not distinguish between good and bad content, relying on algorithms to judge and suspend accounts entirely. For instance, as part of this ban, a history professor who had uploaded clips on Nazi policy featuring propaganda speeches by Nazi leaders was banned from YouTube. He shared his disappointment on Twitter, saying that 15 years of material for the history teacher community had ended abruptly.

Policy enforcement is insufficient – the case of Vox host Carlos Maza

YouTube did not disclose the names of any groups or channels that may be banned. But according to a BuzzFeed report on June 4, Vox host Carlos Maza expressed frustration with YouTube for not adequately enforcing its harassment policies. Maza has faced ongoing racist and anti-gay harassment from right-wing YouTube personality Steven Crowder on the platform for years. Crowder has approximately 4 million subscribers on YouTube and has uploaded more than 1,000 videos to date.

Maza wrote a viral Twitter thread last week describing the harassment he has experienced from Crowder and his followers.

Crowder has published a number of videos mocking Maza, calling him a “lispy queer” and making other racist and anti-gay comments. Maza, who hosts the Vox show Strikethrough, said he and Vox have directly reached out to YouTube for the past two years “and have gotten no action at all from them.”

“Steven Crowder is not the problem. Alex Jones isn’t the problem. These individual actors are not the problem,” Maza said. “They are symptoms and the product of YouTube’s design, which is meant to reward the most inflammatory, bigoted, and engaging performers.”

“I think YouTube should enforce its policies, and if it has a policy against hate speech, and bullying and harassment, it should enforce those policies,” he said.

“They haven’t done anything so far and I’ll tell you right now there’s a 0.0% chance YouTube punishes Crowder at all. Nothing is going to happen because Crowder is good for engagement.”

Andrew Surabian, a Republican strategist and former White House aide, said the move suggests YouTube has caved in to pressure from activists.

“If that’s their new standard, will they now demonetize all the rap videos with homophobic slurs on their platform?” he said on Twitter.

Nilay Patel, editor-in-chief of The Verge, says it is weird that conservatives think any action taken against racism, sexism, or homophobia is also an attack on conservative speech.

Yesterday, The Verge reported that Crowder’s channel is still operational, and YouTube later said on Twitter that it had suspended monetization of Crowder’s channel, barring him from earning YouTube ad revenue.

https://twitter.com/kevinroose/status/1136342955030130688

“We came to this decision because a pattern of egregious actions has harmed the broader community and is against our YouTube Partner Program policies,” the company said.

Public sentiment on the new policy

YouTube later tweeted: “Today’s announcement generated a lot of questions and confusion. We know it hasn’t been easy for everyone. Going forward, we’ll be taking a closer look at our own harassment policies, with the aim to update them.”

There has been a surge of reactions on Twitter about this, and most notably about how the company changed its stance on the hate speech policy within two days of the announcement. Users are frustrated with YouTube’s response and do not trust the company to keep its promises.

Critics argue that YouTube demonetizing Crowder’s channel does not solve the problem of hate-filled rhetoric continuing to spread on the channel, given its size. They point out that such channels do not rely primarily on YouTube’s payments to function – the real revenue stream comes from the merchandise sold to the channel’s audience.

Update on June 24 – Google warns its employees that Pride protests are against the company’s code of conduct

Yesterday, The Verge and various other sources reported that Google employees are allowed to peacefully protest YouTube or Google during the Pride parade — as long as they are not marching with Google in an official capacity. According to internal memos sent to employees, anyone who chooses to walk the parade as a representative of Google and voice any protest will be considered in violation of Google’s code of conduct.

The decision to stifle a would-be protest is frustrating to some employees, who see it as especially ironic given YouTube’s dedication to free speech. “YouTubers who use our platform and sometimes get significant revenue get to claim free speech to keep using our platform … but LGBT Googlers get no free speech to say that Google doesn’t represent us,” one tells The Verge. “That’s ironic at best, but hypocritical … specifically ironic trying to curb our speech on the 50th anniversary of the Stonewall march riots.”

