Generative AI at work

Yesterday Facebook rolled out a policy banning white nationalist content from its platforms. The move is a significant step toward meeting longstanding demands from civil rights groups, who have said the tech giant was failing to confront the powerful reach of white extremism on social media.

The threat posed by social media's role in enabling white nationalism was violently underlined this month when a racist gunman killed 50 people at two mosques in New Zealand, using Facebook and other social media platforms to post live video of the attack. Facebook removed the video and the gunman's account soon after, but the footage had already been widely shared on Facebook, YouTube, Twitter, Reddit and the website 8chan.

In a blog post titled "Standing Against Hate," published on Wednesday, the company said the ban takes effect next week. As of midday Wednesday, the ban did not yet appear to be in force, based on searches for terms like "white nationalist," "white nationalist groups," and "blood and soil."

As part of its policy change, Facebook said it would divert users who searched for white supremacist content to Life After Hate, a nonprofit that helps people leave hate groups, and would improve its ability to use artificial intelligence and machine learning to combat white nationalism.

According to Motherboard's report, the platform will use content matching to delete images previously flagged as hate speech. The company did not elaborate on how that would work, including whether URLs to websites like 4chan and 8chan would be affected by the ban.

Facebook will not differentiate between white nationalism, white separatism and white supremacy

The company had previously banned white supremacist content from its platforms but maintained a murky distinction between white supremacy, white nationalism and white separatism. On Wednesday, it said that its views had been changed by civil society groups and experts in race relations and that it now believed “white nationalism and separatism cannot be meaningfully separated from white supremacy and organized hate groups.”


Kristen Clarke, the president of the Lawyers’ Committee for Civil Rights Under Law, which helped Facebook shape its new attitude toward white nationalism, said the earlier policy “left a gaping hole in terms of what it provided for white supremacists to fully pursue their platform.”

“Online hate must be confronted if we are going to make meaningful progress in the fight against hate, so this is a really significant victory,” Ms. Clarke said.

“It’s clear that these concepts are deeply linked to organized hate groups and have no place on our services,” Facebook said in a statement posted online. It later added, “Going forward, while people will still be able to demonstrate pride in their ethnic heritage, we will not tolerate praise or support for white nationalism and separatism.”

“Our policies have long prohibited hateful treatment of people based on characteristics such as race, ethnicity or religion — and that has always included white supremacy,” the company said in a statement. “We didn’t originally apply the same rationale to expressions of white nationalism and separatism because we were thinking about broader concepts of nationalism and separatism — things like American pride and Basque separatism, which are an important part of people’s identity.”

Civil rights groups welcome the ban but are waiting to see how it is implemented before fully endorsing Facebook's move

Facebook’s decision was praised by civil rights groups and experts in the study of extremism, many of whom had strongly disapproved of the company’s previous understanding of white nationalism.

Madihha Ahussain, a lawyer for Muslim Advocates, a civil-rights group, said the policy change was “a welcome development” in the wake of the New Zealand mosque shootings. But she said the company still had to explain how it will enforce the policy, including how it will determine what constitutes white nationalist content.

“We need to know how Facebook will define white nationalist and white separatist content,” she said. “For example, will it include expressions of anti-Muslim, anti-Black, anti-Jewish, anti-immigrant and anti-LGBTQ sentiment — all underlying foundations of white nationalism? Further, if the policy lacks robust, informed and assertive enforcement, it will continue to leave vulnerable communities at the mercy of hate groups.”

Mark Pitcavage, who tracks domestic extremism for the Anti-Defamation League, said the shift from Facebook was “a good thing if they were using such a narrow definition before.”

Mr. Pitcavage said the term white nationalism “had always been used as a euphemism for white supremacy, and today it is still used as a euphemism for white supremacy.” He called the two terms “identically extreme.”

He said white supremacists began using the term “white nationalist” after the civil rights movement of the 1960s, when the term “white supremacy” began to receive sustained scorn from mainstream society, including among white people.

“The less hard-core white supremacists stopped using any term for themselves, but the more hard-core white supremacists started using ‘white nationalism’ as a euphemism for ‘white supremacy,’” he said.

And he said comparisons between white nationalism and American patriotism or ethnic pride were misplaced.

“Whiteness is not an ethnicity, it is a skin color,” Mr. Pitcavage said. “And America is a multicultural society. White nationalism is simply a form of white supremacy. It is an ideology centered on hate.”

Color of Change, a progressive nonprofit civil rights advocacy group, called Facebook's new moderation policy a critical step forward.

“Color Of Change alerted Facebook years ago to the growing dangers of white nationalists on its platform, and today, we are glad to see the company’s leadership take this critical step forward in updating its policy on white nationalism,” the statement reads. “We look forward to continuing our work with Facebook to ensure that the platform’s content moderation guidelines and training properly support the updated policy and are informed by civil rights and racial justice organizations.”


As a Senior Content Marketing Editor at Packt Publishing, I handle a vast array of content in the tech space, ranging from data science, web development, programming, cloud & networking, IoT and security to game development. With prior experience in and understanding of marketing, I aspire to grow by leaps and bounds in the content and digital marketing field. On the personal front, I am an ambivert and love to read inspiring articles and books on life in general.