White nationalism and white separatism were banned from Facebook in March. So why are white nationalist groups still operating on Facebook?
The social media giant was supposed to enact its ban the first week in May.
A Facebook spokesperson told the Daily Dot that the platform removed several groups that were violating some of its community standards, including its standards on “dangerous individuals and organizations” and “misrepresentation.”
This means the groups removed were involved in terrorist activity, organized hate, mass or serial murder, human trafficking, or organized violence or criminal activity, or had misrepresented their identity on Facebook.
“We proactively look for bad actors, and investigate concerns when they are raised. We are continually reviewing activity on our platform for potential violation of our policies and will take action in line with our Community Standards,” the spokesperson said.
Megan Squire, an online extremism researcher, pointed out the continued presence of these groups in a tweet.
This KKK group is still posting and planning events on Youtube, Facebook, and Twitter. Reports aren't taken seriously (see screens in comments). Now the latest event is being called a "powder keg". What is the role of the social media companies in letting groups like this fester? https://t.co/WLwehgyesX— megan squire (@MeganSquire0) May 23, 2019
After these white supremacist groups were banned from the platform, some changed their names and resumed operating under a new guise.
Squire told the Daily Dot that the Proud Boys altered its name so it could continue operating on the platform. One of the new accounts is called “PB Canada,” another is called “West is the Best II: Electric Boogaloo,” according to the researcher.
Update 9:30am CT, June 1: Squire said that both of the new Proud Boys pages are now gone from Facebook.
Another group banned was Soldiers of Odin. A member of that group, whose account wasn’t banned, renamed their personal Facebook account “S.O.O.Recruiting.Sudbury” to continue operations for the group.
Another group called “Sons of Odin” is also operating on the platform.
Another tactic used by the white nationalist groups is to create secret groups to evade detection, according to the researcher.
“(The groups) create brand pages and then using those to create groups, instead of creating groups using their personal account. This helps them evade detection as the true group owner,” Squire said.
Squire said these groups are still recruiting new members, spreading extremism, as well as organizing events and rallies on the platform.
For example, a Twitter user posted screenshots of a Facebook user organizing the recent KKK rally in Dayton, Ohio, on the platform.
5/ So a few weeks ago, Derek Eaglin put out a Facebook post, asking people to confirm if they were coming, since they'd need to give the police a headcount, and as Katy Eaglin notes, how many vehicles they'd need the cops to block off street parking for. pic.twitter.com/2xEiIr5Rdr— AntiFash Gordon (@AntiFashGordon) May 23, 2019
When enforcing its white supremacy ban, Facebook looks for explicit, rather than implicit, statements, which is why it is not always possible for Facebook to analyze content, the spokesperson said.
“What concerns me is that (Facebook doesn’t) try very hard to keep banned pages or banned groups from re-emerging on the platform under a slightly different name,” Squire said. “They don’t seem to be engaging experts who are aware of how to track this.”
The spokesperson also admitted that Facebook’s efforts to remove white supremacist material aren’t perfect.
Facebook’s policy team has been working to better understand hate symbols and slogans affiliated with white nationalism and separatism so that it can recognize bad actors and take appropriate action, according to the spokesperson.
Squire said that while it’s a good practice to understand these slogans and symbols, Facebook needs to be doing more, such as developing a team of white supremacy experts to help.
“It’s not a matter of memorizing slogans and symbols,” Squire said. “Understanding hate in 2019 means deep attention to memes and language that are changing on a near-daily basis. It is very time-consuming to stay current in the ‘hate scene’ as it exists today.”
It was also reported on Wednesday that Twitter is now researching whether white supremacy should be banned from its own platform.
“Individuals and organizations who spread hate, attack, or call for the exclusion of others on the basis of who they are have no place on our services,” the Facebook spokesperson said.
H/T BuzzFeed News