Facebook has banned Britain First and its two leaders after months of pressure on the platform to tackle right-wing extremism. In a blog post released after the ban, Facebook argued that the official Britain First page, and the personal pages of leaders Paul Golding and Jayda Fransen, had “repeatedly broken our Community Standards” and ignored written warnings.
In an emailed statement, Facebook specified a handful of posts that violated its guidelines and prompted the ban. These included a photo of Britain First's leaders captioned "Islamaphobic and Proud", a post comparing Muslim immigrants to animals, and multiple videos posted deliberately to incite hateful comments against Muslims.
But the content and tone of these posts is nothing new. Britain First has a long history of using its Facebook page to openly incite hatred against Muslims, and its strategy has been consistent for years: incite hatred, target minorities and gradually expose people to extreme, right-wing views. Today's ban could just as easily have come at any point over the last few weeks, months or years. Nor is Britain First the only page on Facebook using such tactics to spread extreme views.
Holding Facebook to account on issues such as this is nigh-on impossible. Once the social network bans a page, nearly all record of its existence is wiped from the public domain. The real question: how different was the Britain First page yesterday from the same page a year ago?
So what’s changed? Well, for a start, Britain First’s public image had already fallen so far that defending its presence on Facebook was starting to look untenable. In November, Britain First attracted widespread condemnation, including from prime minister Theresa May, after US president Donald Trump retweeted anti-Muslim videos posted by deputy leader Jayda Fransen. In December, Twitter shut down Golding and Fransen’s accounts as part of the social media company’s crackdown on far-right hate speech, while YouTube placed tighter restrictions on Britain First’s video channel.
It also emerged that Darren Osborne, the man found guilty of murder and attempted murder after deliberately driving a van into a crowd of Muslims outside a London mosque in June 2017, had researched Britain First online and exchanged Twitter direct messages with Fransen in the weeks before his attack. Last week Golding and Fransen were jailed after being found guilty of religiously aggravated harassment. The pair are also facing further charges in Belfast, although that case has been delayed until they have served their current prison terms.
As all this went on, pressure was mounting on Facebook to do something about far-right hate speech on its platform. In February this year, a WIRED investigation found that far-right extremists with links to European paramilitary groups were using Facebook to target millions of users with propaganda. In December, the Home Affairs Select Committee accused Facebook of not doing enough to remove hate speech from its pages. Over the last year, the tide of popular opinion has turned against the company and its culture of complacency when it comes to hate speech, false news and possible Russian meddling in the 2016 US presidential election.
It was only this combination of external events that gave Facebook the impetus to act. Today’s ban is the right decision, but it is not solely about tackling hate speech. It is about Facebook being seen to tackle hate speech.
Facebook's community standards, with their vague definition of hate speech, conveniently enable this flexible approach to political extremism. Last week, Stephen Lennon, the former leader of the far-right English Defence League who posts under the pseudonym Tommy Robinson, shared a video of himself punching a so-called ‘migrant’ in the face. His Facebook page has almost 690,000 likes. A cluster of pages with links to the far-right activist and Britain First founder Jim Dowson churns out white nationalist and anti-Muslim memes to an audience of up to 2.4 million. Facebook insists that none of these pages violate its community guidelines and so, for now, they remain active.
So where does Facebook draw the line? "Keep calm, protect western values, protect western values [...] and exterminate terrorists and their supporters," reads one recent post, shared 220 times, from the Facebook page British Freedom. "Enoch was Right!" reads another recent post, shared 375 times.
When WIRED asked Facebook about pages linked to Dowson, a spokesperson for the social network said that while it was keeping a close eye on them, they didn't violate its community guidelines. Britain First's page, meanwhile, didn't suddenly become a place for dangerous, fringe views; it was founded on those principles. Only when it became too loud to ignore did Facebook finally take action. And the dangerous views being shared on Facebook continue, just out of sight. At noon today, the page English and Proud posted: "It's time to step on some toes and take our country back," linking to the membership page of Knights Templar International – an organisation that asks its members to join a "battle against Islam" and which funds anti-migration European border patrols.
If today's ban was the start of Facebook getting tough on extremism, then the English and Proud page, which has 330,000 likes, would be an ideal place to continue that crackdown.
Updated March 28, 2018: The article originally incorrectly stated that the border patrols with links to Knights Templar International were illegal. This reference has now been removed.
This article was originally published by WIRED UK