How Telegram became a safe haven for pro-terror Nazis

Several dozen groups are disseminating white supremacist propaganda and videos of lynchings and shootings

Totally encrypted and largely unmonitored, the messaging app Telegram was created to provide uncensored communication between citizens of autocratic countries. Unfortunately, while it counts hundreds of millions of users, the platform has grown most infamous as a safe haven for terrorists.

Now, a new report from the political action group Hope not Hate has found that the platform is playing host to several dozen Nazi channels.

These public and private chat groups, which post predominantly in English or Ukrainian and are mostly US-based, with a handful of UK groups, dub themselves the “Terrorgram”. The groups are highly interconnected, often reposting content from each other’s channels.

They draw influence from existing far-right terror groups like the Atomwaffen Division, the now defunct Nazi webforum Iron March, and the writings of American neo-Nazi James Mason. The groups disseminate white supremacist propaganda, videos of lynchings and shootings, survivalist and guerrilla training manuals, and instructions for manufacturing weapons, carrying out attacks and evading detection.

The groups also canonise other famous terrorists as “saints”. Murderers who have received this designation include David Copeland, the 1999 London nail bomber, Anders Breivik the perpetrator of the 2011 Utoya attack in Norway, as well as unexpected choices like the Islamist terrorist Omar Mateen.

These posts often follow a particular aesthetic, known as “terrorwave”. In the report’s words: “red, white and black, the style often incorporates images of historical fascists, terrorists or paramilitaries wearing skull masks, alongside esoteric far-right symbols and simplistic slogans (such as ‘TRAITORS WILL HANG’ and ‘RAPE THE POLICE’).”

The report’s author, David Lawrence, explains that though Telegram has long been used by the far right to communicate, there has been a noticeable surge in the channels and their numbers of users since the Christchurch massacre on March 15, 2019. The report points to the SITE Intelligence Group, which found that 80 per cent of a select sample of 374 far-right Telegram channels and groups were created between the March 15 massacre and October 30, 2019.

The number of users in this community likewise increased, according to SITE data published in the Washington Post: a sample of far-right channels created in May 2019 collectively increased their memberships by 117 per cent – from 65,523 to 142,486 – by the end of October. The biggest Terrorgram groups have accrued over 4,000 followers in under a year.

Another reason for this surge, the report explains, is that many major social media companies have grown stricter, particularly after the far-right rally in Charlottesville. (In March of last year, for instance, Facebook said it was banning white nationalist and white separatist content from its platform.) These banned users often migrate to the comparative safe haven of encrypted Telegram channels.

“Telegram affords them far more privacy than they would get on other social media platforms,” explains Bharath Ganesh, a political geographer at the University of Groningen. “A private Facebook group can still be useful, but they know that Facebook will be monitoring that content.”

To be clear, though the Terrorgram groups are highly active, they are relatively small. Yet they still represent a serious risk, says Lawrence. “There are organisations that have formed that have been linked to murders – like Atomwaffen, which is linked to five murders in the US – that are using these platforms to get their message out.”

They also represent the most extreme end of far right activism. “It’s important to understand that this is one slice of a very, very big pie,” says Ganesh. “The bigger context is that we are seeing more and more of these quote-unquote ‘lone actors’, who aren’t always deeply engaged in these kinds of Telegram spaces, but tend to be drawing information and content that they find online from a wide variety of places.”

Nevertheless, we should regard these groups and their rhetoric as part of the same ecosystem inhabited by less extreme, more socially acceptable far-right activists on mainstream platforms, like Twitter. “On mainstream platforms, far right actors use emotion, particularly rage and anger, to spread extreme world views,” says Ganesh, citing research he published this year. “While neo-Nazis are quite explicit about their support for white supremacy on Telegram, on platforms like Twitter we see that far right actors instead try to argue that white identity is under attack.”

WIRED requested comment from Telegram (through Telegram); while the company declined to comment on the record, it is understood that it may take action against the channels. Hope not Hate argues that Telegram must take responsibility for the actions of people who use its platform, and apply the same hardline stance to Nazi groups as it has taken against the so-called Islamic State and al-Qaeda since November 2019.

“Privacy and freedom of speech are legitimate concerns, but technology companies have to take responsibility for stopping their platform being used for the open distribution of material that is used to propagate, prepare and celebrate far-right terrorist activity – including murder and sexual violence,” says Nick Lowles, chief executive of Hope not Hate.

“It’s direct calls for terrorism again, and again and again and again, over and over and over,” says Lawrence. “And yet there seems to be little to no interest from Telegram on cracking down on this hardcore Nazi content. And it’s dangerous.”

Will Bedingfield is a staff writer for WIRED. He tweets from @WillBedingfield

This article was originally published by WIRED UK