Prime Minister Jacinda Ardern with French President Emmanuel Macron, at the Christchurch Call summit in Paris in May.
By Distinguished Professor Paul Spoonley
The latest far right terrorist attack in Halle, Germany, has confirmed several things. Firstly, that the Christchurch mosque shootings have provided something of a model for other white supremacists around the world.
Following on from Anders Breivik in Norway, and Christchurch, the aim is mass casualties among target groups. Also, as with both Breivik and Christchurch, a manifesto is provided that employs white supremacist and ultra-nationalist rhetoric to justify what is about to happen. As the Halle shooter noted, he wanted to “strengthen the morale of other suppressed whites”.
At the core of these arguments is the notion that "whites" are being replaced – demographically and in terms of national sovereignty and identity – usually, it is argued, by Muslims, although this is often associated with a conspiracy theory that it is being orchestrated by a Jewish elite. As marchers in Charlottesville chanted, "Jews will not replace us.”
Perhaps the only difference in the major attacks committed in 2019 is that the target has been different – Muslims in Christchurch, immigrants in El Paso, Jews in Pittsburgh and Halle.
And then there is the livestreaming, in the Christchurch case, onto Facebook.
The “Christchurch Call” is an attempt to seek international co-operation, involving both the major online platforms and other countries and agencies, to monitor and act against extreme racist content and violence in cyberspace. And it was great to see New Zealand’s Chief Censor acting so fast to deem the Halle shooting video objectionable. But will these actions be enough?
In the Halle case, the video was streamed on Twitch, a subsidiary of Amazon. The material was removed from Twitch after 30 minutes, having attracted about 2000 views. However, and this is where it is going to get challenging, that was not the end of the video’s circulation.
A number of smaller platforms or sub-channels then got involved and posted the material from the Halle shooter. One researcher has estimated that there have been more than 50,000 views subsequently – and presumably this is growing.
As we heard last week at a meeting in Melbourne to discuss violent extremism, hosted by Hedayah (a global centre for countering violent extremism) and Deakin University, there are new online options for the extreme right.
The decentralisation of online platforms has generated platforms that can be hosted by individuals or groups using new software, rather than relying on major platforms like YouTube, Facebook or Twitter. The Pittsburgh shooter used Gab; others use Telegram, but there are many more. One estimate is that there are at least 150 catering to far right groups and ideologies, with 100 of those established this year.
Platforms such as Telegram and Gab claim to be free speech sites which do not censor the material being posted and which resist any sort of external intervention or regulation.
The point is that while these sub-channels often have small audiences and reach, they are part of the online ecosystem that allows extremist groups to recruit, and to circulate their ideology and tactics internationally. And they are not subject to moderation or regulation. As one expert in Melbourne noted, they are “takedown resistant”.
This year has confirmed that Christchurch has provided something of a model for other extremists. It was not a one-off.
Secondly, tactics and options are changing for extremists. At the moment, when major platforms like Facebook are doing more to manage content and as countries and agencies such as the European Court of Justice impose new requirements, the challenge is going to be to manage self-hosted and dispersed sites that cater specifically for extremist groups and activists.
There is growing evidence that what happens online has real-world consequences. Research by Karsten Müller and Carlo Schwarz has shown that there is a real-time correlation between an increase in racist and hate speech on Twitter and hate crimes directed at religious and ethnic minorities. Equally, when there are internet outages in countries like Germany or the USA, the number of hate crimes goes down.
In the week after the Christchurch mosque shootings, the UK saw a spike in hate crimes, with 95 recorded; 85 of these referenced the Christchurch shootings. The El Paso shooter praised what happened in Christchurch.
The online ecosystem that encourages racial and religious vilification, and provides both rhetoric and tactics, is proving difficult to counter. It is becoming more difficult as the far right migrates from using major platforms to those that are sympathetic to their cause – and which are not easily subject to regulation or removal.
Distinguished Professor Paul Spoonley is from the College of Humanities and Social Sciences at Massey University. His research focuses on white supremacy movements, racism, immigration and population.
Created: 17/10/2019 | Last updated: 17/10/2019