Thursday, August 7, 2025

Far-right extremists recruit over gaming platforms, researchers say

A pair of U.K. researchers has identified video games as key targets for recruitment by far-right extremists, in part because they say such platforms are so difficult to police.

Published Thursday, a paper from the International Policing and Public Protection Research Institute (IPPPRI) examined the political discourse circulating on video-game streaming platforms and third-party messaging apps.

In interviews with 13 content moderators, industry experts and members of anti-extremism organizations, researchers heard that far-right messaging was the most commonly spread, including neo-Nazism, attacks on women and on racialized and LGBTQ2S+ communities, and conspiracy theories like QAnon.

“These gaming-adjacent platforms offer extremists direct access to large, often young and impressionable audiences,” co-author William Allchorn said in a release. “They have become a key tool for extremist recruitment.”

In March of 2024, Canada’s federal government announced new funding to research political indoctrination in online gaming, including how communities built out of games can “create environments conducive to radicalization to violent extremism.”

“There is clear evidence that extremist actors are active in these spaces,” reads a report published later that year by the international Extremism and Gaming Research Network, one of the recipients of the funding.

Last August, the RCMP issued a safety advisory to parents about the dangers of online extremists who target children, including in video games and related communities.

“We ask parents, guardians and adults in positions of authority to keep an eye out for indicators that a child or youth is being targeted or exploited,” the release reads.

“Your report could be the missing piece to preventing more harm.”

Problems with prevention

The IPPPRI researchers note that indoctrination can begin within the games, where opportunities are plentiful for a political agitator in search of recruits.

Multiplayer matchmaking offers a steady flow of new contacts, shared interest in the game helps build rapport quickly, and for some titles, including the ever-popular first-person shooters, "hyper-masculine" themes present a bridge to discussing more radical thinking, the paper notes.

Inappropriate content is often filtered out of in-game messages and players can easily get removed from a given match for violating the rules, but those chats are sometimes only used at the beginning of the recruitment process. Once a target has been drawn in, researchers say the conversation is prone to move to third-party chat groups, where extremist propaganda flows more freely — a process known as “content funnelling.”

“These platforms have largely flown under the radar, while at the same time becoming digital playgrounds for extremists to exploit,” Allchorn said. “Our research identified a widespread lack of effective detection and reporting tools.”

Interviewees said that alongside the far-right messaging were other forms of fringe politics, including Islamist extremism, as well as what they call "extremist-adjacent" material, such as the "glorification" of mass shootings at schools.

Though much of this kind of content violates platforms' terms of service, enforcement often relies on users reporting one another for banned conduct, something Allchorn said some users don't even know how to do.

Efforts to employ AI to automatically respond to extremist content are complicated, the analysis found. Violent words, for example, may simply refer to what's happening in the games, and the highly contextual dialect of memes, irony and inside jokes found in online communities can easily confuse automated tools.

Even keeping up with the growing list of alternative terms and symbols used to circumvent language filters is a constant challenge, interviewees told researchers.

“Strengthening moderation systems, both AI and human, is essential, as is updating platform policies to address content that is harmful but technically lawful,” Allchorn said. “Decisive action works and platforms can be doing more to help curb the spread of extremism.”
