Researchers from Anglia Ruskin University have sounded the alarm on “gaming-adjacent platforms” including Discord, Twitch, and Steam being used as “digital playgrounds” to funnel new recruits into far-right and other extremist ideologies, with extremists particularly targeting users who show an interest in “hyper-masculine gaming titles.”
“These gaming-adjacent platforms offer extremists direct access to large, often young and impressionable audiences and they have become a key tool for extremist recruitment,” said co-author Dr William Allchorn, a senior research fellow at the university.
“Social media platforms have attracted most of the attention of lawmakers and regulators over the last decade, but these platforms have largely flown under the radar, while at the same time becoming digital playgrounds for extremists to exploit. The nature of radicalization and the dissemination of extremist content is not confined to any single platform and our research identified a widespread lack of effective detection and reporting tools.”
Allchorn, along with Dr Elisa Orofino, co-author and colleague at Anglia Ruskin’s International Policing and Public Protection Research Institute (IPPPRI), found that far-right extremism – including content that promotes white supremacy, neo-Nazism, antisemitism, misogyny, racism, homophobia, and conspiracy theories including but not limited to QAnon – was the most common, with Islamist extremism and what the researchers call “extremist-adjacent material,” including the glorification of school shootings, less common but still notable.
Unsurprisingly, these materials were usually found alongside “hyper-masculine gaming titles” – first-person shooters in particular. Extremists target those interested in these games, the researchers found, then begin to “funnel” them to less-regulated platforms.
“That’s where you have matchmaking,” an interviewee in the study said about gaming-adjacent social platforms. “It’s where you can build quick rapport with people. But that’s the stuff that very quickly moves to adjacent platforms, where there’s sort of less monitoring.”
It’s not a new problem, but little progress is being made. A 2023 report penned by researchers from the Psychological Defence Research Institute at Lund University, supported by the Swedish Psychological Defence Agency, identified six key ways extremists were making use of games and related technology: reframing reality, including the gamification of real-life situations; projecting authority, including data harvesting; psychographic targeting using that harvested data; hacking and phishing to gain access to third-party systems; spreading social propaganda through in-game and game-adjacent social features; and deploying interactive propaganda, in which game-related social communities are used to radicalize and mobilize players for the extremists’ own ends.
“Compared with social media, the gaming domain has insufficient policies and mechanisms to cope with information influence campaigns,” said co-author Jesper Falkheimer, foreshadowing the same conclusion reached by Allchorn and Orofino. “Nor are there sufficient avenues for researchers, journalists, and the industry itself to better understand the degree to which gaming platforms are currently being exploited. In other words, not only do we not know how serious the situation is, we also lack the means to find out.”
On potential ways to address the issue, Allchorn concluded: “Many users don’t know how to report extremist content, and even when they do, they often feel their concerns aren’t taken seriously.
“Strengthening moderation systems, both AI and human, is essential, as is updating platform policies to address content that is harmful but technically lawful. Decisive action works and platforms can be doing more to help curb the spread of extremism.”
The full study is available under open-access terms in the journal Frontiers in Psychology.
Neither Discord, Valve, nor Twitch responded to requests for comment. ®