
How harmful content is evading detection on popular video gaming sites

Credit: Unsplash/CC0 Public Domain

New research published in the journal Frontiers in Psychology reveals how extremist groups are exploiting the popularity of video games to recruit and radicalize impressionable users.

The study shows that gaming-adjacent platforms, which allow users to chat while playing, are being used as “digital playgrounds” for extremist activity, and that video game players are being deliberately “funneled” by extremists from mainstream platforms to these sites, in part because of the challenges of moderating them.

The research was carried out by Dr. William Allchorn and Dr. Elisa Orofino, senior research fellows at Anglia Ruskin University’s International Policing and Public Protection Research Institute (IPPPRI), and includes interviews with platform content moderators, tech industry experts and those involved in preventing and countering extremism.

It found that far-right extremism is the most common ideology shared on these gaming-adjacent platforms. This includes content promoting neo-Nazism and anti-Semitism, often accompanied by misogyny, racism, homophobia and conspiracy theories, including references to QAnon.

Islamist extremism was also reported, though less frequently, alongside “extremist-adjacent” material such as the glorification of school shootings—all content that violates the terms of service of mainstream platforms but often evades detection.

The study explains that hyper-masculine gaming titles, such as first-person shooter games, have particular appeal to extremists, and highlights how the unique nature of online gaming brings together strangers with a common interest.

After initial contact, funneling takes place where interactions move to the less regulated gaming-adjacent platforms, providing an environment where extremists can socialize, share propaganda and subtly recruit.

One interviewee in the study explained how grooming might start: “That’s where you have matchmaking. It’s where you can build a quick rapport with people. But that’s the stuff that very quickly moves to adjacent platforms, where there’s sort of less monitoring.”

A recurring concern among participants was the danger of younger users coming under the influence of extremist influencers, who combine live game streaming with extremist narratives.

Participants highlighted that law enforcement needs to better understand how these platforms and their subcultures operate, and also emphasized the importance of educating parents, teachers and children about the risks of online radicalization.

Moderators who took part in the study expressed frustration at inconsistent enforcement policies on their platforms and the burden of deciding whether content or users should be reported to local law enforcement agencies.

Although in-game chat itself is unmoderated, moderators still report being overwhelmed by the volume and complexity of harmful content, including hidden symbols used to circumvent banned-word filters.

AI tools are being used to assist with moderation, but they struggle to interpret memes and ambiguous or sarcastic language. Phrases such as “I’m going to kill you” may be common in gameplay but difficult for automated systems to interpret in context.

Co-author of the study, Dr. William Allchorn, Senior Research Fellow at Anglia Ruskin University (ARU), said, “These gaming-adjacent platforms offer extremists direct access to large, often young and impressionable audiences and they have become a key tool for extremist recruitment.

“Social media platforms have attracted most of the attention of lawmakers and regulators over the last decade, but these platforms have largely flown under the radar, while at the same time becoming digital playgrounds for extremists to exploit.

“The nature of radicalization and the dissemination of extremist content is not confined to any single platform and our research identified a widespread lack of effective detection and reporting tools.

“Many users don’t know how to report extremist content, and even when they do, they often feel their concerns aren’t taken seriously. Strengthening moderation systems, both AI and human, is essential, as is updating platform policies to address content that is harmful but technically lawful. Decisive action works and platforms can be doing more to help curb the spread of extremism.”

More information: Policing Extremism on Gaming Adjacent Platforms: Awful but Lawful?, Frontiers in Psychology (2025). DOI: 10.3389/fpsyg.2025.1537460

Citation: How harmful content is evading detection on popular video gaming sites (2025, July 31) retrieved 31 July 2025 from https://phys.org/news/2025-07-content-evading-popular-video-gaming.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
