U.S. policymakers and media are shifting their focus toward gaming and gaming-adjacent platforms like Discord, Roblox, and Steam, where young people increasingly socialize, often anonymously and out of public view.
These platforms, originally designed for gamers, have evolved into primary hubs for social discourse. As mainstream applications increasingly prioritize content engineered for virality, gaming platforms have become more appealing venues for genuine connection. That evolution has also brought new scrutiny, focused on how these closed forums can harbor hate and exploitation, problems that often remain concealed until they manifest in real-world harm.
Mainstream social media apps are structured to elevate and amplify content publicly, making them effective tools for spreading ideologies, rumors, or disinformation. Many of these ideas first take root in smaller, more private forums on gaming platforms, and extremist groups removed from mainstream services have found new homes in these spaces. Unlike on public-facing apps, users of gaming platforms are accustomed to operating under pseudonyms, facilitating the anonymous exchange of radical and taboo ideas.
Mariana Olaizola Rosenblat, a policy advisor on tech and law at NYU Stern, notes that the architecture of these gaming-focused platforms is a key factor in the proliferation of dangerous content. “Extremists and predators go to these gaming spaces to find highly engaged, susceptible young people, many of whom are yearning for connection,” she says.
The smaller, private chat rooms where harmful conversations develop are typically sealed off from outside observers. “Most researchers are basically blind to all of this. You can’t enter these rooms,” Rosenblat observes. Users also leverage the gaming context, employing “gamespeak” to disguise extremist or dangerous concepts, blurring the distinction between role-playing and real-world intent. While the platforms themselves have technical access to this content, the sheer volume makes monitoring a significant challenge. Rosenblat notes that most have not invested sufficiently in safeguards or moderation resources to protect young users.
To be sure, the majority of conversations on these platforms are ordinary, ranging from gaming discussions and study groups to sports fandoms and neighborhood community forums. Nevertheless, these environments have also become fertile ground for radicalization and exploitation, with a growing number of incidents bringing the issue to light.
Discord is facing renewed scrutiny after the suspect in the murder of Charlie Kirk appeared to confess within a Discord chat. The platform was also used by organizers of the 2017 Unite the Right rally in Charlottesville to coordinate logistics, including carpools and lodging. In a separate incident, the shooter who killed 10 people in a Black neighborhood of Buffalo in 2022 documented months of his planning in a private Discord chat. Additionally, a 2018 investigation by The Daily Beast uncovered hundreds of instances of revenge porn being shared across Discord servers.
Roblox, a platform marketed directly to children, has drawn sharp criticism for the sexual, predatory, and extremist content that appears on its service. The company is currently facing multiple lawsuits. One lawsuit, filed by the state of Louisiana, alleges that Roblox failed to protect children. Another was filed by an Iowa family after their 13-year-old daughter was kidnapped, trafficked, and raped by a predator she met on the platform.
A Roblox spokesperson provided a statement regarding its safety practices: “While we cannot comment on claims raised in litigation, at Roblox, we strive to hold ourselves to the highest safety standards. We invest significant resources in advanced safety technology, including a combination of machine learning and human moderation teams working 24/7 to detect and address inappropriate content and behavior.”
The 2022 Buffalo shooting was livestreamed on Twitch, the Amazon-owned gaming platform. Twitch condemned the attack, removed the video quickly, and stated that it was working closely with law enforcement to investigate the incident.
In 2021, researchers found that the platform Steam had become a networking space for far-right ideologies, hosting groups that promoted neo-Nazi organizations.
Following the killing of Charlie Kirk, gaming platforms are receiving increased attention from U.S. policymakers. House Oversight Chair James Comer (R-Ky.) has asked the CEOs of Discord, Steam, Twitch, and Reddit to testify before Congress on October 8 about the issue of user radicalization on their platforms.