Roblox Corporation is under a microscope at the moment. Media reports indicate that the user known as A.S.S primarily, if not exclusively, created maps and experiences recreating widely known, deadly real-life mass shootings. In the wake of the tragedies in Uvalde, Parkland, and Columbine, such content underscores urgent safety questions, and public opinion is increasingly critical of its presence on a platform built for children and young teens.
According to numerous media reports, A.S.S created a Roblox experience that is a near replica of the 2022 Uvalde shooting. The game features NPCs that resemble minors and are even scripted to hide under desks when they hear simulated gunfire. The disturbing nature of this recreation has rightfully sparked anger among advocacy organizations, which are calling on the platform to take stronger moderation action.
A.S.S has struggled to keep its maps available on Roblox, as the company quickly removes them once they are reported or flagged by its internal moderation tools. Sadly, even with ongoing advocacy, A.S.S has evaded enforcement by moving its games onto private Roblox servers. These servers operate behind a paywall, making them inaccessible to the general public, and are coordinated through an active Discord channel of more than 500 members. This community allows A.S.S to continue spreading its inflammatory material.
“Carbine” may be A.S.S’s most provocative game yet. It recreates the 1999 Columbine High School massacre, but rather than simply simulating the tragedy, it puts players in the roles of the shooters themselves, allowing them to reenact the deadly scene. Carbine quickly picked up steam, drawing more than 1,000 visits before it was banned from Roblox for violating the platform’s community standards and terms of service.
The production of such games raises troubling ethical questions about the limits of gaming subject matter. According to the Anti-Defamation League (ADL), “Because such games recreate extremist-related mass shootings—such as the 2022 Buffalo shooting, in which the perpetrator deliberately targeted and killed Black people—they may serve as a gateway to extremist content.” This adds to fears that these experiences risk normalizing violence while desensitizing young users to real-world tragedies.
A.S.S’s games not only breach Roblox’s community standards but also highlight a growing trend of users creating content that mirrors societal violence instead of fostering creativity and positive interaction. Platforms have moved swiftly to address the problem: after being alerted by the ADL, Discord shut down one of A.S.S’s servers, which had grown to more than 800 members. Yet the persistent availability of this kind of content creates a continuous and dangerous drain on moderation resources.
Roblox offers robust game-development technology that enables users such as A.S.S to quickly build and publish immersive experiences on the platform. The same in-game tools that empower legitimate creators to produce and share content rapidly also allow bad actors to do so. To prevent a repeat of these incidents, Roblox needs to significantly expand its moderation resources.