The high-walled gardens of Silicon Valley are attempting to banish the inhabitants of the dungeon. Recent leaks have revealed that OpenAI, the titan of artificial intelligence, has issued explicit “system instructions” to its Codex programming model, strictly forbidding the mention of goblins, ogres, and other creatures synonymous with roleplaying games.
The instructions, highlighted by Mike Pearl at Gizmodo, demand that the AI “never talk about goblins, gremlins, raccoons, trolls, ogres, [or] pigeons” unless they are “unambiguously relevant” to a user’s query. The directive marks a deliberate move to scrub the “noise” of fantasy tropes, and of wider roleplaying games culture, from the output of professional coding tools.

The Legal Defence: Dodging the Dragons
One speculated motivation for this specific purge is a pragmatic, legal one. As the AI industry faces a barrage of copyright lawsuits, OpenAI is likely keen to avoid any output that could be construed as “memorised” content from protected intellectual property.
Goblins and ogres are the bread and butter of Dungeons & Dragons, owned by Hasbro’s Wizards of the Coast. By hard-coding a prohibition against these entities, OpenAI creates a digital firewall. If the model cannot mention a goblin, it cannot accidentally reproduce a stat block from the Monster Manual or a verbatim description from works controlled by the Tolkien Estate. In an era where “fair use” is being litigated in the highest courts, the “no creature” policy is a pre-emptive strike against the lawyers of the tabletop world.
Corporate Elitism: High-Signal Only
Beyond the courtroom, there is a distinct whiff of Silicon Valley elitism. The leaked text explicitly tells the model to “provide the highest-signal context instead of describing everything exhaustively.”
For a tool marketed to blue-chip corporations and enterprise developers, the vocabulary of roleplaying games is viewed as “hallucination-prone” fluff. By banning “goblins” and “gremlins,” OpenAI is signalling that Codex is a serious tool for serious business, distancing itself from the “toy” reputation of early generative models. They are effectively telling the model: “Do not sound like a geek; sound like a software engineer.”
The Roleplaying Games AI Divide
This sanitisation comes at a time when the industry is deeply divided over the technology. AI in roleplaying games is currently a red-hot topic, with many persuasive voices arguing that generative tools strip the human soul from collaborative storytelling.
Despite the pushback, the market is moving fast. Independent developers are already building specialised tools that embrace the very creatures OpenAI seeks to ignore. Mat Luz, a developer who built a system to automate game mastering, has demonstrated that AI can successfully navigate complex narratives if built with the right constraints.
Mat Luz wrote in a post on Dev.to:

“You push open the iron door. The hinges scream. Inside, the throne room stretches wide – cracked marble, toppled pillars, and at the far end, an ogre chieftain sitting on a chair made of shields. He looks up. He doesn’t seem surprised.”

You: I draw my sword and step forward.

The GM calls encounter_start, rolls initiative, places you in the Entrance zone and the ogre in the Throne Room. Combat begins.
That’s not a scripted demo. It’s a real session – an AI running a tabletop RPG campaign through a terminal, using the same kind of tools a human GM would: dice, character sheets, initiative trackers, story notes. Except these tools are MCP endpoints, and the GM is Claude.
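The tool-calling loop Luz describes can be sketched in a few lines of Python. This is a minimal stand-in, not his actual implementation: the function names (roll_dice, encounter_start), the dice notation, and the zone names are illustrative assumptions, with plain functions standing in for the MCP endpoints the model would call.

```python
import random

def roll_dice(notation: str, rng=random) -> int:
    """Roll dice in 'NdS' notation, e.g. '1d20' or '2d6' (assumed format)."""
    count, sides = (int(x) for x in notation.lower().split("d"))
    return sum(rng.randint(1, sides) for _ in range(count))

def encounter_start(combatants: dict) -> list:
    """Roll initiative for each combatant and return names in turn order.

    `combatants` maps a name to its starting zone, mirroring the session
    above, where the player begins in the Entrance and the ogre in the
    Throne Room.
    """
    initiative = {name: roll_dice("1d20") for name in combatants}
    return sorted(initiative, key=initiative.get, reverse=True)

# Hypothetical opening state for the encounter described in the transcript.
zones = {"player": "Entrance", "ogre chieftain": "Throne Room"}
turn_order = encounter_start(zones)
```

In the real setup these would be MCP endpoints the language model invokes as tools, so the dice rolls and trackers live outside the model rather than in its hallucination-prone text output.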
The irony is palpable: while indie developers are teaching AI how to run complex, imaginative campaigns, the creators of those AI models are desperately trying to forget that goblins ever existed.
An Aside: The Pigeon Problem
While goblins and ogres have clear fantasy ties, the inclusion of “pigeons” in the ban list is a peculiar outlier. However, tech historians may recall that over 20 years ago, Google famously claimed its search engine was powered by “PigeonRank”, a silly hoax involving avian-driven data sorting. It is possible that Codex, trained on decades of tech-industry lore, “hallucinated” that pigeons were a legitimate part of search architecture, forcing OpenAI to manually scrub the birds from the code to avoid repeating an April Fool’s joke as technical fact.
Creative Commons image credit: Alexander-Werner-Jr’s Goblin’s Pike – The Goblin Tribe.