dScryb, the Toronto-based publisher of descriptive boxed text, has launched an experimental multimodal smart search engine designed to improve how gamemasters navigate its extensive library of human-authored content. Developed in collaboration with Andrew Zhu, a PhD researcher at the University of Pennsylvania, the system moves away from traditional keyword matching in favour of semantic search, which identifies results based on meaning, mood, and context.

The new engine, currently live within the Opus web application, utilises a Joint Embedding Space to convert text, audio, and visual media into mathematical vectors. Because results are matched on meaning rather than exact wording, a search for the term “jittery” can now successfully surface audio files such as “Mineshaft Elevator” or “Eldritch Combat,” even if those files do not contain the word “jittery” in their metadata. Unlike generative AI models that create synthetic assets, this implementation uses machine learning to index and retrieve dScryb’s catalogue of over 16,000 human-written scenes and 8,400 sound files.
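dScryb has not published the internals of its engine, but the general idea behind a joint embedding space is straightforward: every asset, regardless of medium, is encoded as a vector, and a query is matched by vector similarity rather than keyword overlap. The sketch below illustrates this with hypothetical four-dimensional embeddings and invented asset names for the query scenario described above; a production system would use a multimodal encoder producing hundreds of dimensions.

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings: each asset, whether text or audio, lives in the
# same vector space. The values here are illustrative, not real data.
library = {
    "Mineshaft Elevator (audio)": [0.9, 0.1, 0.3, 0.0],
    "Eldritch Combat (audio)":    [0.8, 0.2, 0.1, 0.1],
    "Calm Forest Glade (scene)":  [0.0, 0.9, 0.1, 0.2],
}

def search(query_vec, top_k=2):
    """Rank library assets by similarity to the embedded query."""
    ranked = sorted(library.items(),
                    key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_k]]

# A query like "jittery" would first be embedded into the same space by the
# encoder; we stand in a plausible query vector here.
print(search([0.85, 0.1, 0.2, 0.05]))
```

Note that neither audio file needs the word “jittery” anywhere in its metadata; proximity in the embedding space alone determines the ranking.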
Andrew Zhu, known in the roleplaying games industry as the lead developer of the Avrae Discord bot, brings significant expertise in Natural Language Processing to the project. His involvement signals a continued focus on integrating complex computational research into practical tabletop tools. While the search is experimental and currently undergoing A/B testing, dScryb has maintained a “legacy search” toggle to ensure users can still access the library through traditional methods during the refinement period.
The search upgrade is intended to reduce the fruitless scrolling often associated with massive digital libraries. By allowing gamemasters to describe a desired atmosphere in natural language rather than recall exact titles, the tool aims to cut the time spent in menus during active play sessions. dScryb has indicated that while the search is currently limited to Opus, it is expected to roll out across the wider site in the coming months.
The developers are currently soliciting user feedback via “thumbs up” and “thumbs down” interactions to refine the relevance of the results as development continues.
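dScryb has not said how the thumbs-up/down signals feed back into ranking. One common approach, sketched here purely as an illustration with invented names and weights, is to accumulate per-result feedback and use it to nudge similarity scores at ranking time:

```python
from collections import defaultdict

# Hypothetical feedback store: net thumbs-up (+1) / thumbs-down (-1) count
# per result. A real system would persist this and likely aggregate per query.
feedback = defaultdict(int)

def record_feedback(result_id, thumbs_up):
    feedback[result_id] += 1 if thumbs_up else -1

def rerank(results, weight=0.1):
    """Re-order (result_id, similarity) pairs, nudged by accumulated feedback.

    The weighting scheme is an illustrative heuristic, not dScryb's method.
    """
    return sorted(results,
                  key=lambda r: r[1] + weight * feedback[r[0]],
                  reverse=True)

results = [("Mineshaft Elevator", 0.82), ("Creaky Dock", 0.80)]
record_feedback("Creaky Dock", thumbs_up=True)
record_feedback("Creaky Dock", thumbs_up=True)
record_feedback("Mineshaft Elevator", thumbs_up=False)
print(rerank(results))
```

Here two positive votes lift the slightly lower-scoring result above a down-voted one, which is the behaviour a relevance-feedback loop is meant to produce.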