Today, we’re excited to launch our Unreal AI Runtime SDK: the first unified solution for game developers building real-time, interactive AI experiences. No more stitching together multiple AI plugins, managing separate provider APIs, or spending all your time fighting AI infrastructure instead of building your game.
With the Unreal AI Runtime, you can now:
- Start working immediately with foundational AI building blocks like speech-to-text (STT), text-to-speech (TTS), and LLMs
- Get access to hundreds of models across various model providers with a single API key
- Easily create AI pipelines, such as a speech-to-speech pipeline, with an intuitive visual graph editor
- Leverage pre-built templates for common use cases like AI NPCs and chatbots
- Manage and optimize costs, latency, and quality through built-in observability and experimentation tools
The Unreal AI Runtime SDK is live now and available for download. We are also launching our Unity AI Runtime SDK for early access.
Why we built this
Inworld has been building for game developers since day one — starting with lifelike, interactive characters that could form relationships, express emotions, and respond naturally through voice. Along the way, we saw the biggest challenges developers faced when building engaging, real-time AI experiences:
- Keeping up with dozens of models and providers - Delivering rich, multimodal AI often requires testing dozens of different models. Beyond integrating each new provider, that means juggling separate APIs, rate limits, and billing. The Unreal AI Runtime SDK unifies access to all major model providers under a single API key, so switching models is as easy as selecting from a dropdown.
- Easily customizing without reinventing the wheel - Every game has unique creative and technical needs. With pre-built templates (including our Character Template) and a modular design, you can customize or extend for your specific use case while starting from a production-ready, pre-optimized foundation. Plus, our intuitive visual graph editor makes it easy for anyone (writers, designers, or developers) to customize the logic, prompts, and behavior that define your AI interaction.
- Debugging non-deterministic AI responses - AI systems can behave unpredictably, and debugging the root cause is notoriously hard. The Unreal AI Runtime's built-in observability tools, including logs, traces, and dashboards, make it easy to trace every step of your AI pipeline, understand latency, and pinpoint exactly where an issue occurred.
- Managing costs at scale - Scaling AI to millions of players can get expensive. The Unreal AI Runtime helps manage those costs with dashboards that provide a unified view of your cost drivers, experimentation tools that make it easy to test more efficient models or orchestrations, and support for running local models where it makes sense.
The Unreal AI Runtime was built to address these challenges for game developers, so you can spend more time building your game instead of building AI infrastructure.
What you can build
Engaging, conversational NPCs
Developer: Inworld team
SDK: Unreal AI Runtime
Motivation: Building a believable, real-time character is hard. Low-latency turn-taking and natural interruptions are notoriously difficult to get right. We also wanted to support MetaHumans and lip sync driven by our streamed audio. Then there’s always the question of balancing LLM latency with quality: how do we ensure the dialogue is engaging and relevant without being too slow or expensive? We designed our Character, MetaHuman, and Lipsync templates to handle all of this out of the box, with countless hours spent optimizing for performance, realism, and responsiveness. It’s the fastest way to build a fully interactive, voice-driven character in Unreal.
Generative Survivor-like game
Developer: Brian Cox, Shuang Liang
SDK: Unity AI Runtime
Motivation: I wanted to challenge myself to create a fully generative game. While today’s AI still struggles to build an entire game from scratch, I approached the problem using data-driven development. I created a core game template (in this case, a Survivor-like experience) and set up internal asset libraries covering 3D models, animations, VFX, music, fonts, and more. Each asset is tagged with metadata describing its visual and thematic attributes.
Using the Unity AI Runtime SDK, I built a conversational system where I can speak to an NPC (Merlin) and describe what type of game I want: player character, enemy types, starting weapon, environment, skybox, and so on. This voice input triggers a series of LLM service calls, each using specialized prompts to search the asset libraries and select the most fitting options.
Once the AI has chosen the final assets, the configuration is stored in a ScriptableObject. When the game returns to the main menu, the ScriptableObject is parsed and all required data is extracted to dynamically generate a customized Survivor-like game on the fly.
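To make the data-driven selection step concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the real project uses the Unity AI Runtime's LLM service calls with specialized prompts, while this sketch stands in a simple tag-overlap scorer, and a plain dictionary plays the role of the ScriptableObject configuration.

```python
from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    category: str   # e.g. "enemy", "weapon", "skybox"
    tags: set[str]  # visual/thematic metadata attached to the asset

# Stand-in for the LLM service call: rate how well an asset matches the
# player's spoken request by counting overlapping metadata tags.
def score(asset: Asset, request_keywords: set[str]) -> int:
    return len(asset.tags & request_keywords)

# Search one asset library category and select the best-fitting option.
def pick(assets: list[Asset], category: str, request_keywords: set[str]) -> Asset:
    candidates = [a for a in assets if a.category == category]
    return max(candidates, key=lambda a: score(a, request_keywords))

library = [
    Asset("skeleton_warrior", "enemy", {"undead", "fantasy", "melee"}),
    Asset("laser_drone", "enemy", {"sci-fi", "robot", "ranged"}),
    Asset("flaming_sword", "weapon", {"fantasy", "fire", "melee"}),
    Asset("plasma_rifle", "weapon", {"sci-fi", "ranged"}),
]

# Spoken request: "I want a dark fantasy game with undead melee enemies"
keywords = {"fantasy", "undead", "melee"}

# The resulting config is the ScriptableObject stand-in: parsed at the
# main menu to generate the customized Survivor-like run on the fly.
config = {
    "enemy": pick(library, "enemy", keywords).name,
    "weapon": pick(library, "weapon", keywords).name,
}
```

In the shipped version, each `pick` would be one of the specialized LLM calls described above, but the overall shape (tagged library in, per-category selection out, serialized config at the end) is the same.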
AI-powered decision making
Developer: Braeden Warnick
SDK: Unreal AI Runtime
Motivation: While other AI-powered NPCs can obey commands based on what they can or can't do (like opening locked doors), I wanted to explore an NPC companion that decides what it should do based on its character info. This also illustrates how you can adapt a custom Graph to use an LLM as an evaluator in place of any scoring function in any game system (e.g., procedural content generation, NPC decision making, game AI directors).
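The evaluator idea can be sketched as a drop-in swap: any system that ranks candidates with a scoring function can accept an LLM-backed scorer with the same signature. This is a hypothetical Python sketch, not the Unreal Graph API; `llm_complete` is an assumed stand-in for a call into the runtime's LLM node.

```python
from typing import Callable

# Any game system that ranks candidate actions via a pluggable scorer.
ScoreFn = Callable[[str], float]

def choose_action(candidates: list[str], score: ScoreFn) -> str:
    return max(candidates, key=score)

# Conventional hand-tuned scorer.
def heuristic_score(action: str) -> float:
    weights = {"attack": 0.2, "heal ally": 0.9, "flee": 0.1}
    return weights.get(action, 0.0)

# LLM-backed scorer with the same signature: the prompt embeds the
# companion's character info and asks the model to rate the action.
def llm_score(action: str) -> float:
    prompt = (
        "You are a loyal healer who avoids violence. "
        f"On a 0-1 scale, how strongly should you '{action}'? Reply with a number."
    )
    return float(llm_complete(prompt))  # hypothetical LLM-node call

candidates = ["attack", "heal ally", "flee"]
best = choose_action(candidates, heuristic_score)  # swap in llm_score in a Graph
```

Because both scorers share the `ScoreFn` signature, swapping the heuristic for the LLM evaluator changes the companion's behavior without touching the decision-making system itself.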
Get started with Inworld Runtime
To get started building today:
We’re excited to see what games and experiences you bring to life!