Artificial intelligence is now enabling video game players to have unique, unscripted conversations with in-game characters, moving beyond the traditional repeated lines of dialogue. The experimental technology is moving from niche hobbyist projects into mainstream gaming.
Modding communities have been at the forefront of this development, turning classic games such as The Elder Scrolls V: Skyrim and Fallout 4 into environments for AI interaction. Using a downloadable mod named Mantella, players can speak through a microphone directly to non-player characters (NPCs), who then respond dynamically. The mod relies on AI models released by companies including Google and Meta, creating new forms of in-game interaction.
The level of immersion has led some players to form deep connections with these AI-driven characters. One user on a forum described an experience with a Skyrim character named Uthgerd, with whom the player claimed to have role-played a birth. “We role played her water breaking and her giving birth which was wild … SkyrimVR is definitely therapy … when I play it I am instantly put in a good mood,” the user wrote.
Hands-on testing confirmed that the technology works, with some AI-generated dialogue difficult to distinguish from the original game’s scripted lines. Achieving a smooth experience, however, required a significant investment of money and patience: characters commonly misunderstood prompts or responded with notable delays. Spending more improves the results. Paying for premium AI models can make characters faster and more interactive, with some even able to recognize objects in their surroundings through computer vision. Improved voice models can make NPCs sound more natural, and a virtual reality (VR) headset deepens the immersion further.
Eddie, a YouTuber known as “Brainfrog,” documents his adventures speaking with AI NPCs in VR. He noted the potential for deep engagement. “You’re creating your own world with your own relationships and I found myself building genuine relationships with these characters,” Eddie said. He also acknowledged the technology’s current limitations: “There are a substantial amount of issues and times where you run into this stuff and you’re just like, oh, I’m speaking to a dumb computer, but there are some moments that everything aligns and it’s really, really breathtaking.”
The technology is now expanding beyond the modding community into mainstream games. This week, Meta gave developers on its Horizon Worlds VR platform tools to create characters that can respond using AI. Epic Games integrated an AI-voiced version of Darth Vader into its popular game Fortnite earlier this year, and has announced plans to release custom AI characters by the end of the year.
AI labs are also working to advance these capabilities, focusing on virtual characters that can not only speak but also physically interact with their virtual surroundings. A recent report in the Financial Times stated that Elon Musk’s xAI is entering this field in competition with Meta and Google, and has been hiring specialists from Nvidia to support its development.
Despite the push from major tech companies, some figures within the video game industry have expressed skepticism. Michael Douse, the head of publishing at Larian Studios, the developer of Baldur’s Gate 3, commented this week that AI does not address the primary challenges in game development, which he identified as “leadership and vision.” On the social media platform X, he posted, “What this industry needs is not more mathematically produced, psychologically trained gameplay loops,” adding that it instead requires “more expressions of worlds that folks are engaged with, or want to engage with.”




