@PerogiBoi @jherazob It would be interesting, but it would require a lot of development to make sure the NPCs either don't know about spoiler information that may break the plot, or don't just hallucinate answers, which may mislead the player.
“How do you get through the haunted forest?” “You need X item to get through the haunted forest.” “Are you sure?” “Yes, that's how heroes get through the forest.” Meanwhile the item in question doesn't even exist in the game, or has no bearing on the quest.
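One way to mitigate this pitfall is to ground the model's output against the game's actual data before showing it to the player. Below is a minimal sketch of that idea; everything here (`KNOWN_ITEMS`, `ground_reply`, the fallback line, and the assumption that the item names mentioned in a reply have already been extracted by some structured-output step) is hypothetical, not taken from any shipped game:

```python
# Hypothetical sketch: check an LLM NPC reply against the game's real
# item registry, so the NPC can't recommend items that don't exist.
# All names here are made up for illustration.

KNOWN_ITEMS = {"silver lantern", "warding charm"}

def ground_reply(reply: str, mentioned_items: list[str]) -> str:
    """Return the reply only if every item it mentions really exists;
    otherwise fall back to a safe scripted line."""
    for item in mentioned_items:
        if item.lower() not in KNOWN_ITEMS:
            # Hallucinated item: swap in a scripted non-answer.
            return "I'm not sure, traveller. Ask the innkeeper about the forest."
    return reply

# A hallucinated item gets replaced by the scripted fallback:
print(ground_reply("You need the moon amulet.", ["moon amulet"]))
# A real item passes through unchanged:
print(ground_reply("Carry the warding charm.", ["warding charm"]))
```

The same filtering idea could cover spoilers: keep plot-critical facts out of the NPC's context entirely, rather than trusting the model not to reveal them.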
You might be interested in https://inworld.ai/origins, a detective game where all the characters can be interviewed in natural language and respond with AI. They seem to be doing a pretty good job so far.
@Ferk @jherazob @PerogiBoi Good point. An unreliable narrator is one thing, but it could harm game enjoyment, especially if it's unintended or harmful. It's one thing for a character to retell the history of a region with a bias, mis-remember events, or lie because lying is in their nature ("evil character"), but it would get annoying if every character could convincingly make up unhelpful rubbish, or spoil a plot twist in the game.
@Ferk @jherazob @PerogiBoi I'm not arguing against LLMs or conversational AI in games; they could really breathe life into a game if your choices could draw organic responses. But these tools have a lot of pitfalls that scripted responses don't, and the dev team will need to be aware of them to avoid unintended consequences.
Such AI integration will be separated into categories of "pre-generated" content that is "created with the help of AI tools during development" (e.g., using DALL-E for in-game images) and "live-generated" content that is "created with the help of AI tools while the game is running" (e.g., using Nvidia's AI-powered NPC technology).
Both are covered by the policies the article talks about, and both were arguably against the rules previously.