Trigger Embodied AI NPC Actions from Speech?
Hello! I’m building with embodied AI NPCs in Horizon Worlds and trying to figure out how to actually trigger in-world actions based on what the NPC says or agrees to.
Example: if a player says “eat the pie” and the NPC agrees, how do I know, in code, that the NPC accepted and should actually do something (play an animation, remove the item, update state, etc.)? Right now it feels like the LLM just talks without exposing any way to tie the conversation to game logic.
The best solution I’ve come up with is calling an external LLM API like ChatGPT that parses the conversation and returns structured JSON such as { action: "eat", object: "pie" }, which I can then handle in code. But that feels clunky and disconnected from the built-in tools.
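For concreteness, here’s roughly what my parsing/dispatch layer looks like. It’s a minimal plain-TypeScript sketch: the actual network call to the LLM is omitted (I’m not sure what’s officially supported for outbound requests), and all the type and function names here are my own placeholders, not anything from the Horizon APIs.

```typescript
// Hypothetical shape of the structured reply I ask the external LLM for.
interface NpcAction {
  action: string;  // e.g. "eat", "dance", "none"
  object?: string; // e.g. "pie"
}

// Parse the model's text output defensively: LLMs sometimes wrap the JSON
// in prose or code fences, so extract the first {...} block before parsing.
function parseNpcAction(raw: string): NpcAction | null {
  const match = raw.match(/\{[\s\S]*\}/);
  if (!match) return null;
  try {
    const parsed = JSON.parse(match[0]);
    if (typeof parsed.action !== "string") return null;
    return { action: parsed.action, object: parsed.object };
  } catch {
    return null;
  }
}

// Map validated actions onto game logic. These handlers are placeholders
// for whatever your world actually does (play an animation, despawn the
// pie entity, update a score variable, ...).
function handleNpcAction(a: NpcAction): void {
  switch (a.action) {
    case "eat":
      console.log(`NPC eats the ${a.object ?? "item"}`); // e.g. despawn item + animation
      break;
    case "dance":
      console.log("NPC dances"); // e.g. trigger dance animation
      break;
    default:
      console.log("No actionable intent; NPC just talks");
  }
}

// Usage: feed the raw LLM reply through the parser, act only on valid JSON.
const reply = 'Sure! {"action": "eat", "object": "pie"}';
const action = parseNpcAction(reply);
if (action) handleNpcAction(action);
```

The defensive extraction and validation matter because the model occasionally wraps the JSON in prose or returns something malformed, and you don’t want that throwing inside your world script.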
A very similar problem: I’d also like to offer players AI-generated multiple-choice responses, like [Eat pie] [Dance] [Say hello], and then have the NPC respond or take action based on the player’s selection (sketch below). But again, I’m not sure how to cleanly bridge that interaction with the in-world logic.
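Same idea for the multiple-choice case: I prompt the model for a JSON array of options and route the player’s pick through the same dispatcher, so free-form speech and button selection share one code path. Again, this is a rough sketch with made-up names, not an official API:

```typescript
// Hypothetical shape of one AI-generated choice: a label for the
// button plus the structured action to run if the player picks it.
interface NpcChoice {
  label: string;   // shown to the player, e.g. "Eat pie"
  action: string;  // machine-readable tag, e.g. "eat"
  object?: string; // optional target, e.g. "pie"
}

// Parse a JSON array of choices out of the model's reply, again
// defensively, since the model may add surrounding prose.
function parseChoices(raw: string): NpcChoice[] {
  const match = raw.match(/\[[\s\S]*\]/);
  if (!match) return [];
  try {
    const parsed = JSON.parse(match[0]);
    return Array.isArray(parsed)
      ? parsed.filter(
          (c) => typeof c.label === "string" && typeof c.action === "string"
        )
      : [];
  } catch {
    return [];
  }
}

// When the player picks a choice, route it through the same action
// handler used for free-form speech (handleNpcAction from the sketch
// above), so both paths converge on one piece of game logic.
function onPlayerChoice(choice: NpcChoice): void {
  console.log(`Player chose "${choice.label}" -> action ${choice.action}`);
  // handleNpcAction({ action: choice.action, object: choice.object });
}

// Usage: render the labels as buttons, dispatch on selection.
const llmReply =
  '[{"label": "Eat pie", "action": "eat", "object": "pie"},' +
  ' {"label": "Dance", "action": "dance"},' +
  ' {"label": "Say hello", "action": "greet"}]';
const choices = parseChoices(llmReply);
choices.forEach((c, i) => console.log(`${i + 1}. ${c.label}`));
onPlayerChoice(choices[0]);
```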
I know the game Profit or Perish must be doing something like this (e.g. when they give or remove points based on NPC interactions), so maybe there’s a system for this already?
Would love to hear how other devs are handling this or if there's anything official I'm missing.
Thanks!
Ana
As far as I know, this isn’t currently possible.