There are some significant issues with it at the moment. One is that you have to train on vast swathes of text to get an LLM, and it's difficult to remove things after the fact. If you cooperate with the AI and stay "in Skyrim" with what you say to the NPCs, it works out OK, but if you don't cooperate, it becomes clear that Skyrim NPCs know something about Taylor Swift and Fox News, just to name two examples. LLMs in their current form basically can't solve this.
The LLMs are also prone to writing checks the game can't cash. It's neat that the NPCs started talking about a perfectly plausible dungeon adventure they went on in a location that doesn't exist but "felt" perfectly Skyrim-esque, but there are clearly some non-optimal aspects to that too. And again, this is basically not solvable with LLMs as they are currently constituted.
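To make the problem concrete, about the best you can do today is bolt a validation layer onto the output. Something like the purely hypothetical Python sketch below (the location list and the name-matching heuristic are both made up for illustration) can flag a dungeon name that isn't on the map, but it can't tell you that the adventure itself never happened, which is the actual issue.

```python
import re

# Hypothetical subset of Skyrim's real location names. A mod would pull
# these from the game's data files rather than hard-coding them.
KNOWN_LOCATIONS = {
    "Bleak Falls Barrow",
    "Whiterun",
    "Riverwood",
    "High Hrothgar",
}

def find_unknown_locations(npc_line: str) -> list[str]:
    """Flag phrases that look like place names but aren't in the game's
    location list. Crude heuristic: capitalized words ending in a
    dungeon-ish suffix (e.g. "Frostgloom Cavern")."""
    candidates = re.findall(
        r"(?:[A-Z][a-z]+\s){1,3}(?:Barrow|Cavern|Keep|Hollow|Mine)", npc_line
    )
    return [c.strip() for c in candidates if c.strip() not in KNOWN_LOCATIONS]

npc_line = "We barely escaped Frostgloom Cavern with the draugr on our heels."
print(find_unknown_locations(npc_line))  # ['Frostgloom Cavern']
```

Even with the real game data wired in, the model can still invent the quest, the enemies, and the loot; a filter like this just stops it from naming a dungeon that doesn't exist.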
Really slick experiences with this will, I think, require a generational change in AI technology. The Mantella mod is fun and all, but it would be hard to sell at scale right now as a gaming experience.
I didn't go into it in detail, but it isn't even that I got the NPCs to start babbling about Taylor Swift. What it was was just that they knew she was a musician and, as such, might be at the tavern. That's very hard to remove.
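For illustration, the obvious fix is to filter the obvious strings, roughly the hypothetical sketch below (the blocklist and function names are made up), and it shows exactly why that doesn't work: you can catch "Taylor Swift" in the output, but you can't catch the model quietly reasoning that a famous singer might be playing at the tavern tonight.

```python
# Hypothetical, naive post-filter: reject a reply if it names anything
# obviously out-of-world. This only catches surface strings, not the
# underlying associations the model learned from its training data.
BLOCKLIST = {"taylor swift", "fox news", "iphone"}

def is_in_world(reply: str) -> bool:
    lowered = reply.lower()
    return not any(term in lowered for term in BLOCKLIST)

reply = "Aye, the famous songstress might be at the tavern tonight."
print(is_in_world(reply))  # True -- the leaked knowledge sails right through
```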