It's just simple validation with some error logging, handled the same way as input from humans or any other source that goes into your system. An LLM provides input to your system like any human would, so you have to validate it. Something like Pydantic or Django forms is a good fit for this.
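For example, here's a minimal sketch with Pydantic (v2 API; the TicketSummary schema and its fields are hypothetical placeholders, not anything the original mentions):

    import logging

    from pydantic import BaseModel, ValidationError

    # Hypothetical schema for whatever structured output you ask the LLM for
    class TicketSummary(BaseModel):
        title: str
        priority: int
        tags: list[str]

    def parse_llm_output(raw: str) -> TicketSummary | None:
        # Treat the model's output like any untrusted user input:
        # validate it and log failures instead of letting bad data through.
        try:
            return TicketSummary.model_validate_json(raw)
        except ValidationError as exc:
            logging.error("LLM output failed validation: %s", exc)
            return None

If validation fails you can retry, fall back, or surface an error, same as you would for a malformed form submission.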