I doubt Comet was using any protections beyond some tuned instructions, but one thing I learned at USENIX Security a couple weeks ago is that nobody has any idea how to deal with prompt injection in a multi-turn/agentic setting.
replies(1):
There is no generally safe way of escaping LLM input, all you can do is pray, cajole, threaten or hope.