1. tudorg No.44508136
Another way to mitigate this is to have the agents work only on an anonymized copy of the data. Assuming the anonymization step removes or replaces all sensitive values, then whatever the AI agent does, it won't be disastrous.

The anonymization can be done with tools like pgstream or pg_anonymizer. Combined with copy-on-write branching, you can create safe environments on the fly for AI agents: they get access to data that looks and behaves like production, but isn't quite production data.
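For illustration, here is a minimal sketch of the static-masking approach with the PostgreSQL Anonymizer (anon) extension, driven from Python with psycopg. The connection string and the users table/columns are made up for the example, and the exact setup (e.g. preloading the extension) varies by anon version:

    import psycopg  # psycopg 3; any Postgres driver works the same way

    # DSN for the *copy* (e.g. a copy-on-write branch), never for
    # production itself. Name and credentials are hypothetical.
    BRANCH_DSN = "postgresql://agent:secret@localhost:5432/branch_db"

    # Static masking rules declared as PostgreSQL Anonymizer security
    # labels. The users.email / users.name / users.ssn columns are
    # invented for this sketch.
    MASKING_RULES = [
        "SECURITY LABEL FOR anon ON COLUMN users.email"
        " IS 'MASKED WITH FUNCTION anon.fake_email()'",
        "SECURITY LABEL FOR anon ON COLUMN users.name"
        " IS 'MASKED WITH FUNCTION anon.fake_last_name()'",
        # Keep only the last 4 digits: partial(col, prefix, padding, suffix)
        "SECURITY LABEL FOR anon ON COLUMN users.ssn"
        " IS 'MASKED WITH FUNCTION anon.partial(ssn, 0, ''***-**-'', 4)'",
    ]

    def anonymize_branch(dsn: str) -> None:
        with psycopg.connect(dsn) as conn:
            with conn.cursor() as cur:
                cur.execute("CREATE EXTENSION IF NOT EXISTS anon CASCADE")
                cur.execute("SELECT anon.init()")  # load the fake-data sets
                for rule in MASKING_RULES:
                    cur.execute(rule)
                # Rewrite the masked columns in place. This is destructive,
                # which is exactly why it must run on a disposable copy.
                cur.execute("SELECT anon.anonymize_database()")
            conn.commit()

    if __name__ == "__main__":
        anonymize_branch(BRANCH_DSN)

The key design point is that anon.anonymize_database() rewrites the data in place, so it only makes sense on a throwaway branch; the agent is then handed credentials for that branch and nothing else.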