
224 points azhenley | 5 comments
1. armcat ◴[] No.45075670[source]
When you use ChatGPT and it executes code, e.g. when you tell it to do something with a CSV file, it seems to run in a VM with certain tools and libraries available to it and sandboxed disk access, though no internet access. So it's kind of already there.
replies(3): >>45076177 #>>45076186 #>>45079490 #
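You can get a rough picture of that environment by asking the code tool to run a small probe. A minimal sketch, assuming the commonly reported `/mnt/data` mount; none of this is a documented interface:

```python
# Hypothetical probe to run inside ChatGPT's code tool: lists some preinstalled
# packages, checks the sandboxed file mount, and confirms outbound network is blocked.
import importlib.metadata
import os
import socket

# Some of the libraries preinstalled in the sandbox image
names = sorted({d.metadata["Name"] for d in importlib.metadata.distributions() if d.metadata["Name"]})
print(len(names), "packages, e.g.:", names[:10])

# Sandboxed disk: uploaded files typically appear under /mnt/data (assumption)
print(os.listdir("/mnt/data") if os.path.isdir("/mnt/data") else "no /mnt/data mount")

# No internet: an outbound connection attempt should fail or time out
try:
    socket.create_connection(("example.com", 80), timeout=3)
    print("network reachable")
except OSError as exc:
    print("network blocked:", exc)
```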
2. Ancapistani ◴[] No.45076177[source]
That’s also how Devin and OpenHands work.

The agent running in a VM - at least by default - was a key feature during the AI pilot I ran a few months ago.

3. hhh ◴[] No.45076186[source]
It runs in Kubernetes, on AKS (Azure Kubernetes Service), with some gVisor sandboxing on top. Through Jupyter, if I recall.
replies(1): >>45076520 #
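If that description is accurate, the setup would look roughly like a pod pinned to a gVisor RuntimeClass with a Jupyter kernel inside. A hedged sketch using the official `kubernetes` Python client; the RuntimeClass name, image, and namespace are assumptions, not anything OpenAI has published:

```python
# Sketch: launching a gVisor-sandboxed Jupyter pod on a Kubernetes cluster (e.g. AKS).
# The "gvisor" RuntimeClass and the jupyter/base-notebook image are stand-ins.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when running in-cluster

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="code-interpreter-sandbox"),
    spec=client.V1PodSpec(
        runtime_class_name="gvisor",            # route the pod through gVisor (runsc)
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="kernel",
                image="jupyter/base-notebook",  # stand-in for the real sandbox image
                command=["start-notebook.sh", "--NotebookApp.token="],
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```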
4. energy123 ◴[] No.45076520[source]
`to=python.exec` is how it runs Python code and `to=container.exec` is how it runs bash commands, with attached files showing up in `/mnt/data`. Unfortunately the stdout is heavily truncated before being shown to the model, so it's not a hack for longer context via printing a file attachment's contents.
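That truncation is presumably a cap applied by the tool harness before the output re-enters the model's context. A rough sketch of the idea; the `MAX_CHARS` limit, marker text, and file path are made up for illustration:

```python
# Sketch of a tool harness that truncates stdout before returning it to the model.
import subprocess

MAX_CHARS = 4_000  # hypothetical cap on what the model gets to see

def run_for_model(cmd: list[str]) -> str:
    result = subprocess.run(cmd, capture_output=True, text=True)
    out = result.stdout
    if len(out) > MAX_CHARS:
        out = out[:MAX_CHARS] + f"\n... [truncated {len(result.stdout) - MAX_CHARS} chars]"
    return out

# Printing a large file therefore doesn't smuggle its full contents into context:
print(run_for_model(["cat", "/mnt/data/big.csv"]))  # hypothetical attached file
```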
5. valenterry ◴[] No.45079490[source]
Not really.

Now imagine you run two AIs (like ChatGPT) on your machine or on a server. Maybe you even want them to cooperate on something. How do you do that? Right, you can't: there is no standard, no interoperability, nothing.
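Today that "cooperation" is whatever glue you write yourself. A minimal sketch of what that hand-rolled relay tends to look like, assuming two locally hosted OpenAI-compatible chat endpoints; the URLs and model names are invented:

```python
# Sketch: without a shared agent-to-agent protocol, cooperation between two
# locally hosted models is ad-hoc glue like this.
import requests

AGENTS = {
    "planner": ("http://localhost:8001/v1/chat/completions", "model-a"),
    "critic":  ("http://localhost:8002/v1/chat/completions", "model-b"),
}

def ask(agent: str, message: str) -> str:
    url, model = AGENTS[agent]
    resp = requests.post(url, json={
        "model": model,
        "messages": [{"role": "user", "content": message}],
    }, timeout=60)
    return resp.json()["choices"][0]["message"]["content"]

# Relay a few turns by hand; each side only sees what we choose to forward.
msg = "Propose a plan for cleaning a messy CSV."
for _ in range(3):
    plan = ask("planner", msg)
    msg = ask("critic", f"Critique and improve this plan:\n{plan}")
print(msg)
```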