why does vibe coding still involve any code at all? why can't an AI control the registers of a computer processor and graphics card directly, driving the computer itself? why can't it draw on the screen directly, wired straight to the rows and columns of an LCD panel? what if an AI agent were implemented in hardware, with a processor for the AI, a normal processor for logic, and a processor that correlates UI elements to touches on the screen? plus a network card, some RAM for temporary stuff like UI elements, and persistent storage for vectors that represent UI elements and past conversations
I'm not sure this makes sense as a question. Registers are 'controlled' by executing code; every instruction that runs changes processor state. An AI can write code that changes registers, as all code does when it runs. It can't 'control registers' in any other way, just as you or I can't.
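To make that concrete, here's a minimal sketch (GCC/Clang extended inline asm on x86-64; the syntax is compiler-specific, so treat it as illustrative rather than portable). Even "putting a value in a register" only happens because the CPU executes an instruction that says to:

```c
#include <stdio.h>

int main(void) {
    unsigned long value;
    /* The CPU loads 42 into a register only by running this mov
       instruction -- there is no side channel for "setting a register"
       from outside the instruction stream. */
    __asm__ volatile ("mov $42, %0" : "=r"(value));
    printf("%lu\n", value); /* prints 42 */
    return 0;
}
```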
I would like to make an AI agent that interfaces with the processor directly, setting bits in its registers and eliminating the need for assembly or any other kind of code. The only software you would ever need would be the AI itself.
That's called a JIT compiler. And ignoring how bad an idea blending those two would be... it wouldn't be that difficult a task.
The hardest part of a JIT is the safety aspect, and AI already violates most of that.
It's not a JIT. A JIT produces machine code. You can't "set registers" or do anything useful without machine code executing on the processor.
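And notice that "setting registers without code" collapses into exactly what a JIT does anyway. A minimal sketch (POSIX on x86-64; assumes the OS grants a writable+executable page, which hardened W^X systems will refuse): the only way to make the CPU do anything is to put machine code bytes somewhere and jump to them.

```c
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

int main(void) {
    /* x86-64 machine code for:  mov eax, 42 ; ret */
    unsigned char code[] = { 0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3 };

    /* Ask the OS for one page of readable, writable, executable memory. */
    void *page = mmap(NULL, 4096, PROT_READ | PROT_WRITE | PROT_EXEC,
                      MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (page == MAP_FAILED) return 1;

    memcpy(page, code, sizeof code);

    /* Jump into the freshly written bytes: eax gets "set" only because
       the CPU executes the mov we placed there. */
    int (*fn)(void) = (int (*)(void))page;
    printf("%d\n", fn()); /* prints 42 */

    munmap(page, 4096);
    return 0;
}
```

Whatever the AI wants the processor to do, it has to express it as bytes like those; that's the code it supposedly eliminated.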