
1480 points sandslash | 1 comments | | HN request time: 0.215s | source
alightsoul ◴[] No.44315969[source]
why does vibe coding still involve any code at all? why can't an AI directly control the registers of a computer processor and graphics card, controlling a computer directly? why can't it draw on the screen directly, connected directly to the rows and columns of an LCD screen? what if an AI agent was implemented in hardware, with a processor for AI, a normal computer processor for logic, and a processor that correlates UI elements to touches on the screen? and a network card, some RAM for temporary stuff like UI elements, and some persistent storage for vectors that represent UI elements and past conversations
replies(4): >>44315999 #>>44316015 #>>44316024 #>>44316162 #
flumpcakes ◴[] No.44315999[source]
I'm not sure this makes sense as a question. Registers are 'controlled' by executing code: every running program changes register state as a side effect of its instructions. An AI can write code that changes registers, as all code does in operation. An AI can't directly 'control registers' in any other way, just as you or I can't.
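(A toy sketch of the point above, assuming nothing about any real ISA: in the model below, the register file only ever changes because an instruction stream is executed — there is no separate channel for "controlling registers" outside of running code.)

```python
# Toy register machine: registers change only as a side effect of
# executing instructions; there is no other way to touch them.
def run(program, registers=None):
    regs = dict.fromkeys("abcd", 0) if registers is None else registers
    for op, *args in program:
        if op == "mov":      # mov dst, imm  -- load an immediate
            dst, imm = args
            regs[dst] = imm
        elif op == "add":    # add dst, src  -- dst += src
            dst, src = args
            regs[dst] += regs[src]
        else:
            raise ValueError(f"unknown op {op!r}")
    return regs

# "Controlling" register a means nothing more than running code that writes it:
regs = run([("mov", "a", 2), ("mov", "b", 3), ("add", "a", "b")])
```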
replies(2): >>44316018 #>>44316020 #
singularity2001 ◴[] No.44316018[source]
what he means is why are the tokens not directly machine code tokens
replies(1): >>44318844 #
flumpcakes ◴[] No.44318844[source]
What is meant by a 'machine code token'? Ultimately a processor needs machine code as input to do anything — assembly is just its human-readable spelling. Registers are set by instructions. Data is read by instructions. Hardware is managed through instructions (for example by setting bits in memory-mapped registers). Either I have a complete misunderstanding of what this thread is talking about, or others are commenting with some fundamental assumptions that aren't correct.