
443 points by wg0 | 1 comment
kwanbix ◴[] No.45903174[source]
This has happened more than once, in different scenarios.

I wonder why it is that ChatGPT (and the rest) don't put the actual response in a box that is separate from the "hello" and "goodbye" parts.

I once wrote a message to my landlord and asked ChatGPT to help me improve it (English is not my mother tongue), and I sent it with ChatGPT's "goodbye" part included by mistake.

replies(1): >>45903371 #
advisedwang ◴[] No.45903371[source]
LLMs don't have any internal concept of "actual response" vs "surrounding words". Just like they don't have an internal concept of system prompt vs user input. Just like they don't even have an internal concept of what the LLM emitted vs what was given to it! It's all just one long sequence.

(Yes, it is possible to create tokens to represent category changes, but this is still in-band: the token is just part of the sequence, and the LLM isn't guaranteed to factor it in correctly.)
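
To make that concrete, here is a rough sketch of how a chat transcript gets flattened into one sequence before the model ever sees it. The <|im_start|>/<|im_end|> markers are in the style of OpenAI's ChatML; the exact tokens vary by vendor, so treat this as illustrative rather than any particular API's internals:

    def flatten_chat(messages):
        # Everything -- system prompt, user input, prior model output --
        # is serialized into one flat string. Role boundaries are just
        # more tokens in the stream, not a separate channel.
        parts = []
        for msg in messages:
            parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>")
        parts.append("<|im_start|>assistant\n")  # the model continues from here
        return "\n".join(parts)

    prompt = flatten_chat([
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Rewrite my message to my landlord."},
    ])
    # If the user's text itself happened to contain "<|im_end|>", it would
    # land in the same stream -- nothing but training separates the layers.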

replies(1): >>45903624 #
kwanbix ◴[] No.45903624[source]
Thanks. So how is it possible for the chatbot to box a Python response? Can't they use the same technique?
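
(As far as I understand it, the box is the same in-band technique: the model emits markdown ``` fences in its output stream, and the chat UI renders whatever falls between them as a code box, so the boxing happens in the frontend, not in the model. A rough sketch of that rendering step is below; split_code_blocks is a made-up helper and real UIs use a full markdown parser, but the idea is the same:)

    import re

    def split_code_blocks(text):
        # The model's output is plain text; the "box" exists only in the UI.
        # Anything between ``` fences is treated as a code block -- the same
        # in-band trick, which is why the model can also get this wrong.
        pattern = re.compile(r"```(\w*)\n(.*?)```", re.DOTALL)
        pos, parts = 0, []
        for m in pattern.finditer(text):
            if m.start() > pos:
                parts.append(("prose", text[pos:m.start()]))
            parts.append(("code", m.group(2)))
            pos = m.end()
        if pos < len(text):
            parts.append(("prose", text[pos:]))
        return parts

    reply = "Sure! Here's a script:\n```python\nprint('hi')\n```\nHope that helps!"
    for kind, chunk in split_code_blocks(reply):
        print(kind, repr(chunk))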