
213 points by Philpax | 1 comment
elieb44 No.42170747
How about context encoding more generally? Are there techniques to do that? E.g., during training, I want the string "Dubito ergo cogito, cogito ergo sum, sum ergo Deus est." to have René Descartes embedded as its main author, 1637 as its date of writing, and "Discours de la méthode" as its global context of writing.

So that when the model is later trained on another part of the same book, it can learn that both passages came from the same context.
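
(For concreteness, a training record carrying that metadata might look like the Python sketch below; the field names are hypothetical, made up just to illustrate the idea.)

    # Hypothetical training record: the raw string plus the contextual
    # metadata the model should learn to associate with it.
    example = {
        "text": "Dubito ergo cogito, cogito ergo sum, sum ergo Deus est.",
        "author": "René Descartes",
        "year": 1637,
        "work": "Discours de la méthode",
    }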

replies(1): >>42173624 #
1. jmmcd No.42173624
This is a good idea! To my knowledge the answer is that no one does this; we just use the simplest, stupidest possible method, which is to concatenate all the text in the world. That is during training, of course. At runtime, there is the system prompt.
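
(Roughly this, as a minimal sketch — the tokenize function and the EOS token id are stand-ins for whatever tokenizer is actually in use.)

    # Standard pretraining data prep: tokenize every document, concatenate
    # them into one long token stream separated by an end-of-text token,
    # then slice the stream into fixed-length training windows.
    # Authorship, date, and source are all discarded at this step.
    EOS = 0  # stand-in end-of-text token id

    def pack(documents, tokenize, window=2048):
        stream = []
        for doc in documents:
            stream.extend(tokenize(doc))
            stream.append(EOS)
        return [stream[i:i + window] for i in range(0, len(stream), window)]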

The second-simplest method might indeed use something like a system prompt with metadata like that, injected before the current window of text. But what would happen at runtime, when that metadata is not present? Performance would probably be much worse.
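
(One sketch of that idea, reusing the hypothetical record fields from the question above: randomly dropping the metadata header during training exposes the model to both conditions, so it should degrade less when the header is absent at runtime. This is an assumption about how one might do it, not an established recipe.)

    import random

    def format_example(record, drop_prob=0.5):
        """Prepend a metadata header to the training text, but omit it
        for a random fraction of examples so the model also learns to
        model bare text, mirroring runtime where no metadata exists."""
        if random.random() < drop_prob:
            return record["text"]
        header = (f"[author: {record['author']} | year: {record['year']} | "
                  f"work: {record['work']}]\n")
        return header + record["text"]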