584 points Alifatisk | 3 comments
dmix No.46182877
> The Transformer architecture revolutionized sequence modeling with its introduction of attention, a mechanism by which models look back at earlier inputs to prioritize relevant input data

I've always wanted to read how something like Cursor manages memory. It seems to have developed a long history of all of my prompts and understands both the codebase and what I'm building slightly better over time, causing fewer errors.
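
For reference, the "attention" in that quote boils down to something like this minimal NumPy sketch: plain self-attention with no masking or multiple heads, and all the names here are just illustrative, not anything from a real codebase.

    import numpy as np

    def softmax(x, axis=-1):
        # subtract the max for numerical stability before exponentiating
        x = x - x.max(axis=axis, keepdims=True)
        e = np.exp(x)
        return e / e.sum(axis=axis, keepdims=True)

    def scaled_dot_product_attention(Q, K, V):
        # Q, K, V: (seq_len, d) arrays of query/key/value vectors
        d = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d)       # how strongly each position attends to each input
        weights = softmax(scores, axis=-1)  # each row is a distribution over the context
        return weights @ V                  # weighted mix of values, i.e. "looking back"

    # toy example: 5 tokens with 8-dimensional representations
    rng = np.random.default_rng(0)
    x = rng.normal(size=(5, 8))
    out = scaled_dot_product_attention(x, x, x)  # self-attention: Q, K, V from the same inputs
    print(out.shape)  # (5, 8)
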

replies(1): >>46182957 #
1. russdill No.46182957
That's not what they are talking about here. This is just a description of what goes on with a transformer and the context window.
replies(1): >>46183117 #
2. dmix No.46183117
Ah, so 'long-term memory' in this case is just a really large context window holding a long series of user inputs. That makes sense.
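
Purely as a sketch of that idea (not how Cursor actually does it, and the token budget is made up): every turn gets appended to a history, and only the most recent turns that still fit under the budget are sent back in as context, so older turns silently fall off.

    # rolling context window as "memory"; names and limits are illustrative only
    MAX_CONTEXT_TOKENS = 128_000

    def count_tokens(text: str) -> int:
        # crude stand-in for a real tokenizer
        return len(text.split())

    history: list[str] = []

    def build_prompt(new_message: str) -> str:
        history.append(new_message)
        # walk backwards from the newest turn, keeping what fits in the window
        kept, used = [], 0
        for turn in reversed(history):
            used += count_tokens(turn)
            if used > MAX_CONTEXT_TOKENS:
                break
            kept.append(turn)
        return "\n".join(reversed(kept))
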
replies(1): >>46183381 #
3. No.46183381