
612 points | meetpateltech | 1 comment
leonidasv | No.42952286
That 1M-token context window alone is going to kill a lot of RAG use cases. Crazy to see how we went from 4K-token context windows (GPT-3.5 in 2023) to 1M in less than two years.
replies(6): >>42952393 >>42952519 >>42952569 >>42954277 >>42958220 >>42975332
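
A minimal sketch of the arithmetic behind the parent comment, assuming a rough 4-characters-per-token heuristic (not a real tokenizer) and a hypothetical `./docs` corpus of text files: if the whole corpus fits inside the context window, you can pass the documents directly instead of retrieving chunks.

```python
# Sketch of "does the corpus fit in the context window, or do we still need RAG?"
# The 4 chars/token ratio is a rough English-text heuristic, not a real tokenizer,
# and the window sizes are the ones mentioned in the thread.

from pathlib import Path

CHARS_PER_TOKEN = 4  # rough average; assumption, not exact


def estimate_tokens(text: str) -> int:
    """Very rough token estimate based on character count."""
    return len(text) // CHARS_PER_TOKEN


def fits_in_context(corpus_dir: str, context_window: int, reserve: int = 8_000) -> bool:
    """True if the whole corpus, plus a reserve for the prompt and the
    model's answer, is likely to fit in a single context window."""
    total = sum(
        estimate_tokens(p.read_text(errors="ignore"))
        for p in Path(corpus_dir).glob("**/*.txt")
    )
    return total + reserve <= context_window


if __name__ == "__main__":
    for window in (4_000, 1_000_000):  # 2023-era GPT-3.5 vs a 1M-token window
        ok = fits_in_context("./docs", window)
        print(f"{window:>9,} tokens: {'stuff everything in context' if ok else 'needs RAG'}")
```

The point is only about corpus size: retrieval stays useful for very large or frequently changing corpora, but many use cases that forced chunking at 4K tokens fit comfortably in 1M.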
1. Alifatisk | No.42952393
Gemini can in theory handle 10M tokens; I remember them saying that in one of their presentations.