
612 points | meetpateltech | 1 comment
leonidasv (No.42952286)
That 1M-token context window alone is going to kill a lot of RAG use cases. Crazy to see how we went from 4K-token context windows (GPT-3.5 in 2023) to 1M in less than 2 years.
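
To make the tradeoff concrete, here is a minimal sketch, assuming a hypothetical llm_complete helper and a toy word-overlap relevance score (neither comes from the thread): a small window forces you to chunk and retrieve, while a 1M-token window can often take the documents verbatim.

    # Illustrative only: contrasts a small-window RAG-style prompt with a
    # long-context prompt. `llm_complete` is a hypothetical stand-in for
    # whatever completion API you actually call.

    def llm_complete(prompt: str) -> str:
        raise NotImplementedError("plug in your model client here")

    def rag_prompt(question: str, documents: list[str], budget_chars: int = 16_000) -> str:
        # 4K-token era: split the corpus into chunks, rank them by a toy
        # relevance score, and keep only what fits the window.
        chunks = [c for doc in documents for c in doc.split("\n\n") if c.strip()]
        q_words = set(question.lower().split())
        chunks.sort(key=lambda c: len(q_words & set(c.lower().split())), reverse=True)
        kept, used = [], 0
        for c in chunks:
            if used + len(c) > budget_chars:
                break
            kept.append(c)
            used += len(c)
        return "Context:\n" + "\n---\n".join(kept) + "\n\nQuestion: " + question

    def long_context_prompt(question: str, documents: list[str]) -> str:
        # 1M-token era: skip retrieval entirely and pass the documents verbatim.
        return "Documents:\n\n" + "\n\n".join(documents) + "\n\nQuestion: " + question

Cost, latency, and corpora larger than the window still argue for retrieval, which is presumably why the claim is "a lot of" RAG use cases rather than all of them.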
xnx (No.42958220)
> That 1M-token context window

Gemini 2.0 Pro already offers a 2M-token context window: https://deepmind.google/technologies/gemini/pro/