
612 points | meetpateltech | 1 comment
leonidasv:
That 1M-token context window alone is going to kill a lot of RAG use cases. Crazy to see how we went from 4K-token context windows (GPT-3.5 in ChatGPT, 2023) to 1M in less than 2 years.
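
To illustrate the point: with a window that large, a small corpus can often just be passed to the model whole instead of being chunked, embedded, and retrieved. A minimal sketch, assuming an OpenAI-compatible chat API; the model name and document paths are placeholders, not anything from the announcement:

```python
# Minimal sketch: stuff the whole corpus into the prompt instead of doing
# classic RAG (chunk -> embed -> vector store -> top-k retrieval).
# Assumes an OpenAI-compatible client; model name and paths are placeholders.
from pathlib import Path
from openai import OpenAI

client = OpenAI()

def answer_over_corpus(question: str, doc_dir: str = "docs/") -> str:
    # Concatenate every document verbatim; no retrieval step at all.
    corpus = "\n\n".join(p.read_text() for p in Path(doc_dir).glob("*.txt"))
    resp = client.chat.completions.create(
        model="long-context-model",  # placeholder for any 1M-token model
        messages=[
            {"role": "system", "content": "Answer using only the provided documents."},
            {"role": "user", "content": f"{corpus}\n\nQuestion: {question}"},
        ],
    )
    return resp.choices[0].message.content
```

Whether you should actually fill the window to the brim is a separate question, raised below.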
monsieurbanana:
Maybe someone knows: what's the usual recommendation regarding big context windows? Is it safe to use them to the max, or will performance degrade, so we should adapt the maximum to our use case?