
GPT-5.2 (openai.com)
1019 points by atgctg | 4 comments
1. scottndecker No.46235829
Still 256K input tokens. So disappointing (predictable, but disappointing).
replies(2): >>46237050 >>46239554
2. htrp No.46237050
It's much harder to train models on longer context inputs.
3. coder543 No.46239554
https://platform.openai.com/docs/models/gpt-5.2

400k, not 256k.

replies(1): >>46241323
4. nathants No.46241323
400 - 128 = 272. The 400K context window minus the 128K reserved for output leaves roughly 272K for input, per the Codex CLI source.
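
A minimal sketch of that arithmetic, assuming (as the thread suggests) a 400K total context window and a 128K output reservation; the constant names are illustrative, not taken from the Codex CLI source:

    # Effective input budget when part of the context window is reserved
    # for the model's output. Figures are from this thread: 400K total
    # context (per the OpenAI docs) and 128K held back for output
    # (per the Codex CLI source).
    TOTAL_CONTEXT_TOKENS = 400_000
    RESERVED_OUTPUT_TOKENS = 128_000

    max_input_tokens = TOTAL_CONTEXT_TOKENS - RESERVED_OUTPUT_TOKENS
    print(max_input_tokens)  # 272000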