GPT-5.2

(openai.com)
1053 points atgctg | 7 comments
1. xd1936 ◴[] No.46235269[source]
> While GPT‑5.2 will work well out of the box in Codex, we expect to release a version of GPT‑5.2 optimized for Codex in the coming weeks.

https://openai.com/index/introducing-gpt-5-2/

replies(2): >>46235378 #>>46241425 #
2. jstummbillig ◴[] No.46235378[source]
> For coding tasks, GPT-5.1-Codex-Max is a faster, more capable, and more token-efficient coding variant

Hm, yeah, strange. You would not be able to tell from any chart on the page. Obviously not a gotcha, since they put it on the page themselves, but how does that square with those benchmarks?

replies(3): >>46235489 #>>46239957 #>>46241609 #
3. tempaccount420 ◴[] No.46235489[source]
Coding requires a mindset shift that the -codex fine-tunes provide. Codex will do all kinds of weird stuff, like poking around in your ~/.cargo, ~/go, etc. to find docs and trying out code in isolation; these things definitely improve capability.
replies(1): >>46236235 #
4. dmos62 ◴[] No.46236235{3}[source]
The biggest advantage of the codex variants, for me, is terseness and reduced sycophancy. That, and presumably better adherence to requested output formats.
5. deaux ◴[] No.46239957[source]
Looks like they removed that line.
6. k_bx ◴[] No.46241425[source]
gpt-5.2 is already available in Codex at this moment.
7. baq ◴[] No.46241609[source]
Codex talks much less than the standard variant, especially between tool calls.