
GPT-5.2

(openai.com)
1019 points by atgctg | 4 comments
sfmike ◴[] No.46234974[source]
Everything is still based on 4o, right? Is training a new model just too expensive? Maybe they could consult the DeepSeek team about cost-constrained new models.
replies(4): >>46235000 #>>46235052 #>>46235127 #>>46235143 #
1. elgatolopez ◴[] No.46235127[source]
Where did you get that from? The cutoff date says August 2025, so it looks like a newly pretrained model.
replies(2): >>46235406 #>>46235471 #
2. SparkyMcUnicorn ◴[] No.46235406[source]
If the pretraining rumors are true, they're probably using continued pretraining on the older weights. Right?
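For what it's worth, "continued pretraining" conceptually just means resuming optimization from the existing checkpoint on newer data instead of reinitializing. A minimal PyTorch-style sketch, with a tiny stand-in model, random placeholder data, and a hypothetical checkpoint name:

    # Continued pretraining in miniature: resume from prior weights,
    # keep training on a newer corpus instead of starting from scratch.
    import torch
    import torch.nn as nn

    VOCAB = 1000
    # Tiny stand-in for a frontier LM: embed tokens, predict the next one.
    model = nn.Sequential(nn.Embedding(VOCAB, 64), nn.Linear(64, VOCAB))

    # Pretend this checkpoint came from the earlier pretraining run.
    torch.save(model.state_dict(), "older_run.pt")
    model.load_state_dict(torch.load("older_run.pt"))

    # Low learning rate so the new data refines rather than overwrites.
    opt = torch.optim.AdamW(model.parameters(), lr=1e-5)
    loss_fn = nn.CrossEntropyLoss()

    # Placeholder for the post-cutoff corpus: random token batches.
    batch = torch.randint(0, VOCAB, (8, 32))

    for step in range(3):
        logits = model(batch[:, :-1])  # next-token prediction
        loss = loss_fn(logits.reshape(-1, VOCAB), batch[:, 1:].reshape(-1))
        opt.zero_grad()
        loss.backward()
        opt.step()

The loop itself is the easy part; at frontier scale the hard problems are curating the new data and keeping optimization stable, which is presumably where the rumored failed runs come in.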
replies(1): >>46236213 #
3. FergusArgyll ◴[] No.46235471[source]
> This stands in sharp contrast to rivals: OpenAI’s leading researchers have not completed a successful full-scale pre-training run that was broadly deployed for a new frontier model since GPT-4o in May 2024, highlighting the significant technical hurdle that Google’s TPU fleet has managed to overcome.

- https://newsletter.semianalysis.com/p/tpuv7-google-takes-a-s...

It's also plainly obvious from using it. The "broadly deployed" qualifier is presumably referring to GPT-4.5.

4. ◴[] No.46236213[source]