GPT-5.2

(openai.com)
1019 points | atgctg | 1 comment
sfmike ◴[] No.46234974[source]
Everything is still based on 4o, right? Is training a new model just too expensive? Maybe they could consult the DeepSeek team about cost-constrained new models.
replies(4): >>46235000 #>>46235052 #>>46235127 #>>46235143 #
verdverm ◴[] No.46235000[source]
Apparently they have not had a successful pre-training run in 1.5 years.
replies(2): >>46235068 #>>46235299 #
fouronnes3 ◴[] No.46235068[source]
I want to read a short sci-fi story set in 2150 about how, mysteriously, no one has been able to train a better LLM for 125 years. The binary weights are studied with unbelievably advanced quantum computers, but no one can really train a new AI from scratch. This spawns cults, wars, and legends, and ultimately (by the third book) leads to the main protagonist learning to code by hand, something that no human left alive still knows how to do. Could this be the secret to making a new AI from scratch, more than a century later?
replies(6): >>46235128 #>>46235237 #>>46235306 #>>46235386 #>>46235429 #>>46235455 #
barrenko ◴[] No.46235237[source]
Monsieur, if I may offer a vaaaguely similar story on how things may progress: https://www.owlposting.com/p/a-body-most-amenable-to-experim...