
GPT-5.2

(openai.com)
1019 points by atgctg | 1 comment
SkyPuncher ◴[] No.46235977[source]
Given the price increase and speculation that GPT 5 is a MoE model, I'm wondering if they're simply "turning up the good stuff" without making significant changes under the hood.
replies(2): >>46235986 #>>46236012 #
minimaxir ◴[] No.46236012[source]
I'm not sure why being a MoE model would allow OpenAI to "turn up the good stuff". You can't just increase the number of experts without training the model that way.
replies(2): >>46236953 #>>46236981 #
SkyPuncher ◴[] No.46236981[source]
My opinion is that they're internally routing requests to cheaper experts when they think they can get away with it. I felt this was evident in the wild inconsistencies I'd experience using it for coding, both in quality and latency.

You "turn of the good stuff" by eliminating or reducing the likelihood of the cheap experts handling the request.