2 points by rule2025 | 3 comments

The Codex backend is of good quality and the frontend is average, but most importantly it is too slow. I wonder if OpenAI will improve it.
1. esafak ◴[] No.45624749[source]
Sonnet and Gemini are good and fast. Can't speak for Grok.
2. moomoo11 ◴[] No.45625588[source]
It seems to work with fewer issues than CC Opus.

I don’t mind if it takes longer as long as the answer is correct more often.

You can always be doing other work while one chat is running.