
504 points by Terretta | 3 comments
1. gs17 (No.45065423)
Yeah, I tried it in Copilot and it's fast, but I'd rather have a 2x smarter model that takes 10x longer. The competition for "fast" is the existing autocomplete model, not the chat models.
replies(1): >>45065721 #
2. dmix (No.45065721)
Why wouldn't you want the option for both?

I haven't used Copilot in a while but Cursor lets you easily switch the model depending on what you're trying to do.

Having options for thinking, normal, and fast covers every sort of problem. GPT-5 doesn't let you choose, which IMO only makes sense for non-IDE integrations; even in ChatGPT it can be annoying to constantly get "thinking" for simple questions.

replies(1): >>45066792 #
3. gs17 (No.45066792)
I have the option for either, but it's an option I'll never choose. My issue with Copilot wasn't speed; it was quality. The only thing that has to be fast is the text-completion part, which Grok isn't replacing. The code chat/agent part needs to focus on actually being able to do things.