
858 points cryptophreak | 2 comments
themanmaran No.42935503
I'm surprised that the article (and comments) haven't mentioned Cursor.

Agreed that copy-pasting context in and out of ChatGPT isn't the fastest workflow. But Cursor has been a major speedup in the way I write code. It's still primarily a chat interface, but with a few QOL hacks that make it way faster:

1. Output gets applied to your file as a git-style diff, so you can approve or deny each change (rough sketch of that flow below the list).

2. It (kinda) has context of your codebase, so you don't have to spell everything out, though it works best when you explicitly tag files ("Use the utils from @src/utils/currency.ts").

3. Directly inserting terminal logs or type errors into the chat is incredibly convenient: just hover over the error and click "add to chat".
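
To make point 1 concrete, here's a minimal sketch of what a diff-then-approve flow can look like. This is not Cursor's actual implementation; it just assumes the "diff" (jsdiff) npm package and a model response that already contains the proposed file contents, and every name in it is illustrative:

    // Hypothetical sketch, not Cursor's real implementation: show a model's
    // proposed file contents as a unified diff and only write it on approval.
    // Assumes the "diff" (jsdiff) npm package; names and paths are illustrative.
    import { readFileSync, writeFileSync } from "node:fs";
    import { createInterface } from "node:readline/promises";
    import { createPatch, applyPatch } from "diff";

    async function reviewAndApply(filePath: string, proposed: string): Promise<void> {
      const original = readFileSync(filePath, "utf8");
      const patch = createPatch(filePath, original, proposed); // unified-diff text

      console.log(patch); // show the git-diff-style change for review

      const rl = createInterface({ input: process.stdin, output: process.stdout });
      const answer = await rl.question("Apply this change? (y/n) ");
      rl.close();

      if (answer.trim().toLowerCase() === "y") {
        const patched = applyPatch(original, patch); // false if the patch no longer applies
        if (patched !== false) writeFileSync(filePath, patched);
      }
      // on "n" (or a stale patch) nothing is written
    }

The real thing obviously handles partial hunks, multiple files, and editor integration, but the core loop is just: diff, show, confirm, write.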

replies(8): >>42935579, >>42935604, >>42935621, >>42935766, >>42935845, >>42937616, >>42938713, >>42939579
1. mholm No.42935621
Yeah, the OP has a great idea, but models as-is can't handle that kind of workflow reliably. The article is both a year behind and a year ahead at the same time. You have to iterate with the chatbot, and you can't do that with a top-down "here's a list of all the features, get going, ping me when finished" prompt. AI is a junior engineer, so you have to treat it like one: look through its chat logs, and sometimes back up to a restore point and head in a different direction.
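
(No tool exposes it exactly like this, but a "restore point" is basically a snapshot of the message history; a toy sketch with made-up types and names:)

    // Toy sketch of "restore points" for a chat session: snapshot the message
    // history before a risky instruction, roll back if it goes sideways.
    // All types and names here are invented for illustration.
    type Message = { role: "user" | "assistant"; content: string };

    class ChatSession {
      private messages: Message[] = [];
      private snapshots = new Map<string, Message[]>();

      add(message: Message): void {
        this.messages.push(message);
      }

      saveRestorePoint(name: string): void {
        this.snapshots.set(name, [...this.messages]); // messages treated as immutable
      }

      rollback(name: string): void {
        const snapshot = this.snapshots.get(name);
        if (snapshot) this.messages = [...snapshot];
      }
    }

    const session = new ChatSession();
    session.add({ role: "user", content: "Refactor the currency utils." });
    session.saveRestorePoint("before-refactor");
    // ...the model wanders off in the wrong direction...
    session.rollback("before-refactor"); // back up and try a different prompt
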
replies(1): >>42935833
2. mttrms No.42935833
I've started using Zed on a side project, and I really appreciate that you can easily manipulate the chat/context and keep making requests:

https://zed.dev/docs/assistant/assistant-panel#editing-a-con...

It's still a "chat", but at the end of the day it's just text, so you can edit it as you see fit to refine your context and get better responses.
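
For example (made-up "## role" markers, not Zed's actual transcript format), "just text" means you can round-trip the conversation through an editable buffer:

    // Rough illustration of "it's just text": serialize the transcript into an
    // editable buffer and parse it back before re-sending. The "## role"
    // markers are invented here, not Zed's real format.
    type Message = { role: "user" | "assistant"; content: string };

    function toEditableText(messages: Message[]): string {
      return messages.map((m) => `## ${m.role}\n${m.content}`).join("\n\n");
    }

    function fromEditableText(text: string): Message[] {
      return text
        .split(/^## /m)
        .filter((block) => block.trim().length > 0)
        .map((block): Message => {
          const [header, ...body] = block.split("\n");
          const role = header.trim() === "assistant" ? "assistant" : "user";
          return { role, content: body.join("\n").trim() };
        });
    }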