Appreciate all the takes so far; the team is reading this thread for feedback. Feel free to pile on with bugs or feature requests; we'll all be reading them.
I'd like to just get a short response for simple things like "what's an nm and grep command to find this symbol in these 3 folders?". I use Gemini a lot for this type of thing already.
Or would that have to be a custom prompt I write?
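(To be concrete, the kind of answer I'm hoping to get back is a one-liner along these lines; the symbol name and folders here are just placeholders:)

```sh
# look for a symbol in object files / static archives under three directories
find src/ lib/ vendor/ \( -name '*.o' -o -name '*.a' \) \
  -exec nm -A --defined-only {} \; 2>/dev/null | grep my_symbol
```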
I use Charmbracelet's `mods` https://github.com/charmbracelet/mods; other people use Simon Willison's `llm` tool https://github.com/simonw/llm
Both allow you to switch between models, send short prompts from a CLI, and optionally attach some context. I prefer mods because it's an easier install and I never need to worry about Python envs and other insanity.
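Typical usage of both looks something like this (model names and prompts are just illustrative):

```sh
# simonw/llm: choose a model with -m, pipe context in on stdin
cat build.log | llm -m gpt-4o-mini "why is this link step failing?"

# charmbracelet/mods: same pattern, prompt as the argument, stdin as context
git diff | mods "write a one-line summary of this change"
```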
There are also wrappers that place the command directly in your terminal prompt if you run `shelp-c`