We've stuffed a lot of cool features into Windsurf: a super fast autocomplete model and an inline diff generation experience that feels truly native. But we're most proud of Cascade, which is an evolution of the sidebar chat experience that many other extensions have. Cascade can perform deep reasoning on your existing codebase, has access to a vast array of tools that let it run terminal commands and find relevant files, and it's aware of all the actions the user has taken independently of invoking the AI. (You can, for example, start implementing a change manually and just ask Cascade to "continue".)
We've been using Cascade internally at Codeium on our actual production codebase, and we're getting real value from it. We hope everyone here does too! You can find a bunch of demos of Cascade on our website, but I want to show one that I made myself, using Cascade to solve an interesting cryptography challenge:
Cascade was able to explain the problem to me, install some libraries needed to interact with the challenge, give me some pointers towards a solution, and implement an attack I described to it, all on its own.
The ctrl-shift-i for "inline chat, sort of", docstring generation, control over context... I dunno, a couple of small details that make it a little better.
I don't know what model they use, but it's quite fast, and I don't personally notice an "IQ penalty", although I'm sure there is one.
https://latent.space/p/enterprise
Yes, they started with "another Copilot" and had one of the best years in enterprise AI for code this year.
Here they are starting with "another Cursor".
see the pattern?
How can it tackle complex tasks independently if it is completely in sync with the user every step of the way?
The marketing copy seems to promise contradictory properties.
Today I had it build me an Angular component I could use to recognize and decode barcodes through the device camera. It did an excellent job and worked with just a little coaxing. I’m impressed.
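For reference, the kind of component I asked for looks roughly like the sketch below. This is my own minimal version using the Chromium-only BarcodeDetector API; the selector, polling interval, and format list are assumptions on my part, not necessarily what Windsurf generated (its version may well have used a library such as @zxing/browser instead):

    import { AfterViewInit, Component, ElementRef, EventEmitter, OnDestroy, Output, ViewChild } from '@angular/core';

    // Minimal sketch: stream the device camera into a <video> element and poll
    // the experimental BarcodeDetector API for decoded values.
    @Component({
      selector: 'app-barcode-scanner',
      standalone: true,
      template: `<video #preview autoplay playsinline></video>`,
    })
    export class BarcodeScannerComponent implements AfterViewInit, OnDestroy {
      @ViewChild('preview') preview!: ElementRef<HTMLVideoElement>;
      @Output() decoded = new EventEmitter<string>();

      private stream?: MediaStream;
      private timer?: number;

      async ngAfterViewInit() {
        // Prefer the rear camera where one is available.
        this.stream = await navigator.mediaDevices.getUserMedia({
          video: { facingMode: 'environment' },
        });
        this.preview.nativeElement.srcObject = this.stream;

        // BarcodeDetector is not in the standard TS DOM typings yet.
        const detector = new (window as any).BarcodeDetector({
          formats: ['qr_code', 'ean_13', 'code_128'],
        });

        // Poll a few times per second and emit anything we decode.
        this.timer = window.setInterval(async () => {
          const barcodes = await detector.detect(this.preview.nativeElement);
          for (const b of barcodes) this.decoded.emit(b.rawValue);
        }, 300);
      }

      ngOnDestroy() {
        if (this.timer !== undefined) clearInterval(this.timer);
        this.stream?.getTracks().forEach((t) => t.stop());
      }
    }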
I don't know why you had to fork the Remote - SSH extension; please keep these basic settings. Without support for "remote.SSH.path", I can't connect to my server using PuTTY. (Please refer to https://github.com/MarkusDeutschmann/ssh2plink)
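For anyone unfamiliar, in the upstream Remote - SSH extension this is a single user setting that points the extension at an alternative ssh binary (the ssh2plink wrapper in my case). The path below is just an example, not my actual install location:

    // settings.json (VS Code / upstream Remote - SSH)
    {
      // Point the extension at a drop-in ssh replacement, e.g. the ssh2plink
      // wrapper, so connections go through PuTTY/plink under the hood.
      "remote.SSH.path": "C:\\path\\to\\ssh2plink.exe"
    }

Windsurf's fork ignores this setting, which is the whole problem.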
First, the chat text area seems very small (I want to tell the AI my specs, but I can't input many lines). So I just asked it to write a "folder organizer" in Go. It's an easy example, but it exceeded my expectations. https://imgur.com/a/uKDwRx6 I think it's great, and I'd love to try it more this weekend.
Cody and Double are pretty good, and I'll switch to them sometimes.
Tops this? https://openagi.xyz/
After about an hour with Windsurf, I find myself frustrated with how it deals with context. If you add a directory to your Cascade, it's reluctant to actually read all the files in the directory.
I understand that they don't want to pay for a ton of long-context queries, but please, let users control the context, and pass the costs to the user.
It's very annoying to have the LLM try to create a file that already exists just because it didn't know about it.
Also, the comments about terminal management reflect a real issue. One solution is to expose the Cascade terminal to the user, letting them get the terminal into a working state so that it has access to the correct dependencies and the PATH is properly sourced.
It gets easily confused and can't troubleshoot or understand much of the environment. It's good for creating one or two HTML pages, or something that can be done within one or two functions.
Don't expect or depend on it for anything serious. Even for experienced folks, it's tough to get it to do some things.
Can see how the AI is going to take over the world!
Good for learning though. Sorry for sounding arrogant, but that's the reality.
I have subscriptions to most of these tools, but each is only as good as the user. It can't and won't replace anyone or take over unless the user doesn't want to learn, adopt, or become more efficient.