I might be doing LLMs wrong, but I just can't get how people actually do something non-trivial just by vibe coding. And it's not like I'm an old fart either; I'm a university student.
This is an article that describes a pretty good approach for that: https://getstream.io/blog/cursor-ai-large-projects/
But do skip (or at least significantly postpone) enabling the 'YOLO mode' (sigh).
That said, I absolutely love being aided by LLMs for my day-to-day tasks. I'm much more efficient when studying, and they can be a game changer when you're stuck and don't know how to proceed. You can discuss different implementation ideas as if you had a colleague: perhaps not a PhD-smart one, but still someone with quite deep knowledge of everything.
But it's no miracle. That's the issue I have with the way the idea of AI is sold to the C-suites and the general public.
All I can say to this is fucking good!
Let's imagine we had gotten AGI at the start of 2022. I'm talking about human-level-plus AI, as good as you at coding and reasoning, that runs well on the hardware of that era.
What would the world look like today? Would you still have your job? Would the world be in total disarray? Would unethical companies quickly fire most of their staff and replace them with machines? Would there be mass riots in the streets by starving neo-Luddites? Would automated drones be shooting at them?
Simply put, people and our social systems are not ready for competent machine intelligence or for how fast it will change the world. We should feel lucky we're getting a ramp-up period, and hopefully one that draws out a while longer.