LLMs are more tech demo than product right now, and it could take many years for their full impact to become apparent.
Undoubtedly.
The economic impact of some actually useful tools (Cursor, Claude) is propping up hundreds of billions of dollars in funding for, idk, "AI for <pick an industry>" or "replace your <job title> with our AI tool"
> I think we will be there in three to six months, where AI is writing 90% of the code. And then, in 12 months, we may be in a world where AI is writing essentially all of the code
https://www.businessinsider.com/anthropic-ceo-ai-90-percent-...
This seems either wildly optimistic, or it comes with a giant asterisk: the AI will write the code by predicting tokens, and a human will then have to double-check and refine it.
Again: we had parts of one of our 3 datasets split across ~40 files, and we had to manipulate and save them before doing anything else. A colleague asked ChatGPT to write the code to do it, and what it produced was single-threaded and not feasible. I pulled up htop, and on seeing it was using only one core, I suggested she ask ChatGPT to make the conversion process different files in different threads, and we went from absolutely slow to quite fast. But that assumed the person using the code knows what's going on, why, what is not going on, and when it's possible to do something differently. Using it without asking yourself about the context is a terrible use imho, but it's absolutely the direction I see us headed in, and I'm not a fan of it.
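A minimal sketch of the kind of change described above, using Python's standard `concurrent.futures`. The file names and the `convert` body are placeholders, since the actual conversion isn't shown in the comment:

```python
from concurrent.futures import ThreadPoolExecutor

def convert(path):
    # Placeholder for the real per-file manipulate-and-save step.
    return path.upper()

def convert_all(paths, workers=8):
    # Each file is independent, so conversions can run concurrently
    # instead of one after another on a single core.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(convert, paths))

print(convert_all(["part01.csv", "part02.csv"]))
```

Note that if the per-file work is CPU-bound pure Python, threads are limited by the GIL and `ProcessPoolExecutor` (same API) is the better fit; threads help when the work is I/O-bound or releases the GIL (e.g. NumPy).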
I've seen people who have trouble manipulating boolean tables of 3 variables in their head trying to generate complete web applications. It will work for linear duties (input -> processing -> storage), but I highly doubt they will be able to understand anything with 2nd-order effects.
No one really knows exactly how AI will play out. There's a lot of motivated reasoning going on, both from hypesters and cynics.
Microsoft alone has half a trillion dollars in assets, and Apple/Google/Meta/Amazon are in similar financial positions. Spending a few tens of billions on datacenters is, as crazy as it sounds, nothing to them.
That said, private companies are pumping a lot of money into this space, but technological progress is a peaks-and-valleys situation. I imagine most of the money will ultimately move the needle very little, spent chasing things whose value will look dubious in hindsight.
We don't have a choice but to pay attention to what CEOs say; they are the ones who might lead our companies off a cliff if we let them.
I won't deny that LLMs can be useful--I still use them--but in my experience an LLM's success rate in writing working code is somewhere around 50%. That leads to a productivity boost that, while not negative, isn't anywhere near the wild numbers that are bandied about.
To be fair, 3 booleans (2^3=8) is more working memory than most people are productive with. Way more if they’re nullable :)
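The arithmetic in that parenthetical can be checked directly; a trivial sketch, modeling "nullable" as a third state:

```python
from itertools import product

bools = [False, True]
tristate = [None, False, True]  # nullable booleans

# 3 plain booleans: 2^3 combinations to hold in your head.
print(len(list(product(bools, repeat=3))))     # 8

# 3 nullable booleans: 3^3 combinations.
print(len(list(product(tristate, repeat=3))))  # 27
```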