- Apollo program: 4%
- Railroads: 6% (mentioned by the author)
- Covid stimulus: 27%
- WW2 defense: 40%
- 40% of long-distance ton-miles in the US travel by rail. That represents a VAST part of the economic activity within the country.
- A literal plague, and the cessation of much economic activity, with the goal of avoiding a total collapse.
- ...Come on.
So we're comparing these earth-shaking changes and reactions to crisis with "AI"? Other than the people selling pickaxes and hookers to the prospectors, who is getting rich here exactly? What essential economic activity is AI crucial to? What war is it fighting? It mostly seems to be a toy that costs FAR more than it could ever hope to make, subsidized by some obscenely wealthy speculators, executives fantasizing about savings that don't materialize, and a product looking for a purpose commensurate to the resources it eats.
The continued devaluing of skilled labor, and making smaller pools of workers able to produce at higher levels, if not automating them away entirely.
And yeah, AI-generated code blows. It's verbose and inefficient. So what? The state of mainstream platform web development has been a complete shit show since roughly 2010. Websites for a decade plus just... don't load sometimes. Links don't resolve right, you get stuck in a never-ending spinning loading wheel, stuff just doesn't render or it crashes the browser tab entirely. That's been the status quo for Facebook, YouTube, Instagram, and fuck knows how many desktop apps that are just web wrappers around websites, for... like I said, over a decade at this point. Nobody even bats an eye.
I don't see how ChatGPT generating all the code is going to make anything substantively worse than hundreds of junior devs educated at StackOverflow university with zero oversight already have.
Literally every profession around me is radically changing due to AI. Legal, tech, marketing, etc. have adopted AI faster than any technology I have ever witnessed.
I’m gobsmacked you’re in denial.
But then we saw the same thing with Crypto, tons of money poured into that, the Metaverse was going to be the next big thing! People who didn't see and accept that must not understand the obvious appeal...
Each of the 15 charts would have been a page of boilerplate + Python, and frankly there was a huge amount of interdisciplinary work that went into the hundreds of thought steps in the deep reasoning model. It would have taken days to fill in the gaps and finish the analysis. The new crop of deep reasoning models that can do iteration is powerful.
The gap between previous "scratch work" of poking around a spreadsheet, and getting pages of advanced data analytics tabula rasa, is a gap so large I almost don't have words for it. It often seems larger than the gap between pen and paper and a computer.
And then later, off work, I wanted to show real average post-inflation returns for housing in areas that gentrify and compare them with non-gentrifying areas. Within a minute, all of the hard data was pulled in and summed up. It then coded up a graph of the typical "shape of gentrification", which I didn't even need to clarify to get a good answer. Again, this is as large a jump as moving from an encyclopedia to an internet search engine.
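For readers unfamiliar with what "real post-inflation returns" means here: the nominal price change gets deflated by a price index like CPI. A minimal sketch of that arithmetic, with entirely made-up numbers (the comment's actual figures and data sources are not given):

```python
# Hedged sketch: the basic deflation arithmetic behind "real,
# post-inflation returns". All numbers below are hypothetical.

def real_return(nominal_start, nominal_end, cpi_start, cpi_end):
    """Deflate a nominal price change by CPI to get the real return."""
    nominal_growth = nominal_end / nominal_start
    inflation = cpi_end / cpi_start
    return nominal_growth / inflation - 1.0

# Hypothetical neighborhood: home price $300k -> $520k while CPI rose 218 -> 300.
r = real_return(300_000, 520_000, 218.0, 300.0)
print(f"real cumulative return: {r:.1%}")
```

This is only the per-area building block; the comparison the commenter describes would repeat it across gentrifying and non-gentrifying areas and plot the results.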
I know it's used all over finance though. At Jane Street (upper-echelon proprietary trading) they have it baked into their code development at multiple layers, in actually useful ways, not the "autocompletion" of mass-market tools. Well, it is integrated into the editor and can generate code, but there is also an AI that screens all of the code that is submitted, and an AI "director" that tracks all of the code changes from all of the developers. So if a program starts failing on an edge case that wasn't apparent earlier, the director can reverse engineer the code commits, find where the dev went wrong, and explain it.
Then that data generated from all of the engineers and AI agents is fed back into in-house AI model training, which then feeds back into improvements in the systems above.
All of the dismissiveness reminds me of the early days of the internet. On that note, this suite of technologies seems large. Somewhere in-between the introduction of the digital office suite (word/excel/etc) and perhaps the Internet itself. In some respects, when it comes to the iterative nature of it all (which often degrades to noise if mindlessly fed back into itself, but in time will be honed to, say, test thousands of changes to an engineering Digital Twin) it seems like something that may be more powerful than both.
I refuse to believe this will not have long term consequences.
I WISH that after this, companies would put up quality guardrails and basically offer the same product 60% cheaper at better quality, but I don't trust companies.