
The AI Investment Boom

(www.apricitas.io)
271 points | m-hodges | 5 comments
uxhacker ◴[] No.41896287[source]
It feels like there’s a big gap in this discussion. The focus is almost entirely on GPU and hardware investment, which is undeniably driving a lot of the current AI boom. But what’s missing is any mention of the software side and the significant VC investment going into AI-driven platforms, tools, and applications. This story could be more accurately titled ‘The GPU Investment Boom’ given how heavily it leans into the hardware conversation. Software investment deserves equal attention.
replies(5): >>41896362 #>>41896430 #>>41896473 #>>41896523 #>>41904605 #
aurareturn ◴[] No.41896523[source]
I think GPUs and datacenters are to AI what fiber was to the dotcom boom.

A lot of LLM-based software is uneconomical because we don't have enough compute and electricity for what it's trying to do.

replies(2): >>41896674 #>>41900192 #
bee_rider ◴[] No.41896674[source]
The actual physical fiber was useful after the companies popped though.

GPUs are different: unless things go very poorly, these GPUs should be pretty much obsolete after 10 years.

The ecosystem for GPGPU software and the ability to design and manufacture new GPUs might be like fiber. But that is different because it doesn't become a useful thing at rest; it only works while Nvidia (or some successor) is still running.

I do think that ecosystem will stick around. Whatever the next thing after AI is, I bet Nvidia has a good enough stack at this point to pivot to it. They are the vendor for these high-throughput devices: CPU vendors will never keep up with their ability to just go wider, and coders are good enough nowadays to not need the crutch of lower latency that CPUs provide (well actually we just call frameworks written by cleverer people, but borrowing smarts is a form of cleverness).

But we do need somebody to keep releasing new versions of CUDA.
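
To make the "we just call frameworks written by cleverer people" point concrete, here is a minimal sketch (assuming PyTorch is installed and a CUDA-capable GPU is present, neither of which the comment specifies) of how a single high-level call hands the wide, throughput-heavy work to GPU kernels someone else wrote:

    # Minimal sketch, assuming PyTorch; falls back to CPU if no CUDA GPU is found.
    import torch

    device = "cuda" if torch.cuda.is_available() else "cpu"

    # Picking the device is the only "GPU programming" the caller does; the
    # matmul below dispatches to vendor-written kernels under the hood.
    a = torch.randn(4096, 4096, device=device)
    b = torch.randn(4096, 4096, device=device)
    c = a @ b

    print(f"ran on {device}, result shape {tuple(c.shape)}")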

replies(2): >>41896722 #>>41903351 #
1. Wheatman ◴[] No.41903351[source]
>GPUs are different, unless things go very poorly, these GPUs should be pretty much obsolete after 10 years.

Not really, at least for gaming. Especially here in third world countries, old or abandoned GPUs are basically all that is available, for anything from gaming to even video editing.

Considering how many great new games are being made (and with the news of Nvidia drivers possibly becoming easily available on Linux), and with tech becoming more available and useful in these places, I expect a somewhat considerable increase in demand for GPUs here in, say, Africa or Southeast Asia.

It probably won't change the world or the US economy, but it would probably make me quite happy if the bubble were to burst, even as a supporter of AI in research and cancer detection.

replies(1): >>41904080 #
2. johnnyanmac ◴[] No.41904080[source]
Not really sure it'd do much. Old GPUs mean you're playing old games. That was fine in a time when consoles stalled the minimum-spec market for 8+ years, but this decade left that behind. I imagine that very few high- or even mid-end games made in 2030 would really be functional on any 2020 hardware.

So the games that do work are probably out of support anyway, and there's no money being generated for anyone.

replies(2): >>41904594 #>>41904957 #
3. Wheatman ◴[] No.41904594[source]
True, hardware requirements tend to increase, but what says we aren't reaching another plateau already, especially with the newest consoles only recently released? Not to mention that these data centers tend to run the latest GPUs. So depending on when the bubble bursts (I'm guessing around 2026, 2027, or later, given the current USA election and the fact that most of these data centers are located there), it wouldn't be off to say that a cutting-edge RTX9999 GPU from 2026 could run a 2032 game on medium or maybe high settings quite well.

I'm more than happy to play 2010 to 2015 games right now at low settings; it would be even better to play games that are only 5 years old rather than 10.

The same can be said for rendering, professional work, and running servers: something is better than nothing, and most computers here don't even have a separate GPU and opt for integrated graphics.

4. Kurtz79 ◴[] No.41904957[source]
I’m not sure how this decade is any different than the one that preceded it?

The current console generation is 4 years old and it’s at mid-cycle at best.

Games running on modern consoles are only marginally better visually than those of the previous generation, and AAA titles are so expensive to develop that consoles will still be the target hardware.

I really can't be bothered to upgrade my 3080…

Have I missed a new “Crysis”?

replies(1): >>41905345 #
5. johnnyanmac ◴[] No.41905345{3}[source]
>I’m not sure how this decade is any different than the one that preceded it?

In 2010, your game had to run on a PS3/Xbox 360. That didn't matter much for PC games because all three platforms had different architectures, so development was more or less done in parallel.

By 2015, PlayStation and Xbox had both converged on x86. Porting between platforms became much easier and more unified in many ways. But the big "mistake" (or benefit, for your case) is that the PS4/XBO did not really try to "future proof" the way consoles usually did: a 2013 $400-500 PC build could run games about as well as a console. From here PCs would only pull ahead.

In 2020, the PS5/XBX came out at the very end of the year, so games were still more or less stuck with the PS4/XBO as the "minimum spec", but PCs had advanced a lot. SSDs became standard, tech like DLSS and hardware ray tracing was emerging, 60fps was becoming more normalized, and RAM standards were starting to shift to 16GB over 8. But your minimum spec couldn't use any of that, so we still needed to target 2013 tech. Despite the "pro versions" releasing, most games still ran adequately on the base models, just not at 60fps or over 720p internal rendering.

Now, in 2025, PlayStation has barely tapped into the base model's power and is instead already releasing a Pro model. Instead of optimization, Sony wants to throw more hardware at the problem. The Xbox Series S should, in theory, have capped the minimum spec, but we have several high-profile titles opting out of that requirement.

The difference is happening in real time. There's more and more of a trend not to optimize (or at least to push the minimum spec to a point where the base models are only lightly considered, a la launch Cyberpunk), and all this will push up specs quite a bit in the PC market as a result. The console market always influences how PCs are targeted, and the console market in Gen 9 seems to be taking a lot less care with the low end than Gen 8 did. That worries me from a "they'll support 10-year-old hardware" POV.

>Have I missed a new “Crysis”?

If anything, Cyberpunk was the anti-Crysis in many ways: it kind of showed how we were past the "current gen" back then, but it also showed how haphazardly they disregarded older platforms for lack of proper development time and care, not because the game was "ahead of its time". It's not like the PS5 performance was amazing to begin with; it was just passable.

Specs are going up, but not for the right reasons IMO. I blame 4K marketing for a good part of this, as opposed to focusing on using the huge jump in hardware for more game features, but that's for another rant.