
The AI Investment Boom

(www.apricitas.io)
271 points | m-hodges
uxhacker ◴[] No.41896287[source]
It feels like there’s a big gap in this discussion. The focus is almost entirely on GPU and hardware investment, which is undeniably driving a lot of the current AI boom. But what’s missing is any mention of the software side and the significant VC investment going into AI-driven platforms, tools, and applications. This story could be more accurately titled ‘The GPU Investment Boom’ given how heavily it leans into the hardware conversation. Software investment deserves equal attention.
replies(5): >>41896362 #>>41896430 #>>41896473 #>>41896523 #>>41904605 #
aurareturn ◴[] No.41896523[source]
I think GPUs and datacenters are to AI what fiber was to the dotcom boom.

A lot of LLM-based software is uneconomical because we don't have enough compute and electricity for what it's trying to do.

replies(2): >>41896674 #>>41900192 #
bee_rider ◴[] No.41896674[source]
The actual physical fiber was still useful after the companies popped, though.

GPUs are different: unless things go very poorly, these GPUs should be pretty much obsolete after 10 years.

The ecosystem for GPGPU software and the ability to design and manufacture new GPUs might be like fiber. But that is different, because it doesn’t become a useful thing at rest; it only works while Nvidia (or some successor) is still running.

I do think that ecosystem will stick around. Whatever the next thing after AI is, I bet Nvidia has a good enough stack at this point to pivot to it. They are the vendor for these high-throughput devices: CPU vendors will never keep up with their ability to just go wider, and coders are good enough nowadays not to need the crutch of lower latency that CPUs provide (well, actually, we just call frameworks written by cleverer people, but borrowing smarts is a form of cleverness).
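
To make the "borrowing smarts" point concrete, here is a minimal sketch (assuming PyTorch built with CUDA support; the function name and matrix size are just illustrative) of how one framework call rides the GPU's width without the coder ever thinking about warps or latency:

    # Minimal sketch: one high-level call dispatches to cuBLAS on the GPU.
    # Assumes PyTorch with a CUDA build; falls back to CPU-only timing.
    import time
    import torch

    def bench_matmul(device, n=4096):
        a = torch.randn(n, n, device=device)
        b = torch.randn(n, n, device=device)
        if device == "cuda":
            torch.cuda.synchronize()  # finish setup before timing
        start = time.perf_counter()
        c = a @ b  # the "borrowed smarts": cuBLAS does the wide part
        if device == "cuda":
            torch.cuda.synchronize()  # kernels launch async; wait for them
        return time.perf_counter() - start

    print("cpu :", bench_matmul("cpu"))
    if torch.cuda.is_available():
        print("cuda:", bench_matmul("cuda"))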

But we do need somebody to keep releasing new versions of CUDA.

replies(2): >>41896722 #>>41903351 #
Wheatman ◴[] No.41903351[source]
>GPUs are different, unless things go very poorly, these GPUs should be pretty much obsolete after 10 years.

Not really, at least for gaming. Especially here in third-world countries, old or abandoned GPUs are basically all that is available, for anything from gaming to video editing.

Considering how many great new games are being made (and with the news that Nvidia drivers may become easily available on Linux), and with tech becoming more available and useful in these places, I expect a somewhat considerable increase in GPU demand here in Africa or in Southeast Asia.

It probably won't change the world or the US economy, but it would probably make me quite happy if the bubble were to burst, even as a supporter of AI in research and cancer detection.

replies(1): >>41904080 #
johnnyanmac ◴[] No.41904080[source]
Not really sure it'd do much. Old GPUs mean you're playing old games. That was fine in a time when consoles stalled the minimum-spec market for 8+ years, but this decade left that behind. I imagine very few high-end or even mid-range games made in 2030 would really be functional on any 2020 hardware.

So the games that still work are probably out of support anyway, and there's no money being generated for anyone.

replies(2): >>41904594 #>>41904957 #
Wheatman ◴[] No.41904594[source]
True, hardware requirements tend to increase, but who says we aren't reaching another plateau already, especially with the newest consoles only recently released? Not to mention that these data centers tend to run the latest GPUs. So depending on when the bubble bursts (I'm guessing around 2026, 2027, or later, given the current election in the USA, where most of these data centers are located), it wouldn't be off to say that a cutting-edge RTX 9999 GPU from 2026 could run a 2032 game quite well on medium or maybe high settings.

I'm more than happy to play 2010-2015 games right now at low settings; it would be even better to play games that are 5 years behind rather than 10.

The same can be said for rendering, professional work, and building servers: something is better than nothing, and most computers here don't even have a discrete GPU, opting for integrated graphics instead.