    The AI Investment Boom

    (www.apricitas.io)
    271 points by m-hodges | 14 comments
    uxhacker ◴[] No.41896287[source]
    It feels like there’s a big gap in this discussion. The focus is almost entirely on GPU and hardware investment, which is undeniably driving a lot of the current AI boom. But what’s missing is any mention of the software side and the significant VC investment going into AI-driven platforms, tools, and applications. This story could be more accurately titled ‘The GPU Investment Boom’ given how heavily it leans into the hardware conversation. Software investment deserves equal attention.
    replies(5): >>41896362 #>>41896430 #>>41896473 #>>41896523 #>>41904605 #
    1. aurareturn ◴[] No.41896523[source]
    I think GPUs and datacenters are to AI what fiber was to the dotcom boom.

    A lot of LLM-based software is uneconomical because we don't have enough compute and electricity for what it's trying to do.

    replies(2): >>41896674 #>>41900192 #
    2. bee_rider ◴[] No.41896674[source]
    The actual physical fiber was useful after the companies popped though.

    GPUs are different: unless things go very poorly, these GPUs should be pretty much obsolete after 10 years.

    The ecosystem for GPGPU software and the ability to design and manufacture new GPUs might be like fiber. But that is different because it doesn’t become a useful thing at rest; it only works while Nvidia (or some successor) is still running.

    I do think that ecosystem will stick around. Whatever the next thing after AI is, I bet Nvidia has a good enough stack at this point to pivot to it. They are the vendor for these high-throughput devices: CPU vendors will never keep up with their ability to just go wider, and coders are good enough nowadays to not need the crutch of lower latency that CPUs provide (well actually we just call frameworks written by cleverer people, but borrowing smarts is a form of cleverness).

    But we do need somebody to keep releasing new versions of CUDA.
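
    To make the "go wider" point above concrete, here is a minimal sketch (an illustration, not part of the original comment) of the CUDA model being described: instead of one fast core looping over an array, you launch thousands of lightweight threads that each handle one element. It assumes a CUDA-capable GPU and nvcc; the kernel name scale_add and the numbers are made up for the example.

        // Minimal "go wider" sketch: one thread per element instead of one core looping.
        #include <cstdio>
        #include <cuda_runtime.h>

        __global__ void scale_add(const float *x, const float *y, float *out, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
            if (i < n) out[i] = 2.0f * x[i] + y[i];         // each thread handles one element
        }

        int main() {
            const int n = 1 << 20;
            float *x, *y, *out;
            cudaMallocManaged(&x, n * sizeof(float));       // unified memory, for brevity
            cudaMallocManaged(&y, n * sizeof(float));
            cudaMallocManaged(&out, n * sizeof(float));
            for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

            int threads = 256;
            int blocks = (n + threads - 1) / threads;       // enough blocks to cover all n elements
            scale_add<<<blocks, threads>>>(x, y, out, n);   // launch ~1M threads at once
            cudaDeviceSynchronize();

            printf("out[0] = %f\n", out[0]);                // expect 4.0
            cudaFree(x); cudaFree(y); cudaFree(out);
            return 0;
        }

    The CPU equivalent is a serial (or modestly vectorized) loop; the GPU version trades single-thread latency for sheer width, which is the trade-off described above.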

    replies(2): >>41896722 #>>41903351 #
    3. aurareturn ◴[] No.41896722[source]
    But computer chips have always had a limited useful life, because newer chips are simply faster and more efficient. The datacenter build-outs and the increase in electricity capacity will always be useful.
    4. jillesvangurp ◴[] No.41900192[source]
    I think a better analogy is the valuation of Intel vs. that of Microsoft. For a long time, Intel dominated the CPU market, so you'd expect them to be the most valuable company. Instead, a small startup called Microsoft started dominating the software market and eventually became the most valuable company on the planet. The combined value of the software market is orders of magnitude larger than that of all chip makers combined, and has been for quite some time. The only reason people buy hardware is to use software.

    The same is going to happen with AI. Yes, Nvidia and their competitors are going to do well. But most of the value will be in software ultimately.

    GPUs and data centers are just infrastructure. Same for the electricity generation needed to power all that. The demand for AI is creating a lot of demand for that stuff, and that's driving the cost of all of it down. The cheapest way to add power generation is wind and solar, and both are dominating new power generation additions. Chip manufacturers are very busy making better, cheaper, faster etc. chips. They are getting better rapidly. It's hard to see how Nvidia can dominate this market indefinitely.

    AI is going to be very economical long term. Cheap chips. Cheap power. Lots of value. That's why all the software companies are busy ramping up their infrastructure. IMHO investing in expensive nuclear projects is a bit desperate. But I can see the logic of not wanting to fall behind for the likes of Amazon, Google, MS, Apple, etc. They can sure afford to lose some billions and it's probably more important to them to have the power available quickly than to get it cheaply.

    replies(4): >>41900408 #>>41900419 #>>41901103 #>>41902728 #
    5. datavirtue ◴[] No.41900408[source]
    A lot of it is greenwashing. Some of it might be signals to the competition or other vendors. This AI buildup is going to be a knock-down-drag-out.
    6. p1esk ◴[] No.41900419[source]
    Nvidia is a software company that also happens to make decent hardware. People buy their hardware because of their software. And it’s not just CUDA. Nvidia is building a whole bunch of potentially groundbreaking software products. Listen to the last GTC keynote to learn more about it.
    7. Slartie ◴[] No.41901103[source]
    Nuclear power is probably the least suitable of all the power sources when the goal is to "have power available quickly".

    There is no kind of power plant that takes longer to build than a nuclear plant.

    8. throw234234234 ◴[] No.41902728[source]
    > But most of the value will be in software ultimately.

    Isn't one of the points of AI to democratize the act of writing software? AI isn't like other software inventions, which make a product from someone's intelligence - long term it's providing the raw intelligence itself. I mean, we have NVDA's CEO saying not to learn to code, and a lot of non-techies quoting him these days.

    If this is true the end effect is to destroy all value moats in the software layer from an economic perspective. Software just becomes a cheap tool which enables mostly other industries.

    So if there isn't long-term value in the hardware (as you are pointing out), and there isn't long-term value in the software due to no barriers to entry - where does the value of all of this economic efficiency improvement accrue?

    I suspect large, old, stale corporations with large workforces and moats outside of technology (i.e. physical and/or social moats), not threatened by AI, who can empower their management class by replacing skilled labor (e.g. software devs, accountants) and semi-skilled labor (e.g. call centre operators) with AI. The decision makers in privileged positions behind these moats, rather than the doers, will win out.

    replies(1): >>41905217 #
    9. Wheatman ◴[] No.41903351[source]
    >GPUs are different, unless things go very poorly, these GPUs should be pretty much obsolete after 10 years.

    Not really, at least for gaming. Especially here in third-world countries, old or abandoned GPUs are basically all that is available, for anything from gaming to video editing.

    Considering how many great new games are being made (and with the news of Nvidia drivers possibly becoming easily available on Linux), and with tech becoming more available and useful in these places, I expect a fairly considerable increase in demand for GPUs in, say, Africa or Southeast Asia.

    It probably won't change the world or the US economy, but it would probably make me quite happy if the bubble were to burst, even as a supporter of AI in research and cancer detection.

    replies(1): >>41904080 #
    10. johnnyanmac ◴[] No.41904080{3}[source]
    Not really sure if it'd do much. Old GPUs mean you're playing old games. That was fine in a time when consoles stalled the minimum-spec market for 8+ years, but this decade left that behind. I imagine that very few high-end or even mid-range games made in 2030 would really be functional on any 2020 hardware.

    So the games that do work are probably out of support anyway, which means there's no money being generated for anyone.

    replies(2): >>41904594 #>>41904957 #
    11. Wheatman ◴[] No.41904594{4}[source]
    True, hardware requirements tend to increase, but what says we aren't reaching another plateau already, especially with the newest consoles only recently being released? Not to mention that these data centers tend to run the latest GPUs. So depending on when the bubble bursts (I'm guessing around 2026, 2027, or later because of the current US election; that's where most of these data centers are located), it wouldn't be off to say that a cutting-edge RTX9999 GPU from 2026 could run a 2032 game on medium or maybe high settings quite well.

    I'm more than happy to play 2010-2015 games right now at low settings; it would be even better to play games that are 5 years away rather than 10.

    The same can be said for rendering, professional work, and building servers: something is better than nothing, and most computers here don't even have a separate GPU and opt for integrated graphics.

    12. Kurtz79 ◴[] No.41904957{4}[source]
    I’m not sure how this decade is any different than the one that preceded it?

    The current console generation is 4 years old and it’s at mid-cycle at best.

    Games running on modern consoles are visually marginally better than those in the previous generation, and AAA titles are so expensive to develop that consoles will still be the target HW.

    I really could not be bothered to update my 3080…

    Have I missed a new “Crysis”?

    replies(1): >>41905345 #
    13. n_ary ◴[] No.41905217{3}[source]
    > I mean, we have NVDA's CEO saying not to learn to code, and a lot of non-techies quoting him these days.

    Simply planting the seed of ignorance for generations to come. If people do not learn, they need someone/something to produce this knowledge, and who better than the gold mine (AI) to supply it? Also, as long as the cryptocurrency and AI booms go on, shovel sellers (i.e. NVDA) stand to profit, so it is in their best interest to run the sales pitch.

    Also, once people think that all is gone and the future is bleak, they will not learn and generate novel ideas and innovations, so all knowledge, research, and innovation will slowly get locked away behind paywalls, accessible only to the select few who can afford the knowledge and access needed to wield the AI tech. Think of the internet of years gone by minus all the open and free knowledge, all the OSS, all the passionate people contributing and sharing. Now replace that with the course sites where you must pay to get access to anything decent, and replace the courses with AI.

    At best, I see all of this as feeding the fear and the laziness needed to kill off expensive knowledge and the sharing culture, because if that is achieved, AI becomes the de facto product you need to build automation and digitalization.

    14. johnnyanmac ◴[] No.41905345{5}[source]
    >I’m not sure how this decade is any different than the one that preceded it?

    In 2010, your game had to run on a PS3/Xbox 360. That didn't matter for PC games because all three had different architectures, so it was more or less parallel development.

    In 2015, PlayStation and Xbox had both converged on x86. Porting between platforms became much easier and more unified in many ways. But the big "mistake" (or benefit, for your case) is that the PS4/XBO did not really try to "future proof" the way consoles usually did. A 2013 $400-500 PC build could run games about as well as a console. From there, PCs would only grow.

    In 2020, the PS5/XBX came out at the very end, so games were still more or less stuck with the PS4/XBO as the "minimum spec", but PCs had advanced a lot. SSDs became standard, tech like DLSS and ray-traced rendering emerged from the hardware side, 60fps became more normalized, and RAM standards started shifting to 16GB over 8. But... the minimum spec couldn't use any of these, so we still needed to target 2013 tech. Despite the "pro versions" releasing, most games still ran adequately on the base models - just not at 60fps or over 720p internal rendering.

    Now comes 2025. PlayStation has barely tapped into the base console's power and is instead releasing a pro model already. Instead of optimizations, Sony wants to throw more hardware at the problem. The Xbox Series S should, in theory, have limited the minimum spec, but we have several high-profile titles opting out of that requirement.

    The difference is happening in real time. There's more and more of a trend to NOT optimize (or at least to push the minimum spec to a point where the base models are only lightly considered, a la launch Cyberpunk), and all this will push up specs quite a bit in the PC market as a result. The console market always influences how PCs are targeted, and the console market in Gen 9 seems to be taking a lot less care with the low end than Gen 8. That worries me from a "they'll support 10-year-old hardware" POV.

    >Have I missed a new “Crysis”?

    If anything, Cyberpunk was the anti-Crysis in many ways: kind of showing how we were past the "current gen" back then, but also showing how haphazardly they disregarded older platforms for lack of proper development time/care, not because the game was "ahead of its time". It's not like the PS5 performance was amazing to begin with. Just passable.

    Specs are going up, but not for the right reasons IMO. I blame the 4K marketing for a good part of this, as opposed to focusing on utilizing the huge jump in hardware for more game features, but that's a rant for another time.