    Anthropic raises $13B Series F

    (www.anthropic.com)
    585 points meetpateltech | 12 comments
    llamasushi ◴[] No.45105325[source]
    The compute moat is getting absolutely insane. We're basically at the point where you need a small country's GDP just to stay in the game for one more generation of models.

    What gets me is that this isn't even a software moat anymore - it's literally just whoever can get their hands on enough GPUs and power infrastructure. TSMC and the power companies are the real kingmakers here. You can have all the talent in the world but if you can't get 100k H100s and a dedicated power plant, you're out.

    Wonder how much of this $13B is just prepaying for compute vs actual opex. If it's mostly compute, we're watching something weird happen - like the privatization of Manhattan Project-scale infrastructure. Except instead of enriching uranium we're computing gradient descents lol

    The wildest part is we might look back at this as cheap. GPT-4 training was what, $100M? GPT-5/Opus-4 class probably $1B+? At this rate GPT-7 will need its own sovereign wealth fund
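A rough back-of-envelope sketch of that cost curve. The $100M GPT-4 figure and the ~10x-per-generation growth rate are the comment's own guesses, not official numbers; the code just compounds them:

```python
# Extrapolate frontier-model training costs, assuming the comment's
# estimates: ~$100M for GPT-4 and roughly 10x growth per generation.
# All numbers are illustrative, not reported figures.
def projected_cost(base_cost_usd: float, growth_per_gen: float, generations: int) -> float:
    """Projected cost after `generations` further model generations."""
    return base_cost_usd * growth_per_gen ** generations

gpt4_cost = 100e6  # ~$100M, per the estimate above
growth = 10        # assumed ~10x per generation

for gen, name in enumerate(["GPT-4", "GPT-5", "GPT-6", "GPT-7"]):
    print(f"{name}: ~${projected_cost(gpt4_cost, growth, gen) / 1e9:.1f}B")
```

Under those (very shaky) assumptions, a GPT-7-class run lands around $100B — sovereign-wealth-fund territory indeed.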

    replies(48): >>45105396 #>>45105412 #>>45105420 #>>45105480 #>>45105535 #>>45105549 #>>45105604 #>>45105619 #>>45105641 #>>45105679 #>>45105738 #>>45105766 #>>45105797 #>>45105848 #>>45105855 #>>45105915 #>>45105960 #>>45105963 #>>45105985 #>>45106070 #>>45106096 #>>45106150 #>>45106272 #>>45106285 #>>45106679 #>>45106851 #>>45106897 #>>45106940 #>>45107085 #>>45107239 #>>45107242 #>>45107347 #>>45107622 #>>45107915 #>>45108298 #>>45108477 #>>45109495 #>>45110545 #>>45110824 #>>45110882 #>>45111336 #>>45111695 #>>45111885 #>>45111904 #>>45111971 #>>45112441 #>>45112552 #>>45113827 #
    AlexandrB ◴[] No.45107239[source]
    The whole LLM era is horrible. All the innovation is coming "top-down" from very well-funded companies - many of them tech incumbents, so you know the monetization is going to be awful. Since the models are expensive to run, it's all subscription-priced and has to run in the cloud, where the user has no control. The hype is insane, so usage is being pushed by C-suite folks who have no idea whether it's actually benefiting anyone "on the ground", and decisions about which AI to use are often made on the basis of existing vendor relationships. Basically it's the culmination of all the worst tech trends of the last 10 years.
    replies(12): >>45107334 #>>45107517 #>>45107684 #>>45107685 #>>45108349 #>>45109055 #>>45109547 #>>45109687 #>>45111383 #>>45112507 #>>45112534 #>>45114113 #
    1. dpe82 ◴[] No.45107517[source]
    In a previous generation, the enabler of all our computer tech innovation was the incredible pace of compute growth due to Moore's Law, which was also "top-down" from very well-funded companies since designing and building cutting edge chips was (and still is) very, very expensive. The hype was insane, and decisions about what chip features to build were made largely on the basis of existing vendor relationships. Those companies benefited, but so did the rest of us. History rhymes.
    replies(4): >>45107619 #>>45109790 #>>45112438 #>>45113939 #
    2. dmschulman ◴[] No.45107619[source]
    Eh, if this were true then IBM and Intel would still be the kings of the hill. Plenty of companies came from the bottom up out of nothing during the 90s and 2000s to build multi-billion-dollar companies that still dominate the market today. Many of those companies struggled for investment and grew over a long timeframe.

    The argument is that something like that is no longer really possible, given the absurd upfront investments existing AI companies apparently need in order to further their offerings.

    replies(2): >>45107682 #>>45107904 #
    3. dpe82 ◴[] No.45107682[source]
    Anthropic has existed for a grand total of 4 years.

    But yes, there was a window of opportunity when it was possible to do cutting-edge work without billions of investment. That window of opportunity is now past, at least for LLMs. Many new technologies follow a similar pattern.

    replies(1): >>45109680 #
    4. 3uler ◴[] No.45107904[source]
    Intel was king of the hill until 2018.
    replies(1): >>45111724 #
    5. falcor84 ◴[] No.45109680{3}[source]
    What about DeepSeek R1? That was earlier this year - how do you know that there won't be more "DeepSeek moments" in the coming years?
    6. JohnMakin ◴[] No.45109790[source]
    Should probably change this to "was the apparent incredible pace of compute growth due to Moore's Law," because even my basic CS classes from 15 years ago were teaching that it was drastically slowing down, and it isn't really a "law" so much as an observational trend that lasted a few decades. There are limits to how small you can make transistors, and we're not far from them - at least not far enough to keep yielding the results of that curve.
    replies(1): >>45110936 #
    7. noosphr ◴[] No.45110936[source]
    The corollary to Moore's law, that computers get twice as fast every 18 months, died by 2010. People who didn't live through the 80s, 90s, and early 00s, when you'd get a computer ten times as fast every 5 years, can't imagine what it was like back then.

    Today the only way to scale compute is to throw more power at it, or settle for the ~5% per year real single-core performance improvement.
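The two regimes in that comment are easy to put side by side. A small sketch, using the comment's own figures (18-month doubling vs. ~5%/year single-core gains, both illustrative):

```python
# Compare the old Moore's-law corollary (2x speed every 18 months)
# with ~5%/year single-core improvement. Figures are the comment's
# rough characterizations, not measured benchmarks.
def speedup(doubling_months: float, years: float) -> float:
    """Speedup from repeated doubling every `doubling_months` months."""
    return 2 ** (years * 12 / doubling_months)

def compound(rate: float, years: float) -> float:
    """Speedup from compounding `rate` improvement per year."""
    return (1 + rate) ** years

# The 80s/90s regime: 18-month doubling is exactly where
# "ten times as fast every 5 years" comes from.
print(f"18-month doubling over 5 years: {speedup(18, 5):.1f}x")  # ~10.1x
# Today's regime compounds far more slowly.
print(f"5%/year over 5 years:  {compound(0.05, 5):.2f}x")        # ~1.28x
print(f"5%/year over 10 years: {compound(0.05, 10):.2f}x")       # ~1.63x
```

So the "ten times as fast every 5 years" memory and the 18-month doubling rule are the same claim; the modern 5%/year rate doesn't even double over a decade.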

    8. BobbyTables2 ◴[] No.45111724{3}[source]
    “Bobby, some things are like a tire fire: trying to put it out only makes it worse. You just gotta grab a beer and let it burn.”

    – Hank Rutherford Hill

    9. BrenBarn ◴[] No.45112438[source]
    The difference is once you bought one of those chips you could do your own innovation on top of it (i.e., with software) without further interference from those well-funded companies. You can't do that with GPT et al. because of the subscription model.
    replies(1): >>45112812 #
    10. almogo ◴[] No.45112812[source]
    Yes you can? Sure, you can't run GPT-5 locally, but get your hands on a proper GPU and you can run some still very sophisticated local inference.
    replies(1): >>45122817 #
    11. HellDunkel ◴[] No.45113939[source]
    You completely forgot about the invention of the home computer. If we had all been logging into some mainframe using a home terminal, your assessment would be correct.
    12. BrenBarn ◴[] No.45122817{3}[source]
    You can do some, but many of them have license restrictions that prevent you from using them in certain ways. I can buy an Intel chip and deliberately use it to do things that hurt Intel's business (e.g., start a competing company). The big AI companies are trying very hard to make that kind of thing impossible by imposing constraints on the allowed uses of their models.