
Anthropic raises $13B Series F

(www.anthropic.com)
585 points | meetpateltech
llamasushi No.45105325
The compute moat is getting absolutely insane. We're basically at the point where you need a small country's GDP just to stay in the game for one more generation of models.

What gets me is that this isn't even a software moat anymore - it's literally just whoever can get their hands on enough GPUs and power infrastructure. TSMC and the power companies are the real kingmakers here. You can have all the talent in the world but if you can't get 100k H100s and a dedicated power plant, you're out.

Wonder how much of this $13B is just prepaying for compute vs actual opex. If it's mostly compute, we're watching something weird happen - like the privatization of Manhattan Project-scale infrastructure. Except instead of enriching uranium we're computing gradient descents lol

The wildest part is we might look back at this as cheap. GPT-4 training was what, $100M? GPT-5/Opus-4 class probably $1B+? At this rate GPT-7 will need its own sovereign wealth fund.
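That extrapolation is easy to sanity-check with quick arithmetic. A minimal sketch, assuming the commenter's own guesses (a ~$100M GPT-4-class run and a ~10x cost jump per generation; neither figure is confirmed):

```python
def training_cost(generation, base=100e6, multiplier=10):
    """Rough training-cost estimate for a GPT-<generation>-class model.

    Assumes a $100M GPT-4-class baseline and a 10x cost jump per
    generation -- both are guesses from the comment above, not
    confirmed numbers.
    """
    return base * multiplier ** (generation - 4)

for g in range(4, 8):
    print(f"GPT-{g}-class: ~${training_cost(g):,.0f}")
```

Under those assumptions a GPT-7-class run lands around $100B, which is indeed sovereign-wealth-fund territory.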

replies(48)
AlexandrB No.45107239
The whole LLM era is horrible. All the innovation is coming "top-down" from very well-funded companies - many of them tech incumbents, so you know the monetization is going to be awful. Since the models are expensive to run, it's all subscription-priced and has to run in the cloud, where the user has no control. The hype is insane, so adoption is being pushed by C-suite folks who have no idea whether it's actually benefiting anyone "on the ground", and decisions about which AI to use are often made on the basis of existing vendor relationships. Basically it's the culmination of all the worst tech trends of the last 10 years.
replies(12)
dpe82 No.45107517
In a previous generation, the enabler of all our computer tech innovation was the incredible pace of compute growth due to Moore's Law, which was also "top-down" from very well-funded companies since designing and building cutting edge chips was (and still is) very, very expensive. The hype was insane, and decisions about what chip features to build were made largely on the basis of existing vendor relationships. Those companies benefited, but so did the rest of us. History rhymes.
replies(4)
JohnMakin No.45109790
Should probably change this to "was the appearance of an incredible pace of compute growth due to Moore's Law," because even my basic CS classes from 15 years ago were teaching that it was drastically slowing down, and it isn't really a "law" so much as an observational trend that lasted a few decades. There are physical limits to how small you can make transistors, and we're not far from them - at least not from anything that would keep yielding the results of that curve.
replies(1)
noosphr No.45110936
The corollary to Moore's law - that computers get twice as fast every 18 months - died by around 2010. People who didn't live through the 80s, 90s, and early 00s, when you'd get a computer ten times as fast every 5 years, can't imagine what it was like back then.

Today the only way to scale compute is to throw more power at it, or settle for the roughly 5%-per-year real single-core performance improvement.
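The gap between those two regimes compounds dramatically. A small sketch, taking both rates from the comment above (2x every 18 months vs. 5% per year - illustrative figures, not measurements):

```python
def speedup(years, doubling_months=None, annual_rate=None):
    """Cumulative single-core speedup over `years`, under either a
    Moore's-law-style doubling period or a flat annual improvement
    rate. Both rates come from the comment above."""
    if doubling_months is not None:
        return 2 ** (years * 12 / doubling_months)
    return (1 + annual_rate) ** years

# Old regime: doubling every 18 months really does give ~10x in 5 years.
old = speedup(5, doubling_months=18)
# Today's regime, per the comment: ~5% real improvement per year.
new = speedup(5, annual_rate=0.05)

print(f"5 years at 2x/18mo: {old:.1f}x")  # ~10.1x
print(f"5 years at 5%/yr:  {new:.2f}x")   # ~1.28x
```

So the "ten times as fast every 5 years" memory checks out arithmetically, and the modern rate delivers barely a quarter more performance over the same span.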