
Anthropic raises $13B Series F

(www.anthropic.com)
585 points by meetpateltech | 3 comments
llamasushi
The compute moat is getting absolutely insane. We're basically at the point where you need a small country's GDP just to stay in the game for one more generation of models.

What gets me is that this isn't even a software moat anymore - it's literally just whoever can get their hands on enough GPUs and power infrastructure. TSMC and the power companies are the real kingmakers here. You can have all the talent in the world but if you can't get 100k H100s and a dedicated power plant, you're out.

Wonder how much of this $13B is just prepaying for compute vs actual opex. If it's mostly compute, we're watching something weird happen - like the privatization of Manhattan Project-scale infrastructure. Except instead of enriching uranium we're running gradient descent lol

The wildest part is we might look back at this as cheap. GPT-4 training was what, $100M? GPT-5/Opus-4 class probably $1B+? At this rate GPT-7 will need its own sovereign wealth fund
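
(A rough back-of-the-envelope sketch of the two claims above - the power draw of 100k H100s and the roughly 10x-per-generation jump in training cost. The 700 W per GPU and ~1.3 PUE figures are assumptions for illustration, not numbers from the thread.)

    # Back-of-the-envelope numbers for the comment above.
    # Assumptions (not from the thread): ~700 W per H100 SXM, a datacenter
    # PUE of ~1.3, and ~10x training-cost growth per model generation.

    GPU_COUNT = 100_000        # "100k H100s"
    WATTS_PER_GPU = 700        # H100 SXM TDP; GPUs only, excludes CPUs/networking
    PUE = 1.3                  # power usage effectiveness: facility load / IT load

    it_load_mw = GPU_COUNT * WATTS_PER_GPU / 1e6
    facility_mw = it_load_mw * PUE
    print(f"GPU load:      {it_load_mw:.0f} MW")    # ~70 MW
    print(f"Facility load: {facility_mw:.0f} MW")   # ~91 MW, power-plant scale

    # Cost extrapolation implied by "$100M for GPT-4, $1B+ for the next class":
    # roughly 10x per generation, purely illustrative.
    cost = 100e6
    for gen in ("GPT-4", "GPT-5 / Opus-4 class", "GPT-6 (?)", "GPT-7 (?)"):
        print(f"{gen:<22} ~${cost / 1e9:.1f}B")
        cost *= 10

Even with generous error bars, 100k GPUs lands in the tens-of-megawatts range, which is why "a dedicated power plant" isn't much of an exaggeration.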

AlexandrB
The whole LLM era is horrible. All the innovation is coming "top-down" from very well funded companies - many of them tech incumbents, so you know the monetization is going to be awful. Since the models are expensive to run, it's all subscription-priced and has to run in the cloud, where the user has no control. The hype is insane, so usage is being pushed by C-suite folks who have no idea whether it's actually benefiting anyone "on the ground", and decisions about which AI to use are often made on the basis of existing vendor relationships. Basically it's the culmination of all the worst tech trends of the last 10 years.
atleastoptimal
Nevertheless, prices for LLMs at any given level of performance have dropped precipitously over the past few years. However bad the decisions being made may seem, the process is both making an enormous amount of money for those inside the AI companies and providing extremely cheap, high-quality intelligence to those using their offerings.
pimlottc
Remember when you could get an Uber ride all the way across town for $5? It's way too early to know what these services will actually cost.
atleastoptimal
Is there an open source Uber? There are multiple open source AI models far beyond what SOTA was just a year ago. Even if they don't manage to drive prices down on the most recent closed models, the open models themselves will never cost more than a trivial margin above the compute they run on, and compute will only get more expensive if demand for AI keeps growing exponentially - growth that would likewise drive prices down through competitive pressure.
xigoi
> There are multiple open source AI models far beyond what SOTA was just 1 year ago.

There are many models that call themselves open source, but the source is nowhere to be found; only the weights are released.