
Anthropic raises $13B Series F

(www.anthropic.com)
585 points by meetpateltech
llamasushi No.45105325
The compute moat is getting absolutely insane. We're basically at the point where you need a small country's GDP just to stay in the game for one more generation of models.

What gets me is that this isn't even a software moat anymore - it's literally just whoever can get their hands on enough GPUs and power infrastructure. TSMC and the power companies are the real kingmakers here. You can have all the talent in the world but if you can't get 100k H100s and a dedicated power plant, you're out.

Wonder how much of this $13B is just prepaying for compute vs actual opex. If it's mostly compute, we're watching something weird happen - like the privatization of Manhattan Project-scale infrastructure. Except instead of enriching uranium we're computing gradient descents lol

The wildest part is we might look back at this as cheap. GPT-4 training was what, $100M? GPT-5/Opus-4 class probably $1B+? At this rate GPT-7 will need its own sovereign wealth fund
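
Back-of-envelope, taking those guesses at face value (~$100M for a GPT-4-class run and roughly 10x per generation are guesses, not published figures), a quick Python sketch:

    # Rough extrapolation, assuming ~10x training cost per generation.
    # The ~$100M GPT-4-class starting point is a guess, not a published number.
    cost = 100e6
    for gen in range(4, 8):
        print(f"GPT-{gen}-class run: ~${cost / 1e9:.1f}B")
        cost *= 10
    # -> ~$0.1B, ~$1.0B, ~$10.0B, ~$100.0B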

jayd16 No.45105619
In this imaginary timeline where initial investments keep increasing this way, how long before we see a leak shutter a company? Once the model is out, no one would pay for it, right?
wmf No.45105778
You can't run Claude on your PC; you need servers. Companies that have that kind of hardware are not going to touch a pirated model. And the next model will be out in a few months anyway.
jayd16 No.45106298
If it were worth it, you'd see some easy self-hostable package, no? And by definition, it's profitable to self-host, or these AI companies are in trouble.
tick_tock_tick No.45109892
You need 100+ GB of RAM and a top-of-the-line GPU to run legacy models at home. Maybe if you push it, that setup will let you handle two, maybe three people. You think anyone is going to make money on that vs $20 a month to Anthropic?
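
For a rough sense of the maths (every number here is a hypothetical assumption, not a real quote):

    # Hypothetical amortisation of a home rig vs a $20/month subscription.
    # All figures below are assumptions for illustration only.
    rig_cost = 2500            # assumed up-front hardware cost, dollars
    months = 36                # assumed three-year useful life
    users = 3                  # the "two, maybe three people" ceiling above
    per_user_monthly = rig_cost / months / users
    print(f"~${per_user_monthly:.0f}/user/month, before electricity")  # ~$23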
lelanthran No.45112761
> You need 100+ GB of RAM and a top-of-the-line GPU to run legacy models at home. Maybe if you push it, that setup will let you handle two, maybe three people.

This doesn't seem correct. I run legacy models with only slightly reduced performance on 32 GB of RAM and a 12 GB VRAM GPU right now. BTW, that's not an expensive setup.
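
For what it's worth, a minimal sketch of the kind of setup I mean, assuming a quantized GGUF model served through llama-cpp-python with partial GPU offload (the model path and layer count are illustrative, not recommendations):

    # Sketch: run a quantized model on modest hardware by offloading only some
    # layers to a 12 GB GPU and keeping the rest in system RAM.
    # Assumes llama-cpp-python is installed and a GGUF file is on disk;
    # the path and layer count below are illustrative.
    from llama_cpp import Llama

    llm = Llama(
        model_path="models/example-13b.Q4_K_M.gguf",  # hypothetical local file
        n_gpu_layers=35,   # as many layers as fit in ~12 GB of VRAM
        n_ctx=4096,        # context window; bigger costs more memory
    )

    out = llm.create_chat_completion(
        messages=[{"role": "user", "content": "Summarise this note for the household: ..."}],
        max_tokens=256,
    )
    print(out["choices"][0]["message"]["content"])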

> You think anyone is going to make money on that vs $20 a month to Anthropic?

Why does it have to be run as a profit-making machine for other users? Run at home, it can be a useful service for the entire household. After all, we're not talking about specialised coding agents using this[1], just normal user requests.

====================================

[1] For a $1k outlay on a new GPU I can run a reduced-performance coding LLM. Once again, when it's only me using it, the economics work out. I don't need the agent to be fully autonomous because I'm not vibe coding - I can take the reduced-performance output, fix it, and use it.
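
A sketch of what that looks like in practice, assuming the model sits behind a local server with an OpenAI-compatible endpoint (llama.cpp's llama-server and Ollama both expose one); the URL and model name are placeholders:

    # Sketch: reuse a standard OpenAI-style client against a local server.
    # Assumes something like llama-server or Ollama is listening on localhost
    # with an OpenAI-compatible /v1 endpoint; URL and model name are placeholders.
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed-locally")

    resp = client.chat.completions.create(
        model="local-coding-model",  # whatever the local server calls it
        messages=[{"role": "user", "content": "Write a unit test for this function: ..."}],
        max_tokens=512,
    )
    print(resp.choices[0].message.content)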

jayd16 No.45118444
Plus, when you're hosting it yourself, you can be reckless with what you feed it. Pricing in the privacy gain, it seems like self-hosting would be worth the effort/cost.