
507 points by martinald | 4 comments
sc68cal:
This whole article is built on using DeepSeek R1, a premise I don't think is correct. DeepSeek is much more efficient, so I don't think it's a valid way to estimate what OpenAI's and Anthropic's costs are.

https://www.wheresyoured.at/deep-impact/

Basically, DeepSeek is _very_ efficient at inference, and that was the whole reason why it shook the industry when it was released.
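
Worth noting for scale: DeepSeek R1 is a mixture-of-experts model that activates only about 37B of its 671B parameters per token, per DeepSeek's own reports. A back-of-envelope FLOPs-per-token sketch (the parameter counts are from those reports; the 2-FLOPs-per-parameter figure is a standard rule of thumb, and this ignores attention and memory-bandwidth effects):

  # Rough sketch: inference FLOPs per generated token for a sparse MoE
  # model vs. a hypothetical dense model of the same total size.
  total_params  = 671e9  # total parameters across all experts (reported)
  active_params = 37e9   # parameters activated per token (reported)

  moe_flops   = 2 * active_params  # ~2 FLOPs per active param per token
  dense_flops = 2 * total_params   # what an equally large dense model would pay

  print(f"MoE:   {moe_flops:.1e} FLOPs/token")
  print(f"Dense: {dense_flops:.1e} FLOPs/token")
  print(f"Ratio: {dense_flops / moe_flops:.0f}x")

That roughly 18x gap is the core of the "efficient at inference" argument.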

phillipcarter:
Uhhh, I'm pretty sure DeepSeek shook the industry because of a 14x reduction in training cost, not inference cost.

We also don't know the per-token cost for OpenAI and Anthropic models, but I would be highly surprised if it were significantly more expensive than the open models anyone can download and run themselves. It's not like they aren't investing in inference research too.
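
Neither lab publishes its serving costs, but the shape of the estimate is simple. A sketch with entirely made-up numbers; the GPU price, replica size, and throughput below are assumptions, not anyone's reported figures:

  # Illustrative only: per-million-token serving cost from assumed
  # hardware price and throughput.
  gpu_hourly_cost   = 2.50  # $/GPU-hour (assumed rental price)
  gpus_per_replica  = 8     # assumed: one model replica per 8-GPU node
  tokens_per_second = 5000  # assumed aggregate throughput per replica

  cost_per_hour   = gpu_hourly_cost * gpus_per_replica
  tokens_per_hour = tokens_per_second * 3600
  print(f"~${cost_per_hour / tokens_per_hour * 1e6:.2f} per million tokens")

The disagreement upthread is really about the throughput term: closed and open models face the same formula, and both camps are working on pushing tokens-per-second up.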

gmd63:
DeepSeek was trained with distillation. Any accurate estimate of its training cost should include the training cost of the model it was distilled from.
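
The accounting question here is amortization: how much of the teacher's training run gets charged to the student. A sketch; the ~$5.6M student figure is DeepSeek's widely cited reported cost for the V3 base model's final run, while the teacher cost and the number of downstream uses it is spread across are purely hypothetical:

  # Hypothetical accounting sketch: amortizing a teacher model's training
  # cost into a distilled student's headline cost.
  student_run    = 5.6e6  # DeepSeek's reported final-run cost for V3
  teacher_run    = 100e6  # hypothetical teacher training cost
  amortized_over = 10     # hypothetical downstream models sharing that cost

  full_cost = student_run + teacher_run / amortized_over
  print(f"Headline ${student_run / 1e6:.1f}M vs. amortized ${full_cost / 1e6:.1f}M")

How far to push that amortization is exactly what the replies below dispute.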

ffsm8:
That makes the calculation nonsensical, because if you go there, you'd also have to include all the energy used to produce the content the other model providers trained on. Suddenly you're counting everyone's devices on which they wrote social-media comments, pretty much every server that has ever answered a request from OpenAI's, Google's, or Anthropic's crawlers, and so on.

Seriously, that claim was always completely disingenuous

gmd63:
I don't think it's that nonsensical to realize that in order to have AI, you need generations of artists, journalists, scientists, and librarians to produce materials to learn from.

And when you're using an actual AI model to "train" (copy) a new one, there's nothing nonsensical about treating the prior model as a core component of the training.

jaakl (replying to ffsm8):
Not just energy cost, but also licensing cost of all this content…