
507 points martinald | 1 comment
sc68cal ◴[] No.45053212[source]
This whole article is built on using DeepSeek R1, which is a premise I don't think is correct. DeepSeek is much more efficient, so I don't think it's a valid basis for estimating what OpenAI's and Anthropic's costs are.

https://www.wheresyoured.at/deep-impact/

Basically, DeepSeek is _very_ efficient at inference, and that was the whole reason why it shook the industry when it was released.

replies(7): >>45053283 #>>45053303 #>>45053401 #>>45053455 #>>45053507 #>>45053923 #>>45054034 #
1. GaggiX ◴[] No.45053303[source]
The "efficiency" meantioned in blog post you have linked is the price difference between Deepseek and o1, it doesn't mean that GPT-5 or other SOTA models are less efficient.