
DeepSeek-v3.1

(api-docs.deepseek.com)
776 points | wertyk | 1 comment
rsanek No.44980753
Looks to be the ~same intelligence as gpt-oss-120B, but about 10x slower and 3x more expensive?

https://artificialanalysis.ai/models/deepseek-v3-1-reasoning

mdp2021 No.44982171
> same intelligence as gpt-oss-120B

Let's hope not, because gpt-oss-120B can be dramatically moronic at times. My guess is that the MoE contains some very weak expert subnetworks.
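For context, in a mixture-of-experts layer a learned gate routes each input to only a few of the expert subnetworks, so if the router picks a weak expert for a given input, that output can be poor even when the model is strong overall. A minimal sketch of top-k expert routing (toy setup, all names and shapes hypothetical, not the actual gpt-oss-120B architecture):

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Toy mixture-of-experts layer: the gate scores every expert,
    only the top-k experts are evaluated, and their outputs are
    combined with the renormalized gate weights."""
    logits = x @ gate_w                 # one score per expert
    top = np.argsort(logits)[-k:]       # indices of the k best-scoring experts
    w = np.exp(logits[top] - logits[top].max())
    w /= w.sum()                        # softmax over the selected experts only
    return sum(wi * experts[i](x) for wi, i in zip(w, top))

# Toy setup: 4 "experts", each a fixed linear map on 8-dim inputs.
rng = np.random.default_rng(0)
d, n_exp = 8, 4
expert_mats = [rng.normal(size=(d, d)) for _ in range(n_exp)]
experts = [(lambda M: (lambda x: x @ M))(M) for M in expert_mats]
gate_w = rng.normal(size=(d, n_exp))

x = rng.normal(size=d)
y = moe_forward(x, gate_w, experts, k=2)
print(y.shape)
```

Because only k of the experts ever run per input, an unlucky routing decision degrades just that input's output, which is one way a sparse model can look inconsistent across prompts.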

Benchmarks can be a starting point, but you really have to see how the results work for you.