
511 points andy99 | 3 comments
oytis
The press release talks a lot about how the model was built, but very little about how its capabilities compare to other open models.
1. joot82
The model will be released in two sizes — 8 billion and 70 billion parameters [...]. The 70B version will rank among the most powerful fully open models worldwide. [...] In late summer, the LLM will be released under the Apache 2.0 License.

We'll find out in September whether that's true?

2. k__
I hope for DeepSeek R2, but I fear Llama 4.
3. oytis
Yeah, I was thinking more of a table with benchmark results.