
688 points crescit_eundo | 3 comments
1. cjbprime No.42144644
> I ran all the open models (anything not from OpenAI, meaning anything that doesn’t start with gpt or o1) myself using Q5_K_M quantization, whatever that is.

It's just a lossy compression of all of the parameters, probably not important, right?

2. loa_in_ No.42147420
Probably important when competing against unquantized models from OpenAI.
3. NiloCK No.42150126
Notably: there were other OpenAI models that weren't quantized but also performed poorly.