
688 points crescit_eundo | 2 comments
cjbprime No.42144644
> I ran all the open models (anything not from OpenAI, meaning anything that doesn’t start with gpt or o1) myself using Q5_K_M quantization, whatever that is.

It's just a lossy compression of all of the parameters, probably not important, right?
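Roughly, Q5_K_M stores each weight as a small integer (about 5 bits) plus shared per-block scale factors, so the original floats can only be reconstructed approximately. A minimal Python sketch of symmetric round-to-nearest block quantization; this illustrates the general idea only, not llama.cpp's actual Q5_K_M scheme (which uses nested super-blocks with separate scales and mins):

    import numpy as np

    def quantize_block(w: np.ndarray, bits: int = 5):
        """Map one block of float weights to signed ints plus a single scale."""
        qmax = 2 ** (bits - 1) - 1            # 15 for 5-bit signed values
        scale = float(np.abs(w).max()) / qmax
        scale = scale if scale > 0 else 1.0   # guard against an all-zero block
        q = np.clip(np.round(w / scale), -qmax - 1, qmax).astype(np.int8)
        return q, scale

    def dequantize_block(q: np.ndarray, scale: float) -> np.ndarray:
        """Reconstruct approximate floats; the rounding error is permanent."""
        return q.astype(np.float32) * scale

    rng = np.random.default_rng(0)
    block = rng.normal(0.0, 0.02, size=256).astype(np.float32)  # one weight block
    q, s = quantize_block(block)
    print("max abs error:", np.abs(block - dequantize_block(q, s)).max())

The irrecoverable rounding error is the "lossy compression" being joked about above: a quantized model plays with slightly perturbed weights.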

1. loa_in_ No.42147420
Probably important when competing against unquantized ones from OpenAI
2. NiloCK No.42150126
Notably: there were other OpenAI models that weren't quantized but still performed poorly.