
1311 points msoad | 1 comment
yodsanklai ◴[] No.35394809[source]
Total noob questions.

1. How does this compare with GPT-3?

2. Does it mean we could eventually run a system such as ChatGPT on a personal computer?

3. Could LLMs eventually replace Google (in the sense that answers could be correct 99.9% of the time), or is the tech inherently flawed?

replies(4): >>35394861 #>>35394958 #>>35395176 #>>35396372 #
1. retrac ◴[] No.35395176[source]
The largest LLaMA model, at ~65 billion parameters, is not quite as large as GPT-3, and probably not quite as well trained, but it's basically in the same class. Even the complete, non-quantized model can already be run with llama.cpp on ARM64 and x86_64 CPUs, assuming you have enough RAM (128 GB?).
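That 128 GB figure can be sanity-checked with back-of-the-envelope arithmetic: weight memory is roughly parameter count times bits per weight. A rough sketch (the 65B parameter count and bit widths are assumptions based on the comment; it ignores activation memory and the KV cache):

```python
def weight_memory_gib(n_params: float, bits_per_param: float) -> float:
    """GiB needed to hold just the model weights in RAM."""
    return n_params * bits_per_param / 8 / 2**30

LLAMA_65B = 65e9  # largest LLaMA-1 model, ~65 billion parameters

fp16 = weight_memory_gib(LLAMA_65B, 16)  # full 16-bit weights
q4 = weight_memory_gib(LLAMA_65B, 4)     # 4-bit quantization, which llama.cpp supports

print(f"fp16: ~{fp16:.0f} GiB, 4-bit: ~{q4:.0f} GiB")  # → fp16: ~121 GiB, 4-bit: ~30 GiB
```

The full-precision weights alone land near 121 GiB, which is why "128 GB?" is about the floor for the unquantized model, while 4-bit quantization brings it within reach of a 32–64 GB machine.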