1479 points sandslash | 1 comment
khalic No.44317209 [source]
His dismissal of smaller and local models suggests he underestimates their improvement potential. Give phi4 a run and see what I mean.
replies(5): >>44317248 #>>44317295 #>>44317350 #>>44317621 #>>44317716 #
diggan No.44317350 [source]
> suggests a lack of understanding of these smaller models capabilities

If anything, you're showing a lack of understanding of what he was talking about. The context is this specific point in time, where we're early in the ecosystem and things are expensive and likely centralized (à la mainframes). But if his analogy/prediction is correct, we'll have a "Linux" moment in the future where that equation changes (again) and local models become competitive.

And while I'm a huge fan of local models and run them for maybe 60-70% of what I do with LLMs, they're nowhere near the proprietary ones today, sadly. I want them to be, really badly, but it's important to be realistic here and recognize the difference between what a normal consumer can run and what the current "mainframes" can run.

replies(2): >>44317720 #>>44317744 #
khalic No.44317744 [source]
I edited my comment to make it clearer.