Ollama 0.4 is released with support for Meta's Llama 3.2 Vision models locally (ollama.com)
137 points by BUFU | 2 comments | 06 Nov 24 21:10 UTC
vasilipupkin | 06 Nov 24 23:06 UTC | No. 42070973
>>42069453 (OP)
How likely is it to run on a reasonably new Windows laptop?
replies(1): >>42071266
ac29 | 06 Nov 24 23:35 UTC | No. 42071266
>>42070973
With 16GB of RAM these vision models will run. How quickly depends on a lot of factors.
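For readers wondering what "running these vision models" looks like in practice, a minimal sketch using the Ollama CLI is below. The model tag and the image path are assumptions based on the 0.4 release (the example image path is hypothetical); on a laptop without a discrete GPU, inference falls back to CPU and will be correspondingly slower.

```shell
# Pull the 11B Llama 3.2 Vision model (tag assumed from the 0.4 release)
ollama pull llama3.2-vision

# Ask about a local image by including its path in the prompt
# (./photo.jpg is a placeholder for your own file)
ollama run llama3.2-vision "What is in this image? ./photo.jpg"
```

The larger 90B variant is also available as `llama3.2-vision:90b`, but it needs far more memory than a typical 16GB laptop provides.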