Ollama 0.4 is released with support for Meta's Llama 3.2 Vision models locally
(ollama.com)
137 points by BUFU | 06 Nov 24 21:10 UTC | 2 comments
1. inasring | 06 Nov 24 22:52 UTC | No. 42070824
>>42069453 (OP)
Can it run the quantized models?
replies(1): >>42071506
2. fallingsquirrel | 06 Nov 24 23:58 UTC | No. 42071506
>>42070824 (TP)
Supported quantizations: https://ollama.com/library/llama3.2-vision/tags
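
A minimal sketch of calling a quantized tag through the Ollama Python client (pip install ollama), assuming the Ollama 0.4 server is already running locally; the exact tag name below is an assumption for illustration, so check the tags page above for the quantizations that are actually published:

    import ollama

    # Tag name is an assumption; pick a real quantized tag from
    # https://ollama.com/library/llama3.2-vision/tags
    response = ollama.chat(
        model="llama3.2-vision:11b-instruct-q4_K_M",
        messages=[{
            "role": "user",
            "content": "Describe this image.",
            "images": ["example.jpg"],  # path to a local image file
        }],
    )
    print(response["message"]["content"])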