Ollama's new engine for multimodal models
(ollama.com)
344 points | LorenDB | 2 comments | 16 May 25 01:43 UTC
tommica | 16 May 25 05:10 UTC | No. 44002018
>>44001087 (OP)
Sidetangent: why is Ollama frowned upon by some people? I've never really gotten any explanation other than "you should run llama.cpp yourself".
replies(9): >>44002029 >>44002150 >>44002166 >>44002486 >>44002513 >>44002621 >>44004218 >>44005337 >>44006200
1.
gavmor | 16 May 25 05:45 UTC | No. 44002166
>>44002018
Here's a recent thread on Ollama hate from r/LocalLLaMA:
https://www.reddit.com/r/LocalLLaMA/comments/1kg20mu/so_why_...
replies(1): >>44006942
2.
kergonath | 16 May 25 15:55 UTC | No. 44006942
>>44002166 (TP)
r/LocalLLaMA is very useful, but also very susceptible to groupthink and to more or less astroturfed hype trains and mood swings. This drama needs to be taken in context; there is a lot of emotion and not much reason.