Ollama's new engine for multimodal models
(ollama.com)
343 points | LorenDB | 4 comments | 16 May 25 01:43 UTC
buyucu | 16 May 25 07:19 UTC | No. 44002611 | >>44001087 (OP)
[dead post] [flagged]
hexmiles | 16 May 25 10:26 UTC | No. 44003680 | >>44002611
If I understood it correctly: this time no, it is actually a new engine built by the Ollama team, independent of llama.cpp.
replies(2): >>44003746 >>44003749
1. buyucu | 16 May 25 10:37 UTC | No. 44003749 | >>44003680
I doubt it. Llama.cpp just added support for the same models a few weeks ago. Folks at ollama just did a git pull.
replies(2): >>44004061 >>44004120
2. magicalhippo | 16 May 25 11:28 UTC | No. 44004061 | >>44003749 (TP)
It's open source, you could have checked. It does indeed seem like the new engine cuts out llama.cpp, using the GGML library directly.
https://github.com/ollama/ollama/pull/7913
replies(1): >>44005517
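For anyone wondering what "using GGML directly" means in practice: GGML is a standalone C tensor library, and a runtime can build and run compute graphs against it with no llama.cpp layer in between. Below is a minimal sketch using the classic ggml.h C API. It is illustrative only, not Ollama's actual code: the linked PR binds GGML from Go, and newer GGML releases route computation through the ggml-backend API, so the exact calls may differ depending on the version you link against.

    // Minimal sketch: build and run a compute graph with the GGML C API directly,
    // with no llama.cpp layer in between. Uses the classic ggml.h interface;
    // newer releases push scheduling through the ggml-backend API instead.
    #include <stdio.h>
    #include "ggml.h"

    int main(void) {
        // GGML allocates tensors and graph metadata from one arena sized up front.
        struct ggml_init_params params = {
            /*.mem_size   =*/ 16 * 1024 * 1024,
            /*.mem_buffer =*/ NULL,
            /*.no_alloc   =*/ false,
        };
        struct ggml_context *ctx = ggml_init(params);

        // Two small F32 matrices sharing the inner dimension ne[0] = 4.
        struct ggml_tensor *a = ggml_new_tensor_2d(ctx, GGML_TYPE_F32, 4, 2);
        struct ggml_tensor *b = ggml_new_tensor_2d(ctx, GGML_TYPE_F32, 4, 3);
        ggml_set_f32(a, 1.0f);   // fill every element with 1.0
        ggml_set_f32(b, 2.0f);   // fill every element with 2.0

        // ggml_mul_mat contracts over ne[0], giving a 2x3 result here.
        struct ggml_tensor *c = ggml_mul_mat(ctx, a, b);

        // Record the ops into a graph and run the forward pass on the CPU.
        struct ggml_cgraph *gf = ggml_new_graph(ctx);
        ggml_build_forward_expand(gf, c);
        ggml_graph_compute_with_ctx(ctx, gf, /*n_threads=*/4);

        // Each output element is a dot product of four 1.0s with four 2.0s = 8.0.
        printf("c[0] = %f\n", ggml_get_f32_1d(c, 0));

        ggml_free(ctx);
        return 0;
    }

The point is only the shape of the API: context, tensors, graph, compute. Those are the pieces an engine binds to when it keeps just the tensor library and drops llama.cpp's model-loading and inference layer on top of it.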
3. 16 May 25 11:35 UTC | No. 44004120 | >>44003749 (TP)
4. buyucu | 16 May 25 13:49 UTC | No. 44005517 | >>44004061
seriously? who do you think develops ggml?
hint: it's the llama.cpp folks.