slacker news
I Self-Hosted Llama 3.2 with Coolify on My Home Server (geek.sg)
221 points | whitefables | 3 comments | 16 Oct 24 05:26 UTC
cranberryturkey | 16 Oct 24 11:06 UTC | No. 41857733 | >>41855886 (OP)
How is Coolify different from Ollama? Is it better? Worse? I like Ollama because I can pull models and it exposes a REST API, which is great for development.
replies(2): >>41858072, >>41858087
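(For context on the REST API the comment refers to: a minimal sketch of calling a local Ollama server's `/api/generate` endpoint from Python. The default `localhost:11434` address is Ollama's standard, but the model name and prompt here are placeholders; this assumes you have already pulled a model and have the server running.)

```python
import json
import urllib.request

# Default address for a local Ollama install; adjust if yours differs.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model, prompt):
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False requests a single JSON response instead of a
    stream of newline-delimited chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model, prompt, url=OLLAMA_URL):
    """POST a prompt to a running Ollama server and return its reply text."""
    body = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

Usage would be something like `generate("llama3.2", "Say hello")`, after an `ollama pull llama3.2`. This is exactly the development workflow the comment describes, and it is orthogonal to what Coolify does (deploying and managing apps), which is the distinction the replies below point out.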
1.
grahamj | 16 Oct 24 12:03 UTC | No. 41858087 | >>41857733
Might want to skim the article
replies(1): >>41858095
2.
cranberryturkey | 16 Oct 24 12:04 UTC | No. 41858095 | >>41858087 (TP)
I did. Just realized it's a totally different tool, for deploying apps.
replies(1): >>41858562
3.
grahamj | 16 Oct 24 12:57 UTC | No. 41858562 | >>41858095
FWIW, that was my reaction to the title too :D