I Self-Hosted Llama 3.2 with Coolify on My Home Server (geek.sg)
221 points | whitefables | 3 comments | 16 Oct 24 05:26 UTC
varun_ch | 16 Oct 24 07:27 UTC | No. 41856480
>>41855886 (OP)
I’m curious how good the performance of local LLMs is on ‘outdated’ hardware like the author’s 2060. I have a desktop with a 2070 Super that could be fun to turn into an “AI server” if I had the time…
replies(7): >>41856521 >>41856558 >>41856559 >>41856609 >>41856875 >>41856894 >>41857543
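For anyone wanting a concrete answer on their own card, a rough way to measure generation speed is to time the model directly. The sketch below is not from the thread or the article: it assumes an Ollama server hosting the model (a common runner for Llama 3.2, though not confirmed as the author's setup), and the `benchmark` helper, model name, and host URL are illustrative.

```python
# Hypothetical benchmark sketch: estimate tokens/sec of a locally hosted
# Llama 3.2 model. Assumes an Ollama server on its default port 11434;
# helper names and defaults here are illustrative, not from the thread.
import json
from urllib import request

def tokens_per_second(eval_count: int, eval_duration_ns: int) -> float:
    # Ollama's /api/generate response reports eval_count (output tokens)
    # and eval_duration (nanoseconds spent generating them).
    return eval_count / (eval_duration_ns / 1e9)

def benchmark(prompt: str, model: str = "llama3.2",
              host: str = "http://localhost:11434") -> float:
    body = json.dumps({"model": model, "prompt": prompt,
                       "stream": False}).encode()
    req = request.Request(f"{host}/api/generate", data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        stats = json.load(resp)
    return tokens_per_second(stats["eval_count"], stats["eval_duration"])
```

A single `benchmark("Hello")` call gives one data point; averaging over several prompts and context lengths gives a steadier figure, since quantization level and context size dominate the result on older GPUs.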
1. whitefables | 16 Oct 24 07:40 UTC | No. 41856559
>>41856480
Here's how it looks in real time: https://youtu.be/3vhJ6fNW8AI
replies(1): >>41856601
2. thisguyagain | 16 Oct 24 07:48 UTC | No. 41856601
>>41856559 (TP)
What’d you use to record that? Looks really great.
replies(1): >>41856691
3. whitefables | 16 Oct 24 08:01 UTC | No. 41856691
>>41856601
Screen Studio