
taosx No.41856567
For the people who self-host LLMs at home: what use cases do you have?

Personally, I have some notes and bookmarks that I'd like to scrape, have an LLM summarize and generate hierarchical tags for, and then store in a database. I wouldn't want to hand the notes over to another provider, and even for the bookmarks I wouldn't be comfortable sharing my reading profile with anyone.
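
For what it's worth, a minimal sketch of that pipeline, assuming a local Ollama server on its default port and an illustrative model name, prompt, and SQLite schema (the scraping step is stubbed out with a placeholder file):

    # Summarize a note with a local LLM and store the result in SQLite.
    # Assumes Ollama is running at its default endpoint; model name,
    # prompt wording, and schema are illustrative, not prescriptive.
    import sqlite3
    import requests

    OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint

    def summarize_and_tag(text):
        prompt = (
            "Summarize the following note in 2-3 sentences, then list "
            "hierarchical tags as 'parent/child' paths, one per line.\n\n" + text
        )
        resp = requests.post(
            OLLAMA_URL,
            json={"model": "mistral", "prompt": prompt, "stream": False},
            timeout=300,
        )
        resp.raise_for_status()
        return resp.json()["response"]

    def main():
        db = sqlite3.connect("notes.db")
        db.execute(
            "CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, "
            "source TEXT, summary_and_tags TEXT)"
        )
        note = open("example_note.txt").read()  # placeholder for real scraping
        db.execute(
            "INSERT INTO notes (source, summary_and_tags) VALUES (?, ?)",
            ("example_note.txt", summarize_and_tag(note)),
        )
        db.commit()

    if __name__ == "__main__":
        main()

Everything stays on the local machine: the model runs behind localhost and the summaries land in a local SQLite file, which is the whole point of not sending notes to a hosted provider.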

replies(11): >>41856653 #>>41856701 #>>41856881 #>>41856970 #>>41856992 #>>41857395 #>>41858199 #>>41858353 #>>41861443 #>>41864562 #>>41890288 #
ein0p No.41856992
I run Mistral Large on 2x A6000. Nine times out of ten the response quality matches GPT-4o. My employer doesn't allow the use of GPT for privacy-related reasons, so I just use a private Mistral instead.
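
A rough sketch of this kind of setup, assuming the model is served locally with vLLM behind its OpenAI-compatible endpoint; the serve command and model path are illustrative, and Mistral Large would typically need a quantized checkpoint to fit in 2x48 GB of VRAM:

    # Query a locally served model through an OpenAI-compatible endpoint.
    # Assumes a vLLM server was started separately, e.g. (placeholder path):
    #   vllm serve /models/mistral-large-awq --quantization awq --tensor-parallel-size 2
    from openai import OpenAI

    # Local server: no real API key is needed, but the client requires a value.
    client = OpenAI(base_url="http://localhost:8000/v1", api_key="unused")

    reply = client.chat.completions.create(
        model="/models/mistral-large-awq",  # must match the served model name
        messages=[{"role": "user", "content": "Summarize this internal document: ..."}],
    )
    print(reply.choices[0].message.content)

--tensor-parallel-size 2 splits the model across both GPUs, and because the endpoint speaks the OpenAI API, existing GPT-based tooling can be pointed at it by changing only the base URL.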