899 points georgehill | 4 comments
samwillis ◴[] No.36216196[source]
ggml and llama.cpp are such a good platform for local LLMs; having some financial backing to support development is brilliant. We should be concentrating as much as possible on doing local inference (and training) based on private data.

I want a local ChatGPT fine-tuned on my personal data, running on my own device, not in the cloud. Ideally open source too; llama.cpp is looking like the best bet to achieve that!
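To make "local inference" concrete, here is a minimal sketch assuming the llama-cpp-python bindings on top of llama.cpp; the model path and prompt are placeholders, not anything from the thread:

    # Minimal local-inference sketch (pip install llama-cpp-python).
    # The model path is a placeholder for any quantized ggml-format model.
    from llama_cpp import Llama

    llm = Llama(model_path="./models/7B/ggml-model-q4_0.bin", n_ctx=2048)

    # Everything runs on the local machine; no data leaves the device.
    output = llm(
        "Q: Summarise my notes on project X. A:",
        max_tokens=128,
        stop=["Q:"],
    )
    print(output["choices"][0]["text"])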

replies(6): >>36216377 #>>36216465 #>>36216508 #>>36217604 #>>36217847 #>>36221973 #
behnamoh ◴[] No.36216508[source]
I wonder if ClosedAI and other companies use the findings of the open-source community in their products. For example, do they use QLoRA to reduce the costs of training and inference? Do they quantize their models to serve non-subscribing consumers?
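For anyone unfamiliar with what quantization buys you, here is a rough illustrative sketch of block-wise 4-bit (absmax) quantization, loosely in the spirit of ggml's Q4_0 format; real implementations pack two 4-bit values per byte and differ in details, and nothing here is a claim about how OpenAI actually serves models:

    # Illustrative block-wise 4-bit quantization (symmetric / absmax).
    import numpy as np

    def quantize_q4(weights: np.ndarray, block_size: int = 32):
        """Quantize a 1-D float array to 4-bit integers, one scale per block."""
        blocks = weights.reshape(-1, block_size)
        # Map each block's largest magnitude to the int range [-8, 7].
        scales = np.abs(blocks).max(axis=1, keepdims=True) / 7.0
        scales[scales == 0] = 1.0  # avoid divide-by-zero on all-zero blocks
        q = np.clip(np.round(blocks / scales), -8, 7).astype(np.int8)
        return q, scales

    def dequantize_q4(q: np.ndarray, scales: np.ndarray) -> np.ndarray:
        return (q.astype(np.float32) * scales).reshape(-1)

    w = np.random.randn(4096).astype(np.float32)
    q, s = quantize_q4(w)
    w_hat = dequantize_q4(q, s)
    print("mean abs error:", np.abs(w - w_hat).mean())
    print("fp32 bytes:", w.nbytes, "-> ~4-bit bytes:", q.size // 2 + s.nbytes)

The memory saving (roughly 8x versus fp32, at the cost of a small reconstruction error) is what makes serving large models on cheaper hardware attractive.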
replies(2): >>36216688 #>>36217149 #
danielbln ◴[] No.36216688[source]
Not disagreeing with your points, but saying "ClosedAI" is about as clever as writing M$ for Microsoft back in the day, which is to say not very.
replies(4): >>36216958 #>>36217145 #>>36218362 #>>36218979 #
1. rafark ◴[] No.36217145[source]
I think it’s ironic that M$ made ClosedAI.
replies(1): >>36218112 #
2. replygirl ◴[] No.36218112[source]
Pedantic, but that's not irony.
replies(1): >>36220087 #
3. rafark ◴[] No.36220087[source]
Why do you think so? According to the dictionary, "ironic" can describe something paradoxical or weird.
replies(1): >>36220713 #
4. nl ◴[] No.36220713{3}[source]
Well it's not paradoxical?

If one is the kind of person who writes M$, then it's pretty much expected behaviour.