    899 points georgehill | 11 comments
    samwillis ◴[] No.36216196[source]
    ggml and llama.cpp are such a good platform for local LLMs, having some financial backing to support development is brilliant. We should be concentrating as much as possible to do local inference (and training) based on privet data.

    I want a local ChatGPT fine-tuned on my personal data running on my own device, not in the cloud. Ideally open source too; llama.cpp is looking like the best bet to achieve that!

    replies(6): >>36216377 #>>36216465 #>>36216508 #>>36217604 #>>36217847 #>>36221973 #
    1. rvz ◴[] No.36216465[source]
    > ggml and llama.cpp are such a good platform for local LLMs, having some financial backing to support development is brilliant

    The problem is, this financial backing and support is via VCs, who will steer the project to close it all up again.

    > I want a local ChatGPT fine-tuned on my personal data running on my own device, not in the cloud. Ideally open source too; llama.cpp is looking like the best bet to achieve that!

    I think you are setting yourself up for disappointment in the future.

    replies(3): >>36216838 #>>36217184 #>>36219154 #
    2. ulchar ◴[] No.36216838[source]
    > The problem is, this financial backing and support is via VCs, who will steer the project to close it all up again.

    How exactly could they meaningfully do that? Genuine question. The issue with the OpenAI business model is that collaboration within academia and open source circles is creating innovations that are on track to outpace the closed source approach. Does OpenAI have deep enough pockets to buy out the open source collaborators and researchers?

    I'm truly cynical about many aspects of the tech industry but this is one of those fights that open source could win for the betterment of everybody.

    replies(2): >>36217177 #>>36217454 #
    3. maxilevi ◴[] No.36217177[source]
    I agree with the spirit, but saying that open source is on track to outpace OpenAI in innovation is just not true. Open source models are being compared to GPT-3.5; none yet even get close to GPT-4 quality, and OpenAI finished training that last year.
    replies(1): >>36218569 #
    4. jdonaldson ◴[] No.36217184[source]
    > I think you are setting yourself up for disappointment in the future.

    Why would you say that?

    replies(1): >>36237909 #
    5. yyyk ◴[] No.36217454[source]
    I've been going on and on about this on HN: open source can win this fight, but I think OSS is overconfident. We need to be clear that there are serious challenges ahead. ClosedAI and other corporations also have a plan, one with good chances unless properly countered:

    A) Embed the OpenAI (etc.) API everywhere. Make embedding easy and trivial. First, to gain a small API/install moat (user/dev: 'why install an OSS model when OpenAI is already available with an OS API?'). If it's easy to use OpenAI but not open source, they have an advantage. Second, to gain brand. But more importantly:

    B) Gain a technical moat by having a permanent data advantage using the existing install base (see above). Retune constantly to keep it.

    C) Combine with existing proprietary data stores to increase the local data advantage (e.g. easy access to all your Office 365/GSuite documents, while OSS gets the scary permission prompts).

    D) Combine with existing proprietary moats to mutually reinforce.

    E) Use selective copyright enforcement to increase data advantage.

    F) Lobby legislators for limits that make competition (open or closed source) way harder.

    TL;DR: OSS is probably catching up on algorithms. When it comes to good data and good integrations, OSS is far behind and not yet catching up. It's been argued that OpenAI's entire performance advantage is due to better data alone, and they intend to keep that advantage.

    replies(1): >>36218897 #
    6. jart ◴[] No.36218569{3}[source]
    We're basically surviving off the scraps companies like Facebook have been tossing off the table, like LLaMA. The fact that we're even allowed and able to use these things ourselves, at all, is a tremendous victory.
    replies(1): >>36218687 #
    7. maxilevi ◴[] No.36218687{4}[source]
    I agree
    8. ljlolel ◴[] No.36218897{3}[source]
    Don’t forget chip shortages. That supply chain is centralized through Nvidia, TSMC, and ASML.
    9. ignoramous ◴[] No.36219154[source]
    > The problem is, this financial backing and support is via VCs, who will steer the project to close it all up again.

    A matter of when, not if. I mean, the website itself makes that much clear:

      The ggml way

        ...

        Open Core

        The library and related projects are freely available under the MIT license... In the future we may choose to develop extensions that are licensed for commercial use

        Explore and have fun!

        ... Contributors are encouraged to try crazy ideas, build wild demos, and push the edge of what's possible

    So, like many other "open core" devtools out there, they'd like to have their cake and eat it too. And they may well manage it, like others before them.

    Won't blame anyone here though, because clearly, if you're as good as Georgi Gerganov, why do it for free?

    replies(1): >>36223453 #
    10. ukuina ◴[] No.36223453[source]
    Sounds like the SQLite model, which has been a net positive for the computing world.
    11. rvz ◴[] No.36237909[source]
    Never expect such promises to go your way, especially when VCs, angels, etc. are able to control the project with their opaque term sheets, which is why I am skeptical of this. Accepting VC or angel investment cash is no different from having another boss.

    I expect high hopes like that to end in disappointment for the 'community', since the VCs' interest will now be in heading for the exit. Their actions will speak louder than what they say on the website.