
16 points by c990802 | 1 comment

Hey HN! We are happy to share with you our latest project: Llama Workspace (https://llamaworkspace.ai).

It's an open-source ChatGPT Teams alternative made with the needs of businesses and organizations in mind.

It has everything you'd expect from ChatGPT, plus first-class support for user and role management, as well as advanced collaboration features. We've made it very easy to self-host anywhere you like, and we also provide a cloud version, so there's no excuse not to try it out!

Why use Llama Workspace instead of ChatGPT Teams?

- You have access to all the major large language models in one place, including GPT-4, Claude Sonnet, and Gemini.

- It brings savings of about 70-85% compared to ChatGPT Teams (see the website for more on this).

- You can integrate it with your own code, such as AI agents, and provide a single platform for accessing AI.

Here's the GitHub link: https://github.com/llamaworkspace/llamaworkspace

I would love to hear your ideas, experience, and feedback about the product! What should we implement next?

1. ganeshkrishnan (No.41910567):
>Llama Workspace connects to AI providers like OpenAI and Anthropic via API. In the API, each word for the most commonly used model (GPT-4o at the time of writing) costs approximately $0.000025. Therefore, to break even with the ChatGPT Teams license, which costs $30/seat/month, you would need to generate 1,200,000 words.
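The break-even figure quoted above is simple arithmetic; a minimal sketch, using only the numbers stated in the quote ($0.000025 per word, $30/seat/month):

```python
# Back-of-the-envelope break-even math from the quoted claim.
# Both figures are taken from the quote above, not independently verified.
COST_PER_WORD = 0.000025   # approximate API cost per word for GPT-4o (quoted)
TEAMS_SEAT_COST = 30.0     # ChatGPT Teams price, $/seat/month (quoted)

break_even_words = TEAMS_SEAT_COST / COST_PER_WORD
print(f"Break-even: {break_even_words:,.0f} words/month")
```

This reproduces the 1,200,000-words-per-month figure; the comment below argues why real usage can blow past it.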

I would debate this one. As a "normal" ChatGPT user asking around 10-20 questions a day, I can exceed $30 per month because:

1) Billing is per token, not per word, and many words are split into multiple tokens.

2) For each question in a chat, all the preceding context is re-sent and counted toward the cost.
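Point 2) is the one that makes costs grow faster than linearly: if every turn re-sends the full history, total input tokens billed over a conversation grow roughly quadratically with its length. A hedged sketch with illustrative numbers (tokens per turn and per-token price are assumptions, not measured figures):

```python
# Illustration of the commenter's point 2): re-sending conversation history
# makes cumulative input-token cost grow quadratically with chat length.
# All numbers below are illustrative assumptions.
TOKENS_PER_TURN = 200            # assumed avg tokens per question+answer pair
PRICE_PER_INPUT_TOKEN = 2.5e-6   # assumed ~$2.50 per 1M input tokens

def chat_input_cost(num_turns: int) -> float:
    """Total input-token cost when each turn re-sends all prior turns."""
    total_tokens = 0
    for turn in range(1, num_turns + 1):
        # turn N sends (N-1) turns of history plus the new question
        total_tokens += turn * TOKENS_PER_TURN
    return total_tokens * PRICE_PER_INPUT_TOKEN

print(f"10-turn chat: ${chat_input_cost(10):.4f}")
print(f"50-turn chat: ${chat_input_cost(50):.4f}")
```

A 50-turn chat costs about 25x as much as a 10-turn chat under these assumptions (not 5x), which is why per-word estimates understate heavy interactive use.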

I think pivoting toward self-hosting Llama models and charging for them might be more profitable for both you and your users.