>Llama Workspace connects to AI providers like OpenAI and Anthropic via API. In the API, each word for the most commonly used model (GPT-4o at the time of writing) costs approximately $0.000025. Therefore, to break even with the ChatGPT Teams license, which costs $30/seat/month, you would need to generate 1,200,000 words.
I would debate this one. As a "normal" user of ChatGPT asking around 10-20 questions a day, I can exceed $30 per month because:
1) pricing is per token, not per word, and a single word is often split into multiple tokens.
2) for each question in a chat, the entire preceding context is re-sent and billed again.
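Point 2 is the bigger effect: because the whole conversation is re-sent on every turn, billed usage grows quadratically with the number of questions in a chat. A rough sketch, using the ~$0.000025/word figure quoted above and an assumed 200 words per question-plus-answer turn (both illustrative, not exact API prices):

```python
# Illustrative cost model for point (2): every question re-sends the
# conversation so far, so billed words grow quadratically per chat.
PRICE_PER_WORD = 0.000025   # approximate per-word cost quoted above
WORDS_PER_TURN = 200        # assumed words per question + answer (assumption)

def monthly_cost(questions_per_day: int, days: int = 30) -> float:
    """Estimate monthly spend if each day is one long chat."""
    total_words = 0
    for _ in range(days):
        context = 0  # words accumulated in that day's chat
        for _ in range(questions_per_day):
            context += WORDS_PER_TURN
            total_words += context  # the whole context is re-sent and billed
    return total_words * PRICE_PER_WORD
```

Under these assumptions, 20 questions a day already comes to about $31.50/month, versus roughly $3 if only the new words were billed each turn.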
I think pivoting towards self-hosting Llama models and charging for them might be more profitable for both you and your users.