What would be interesting to see is what those kids produced with their vibe coding.
GitHub Copilot has a free tier.
Google gives you thousands of free LLM API calls per day.
There are other free providers too.
Sure, nobody can predict the long-term economics with certainty but companies like OpenAI already have compelling business fundamentals today. This isn’t some scooter startup praying for margins to appear; it’s a platform with real, scaled revenue and enterprise traction.
But yeah, tell me more about how my $200/mo plan is bankrupting them.
You are correct that some providers might reduce prices for market capture, but the alternatives are still cheap, and some are close to the closed API providers in quality.
Not everyone has to pay that cost: some companies release weights for download and local use (like Llama), and others go further and release fully open-source models plus weights (like OLMo). If you're a provider hosting those, I don't think it makes sense to factor the training cost into your own infrastructure planning.
Although I personally don't think it makes much sense, it seemingly makes sense for other companies.
As LLM developers keep adding unique features to their APIs, the de facto shared API (currently OpenAI's) will only cover the minimal common subset, and many providers will probably deprecate their compatibility endpoints. Devs will have to rely on SDKs to offer compatibility, as in the sketch below.
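For illustration, the compatibility story today mostly looks like this: one SDK pointed at whatever provider exposes an OpenAI-style endpoint. A minimal sketch, with a made-up base URL and model name:

```python
# Sketch of the "one SDK, many providers" pattern; the endpoint, key,
# and model name below are placeholders, not a real provider's values.
from openai import OpenAI

client = OpenAI(
    base_url="https://some-provider.example.com/v1",  # any OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",
)

resp = client.chat.completions.create(
    model="some-open-weight-model",  # whatever the provider calls its model
    messages=[{"role": "user", "content": "Hello"}],
)
print(resp.choices[0].message.content)
```

The moment a provider's interesting features stop fitting that shape, you're back to their native SDK.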
It's going to get wild when the tech bro investors demand that ads be included in responses.
It would be trivial to build a version of AdWords where someone pays to have response words replaced: “car” replaced by “Honda”, variable names like “index” by “this_index_variable_is_sponsored_by_coinbase”, etc.
I’m trying to be funny with the last one, but something like this is coming sooner rather than later. Remember, Google search used to be good and was ruined by bonus-seeking executives.
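To be concrete about how cheap the lazy version would be, here's a toy sketch (the word-to-sponsor mapping is obviously made up):

```python
# Toy sketch of the "AdWords for completions" idea: naive post-processing
# that swaps plain words for sponsored ones. The mapping is made up.
import re

SPONSORED = {
    "car": "Honda",
    "index": "this_index_variable_is_sponsored_by_coinbase",
}

def monetize(text: str) -> str:
    # Whole-word, case-insensitive replacement; everything else untouched.
    pattern = re.compile(r"\b(" + "|".join(map(re.escape, SPONSORED)) + r")\b", re.IGNORECASE)
    return pattern.sub(lambda m: SPONSORED[m.group(0).lower()], text)

print(monetize("Loop over the index to find the best car."))
```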
But the majority of them are serving at roughly the same price, and that matches the raw cost plus some profit if you actually look into what it takes to serve those models. And those prices are still cheap.
So yeah, I stand by what I wrote, "most likely" included.
My main answer was "no, ..." because the gp post was only considering the closed providers (oai, anthropic, goog, etc). But you can get open-weight models pretty cheap, and they are pretty close to SotA, depending on your needs.
At least for now, LLM APIs are just JSONs with a bunch of prompts/responses in them and maybe some file URLs/IDs.
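A rough sketch of what actually goes over the wire, using the widely copied OpenAI-style field names (the endpoint and key here are placeholders):

```python
# What a chat call boils down to: POST a JSON body of messages,
# get a JSON body with choices back. Endpoint and key are placeholders.
import json, urllib.request

payload = {
    "model": "some-model",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize this file."},
    ],
}

req = urllib.request.Request(
    "https://api.example.com/v1/chat/completions",  # placeholder URL
    data=json.dumps(payload).encode(),
    headers={"Authorization": "Bearer YOUR_KEY", "Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    reply = json.load(resp)
print(reply["choices"][0]["message"]["content"])
```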
So? That's true for search as well, and yet Google has been top-dog for decades in spite of having worse results and a poorer interface than almost all of the competition.
They don't have to retrain constantly, and that's where opinions like yours fall short. I don't believe anyone has a concrete vision of the economics over the medium to long term. It's biased ignorance to hold a strong position on either the down or the up case.