I don't get the sky-high valuations of LLM companies. I get that these labs need a lot of money for compute to train the next generation of models. But distillation makes it easy for other providers to replicate those gains at a much lower cost.
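For context, "distillation" here means training a smaller or cheaper student model to imitate a stronger teacher's outputs. A minimal sketch of the classic logit-matching loss (PyTorch; the function name, temperature, and usage lines are illustrative, and in practice API-based distillation usually trains on the teacher's sampled text rather than its logits):

    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, temperature=2.0):
        # Soften both distributions so the student matches the teacher's
        # full output distribution, not just the hard argmax labels.
        student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
        teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
        # KL divergence, scaled by T^2 as in Hinton et al. (2015).
        return F.kl_div(student_log_probs, teacher_probs,
                        reduction="batchmean") * (temperature ** 2)

    # Hypothetical training step:
    # teacher_logits = teacher(batch).detach()
    # loss = distillation_loss(student(batch), teacher_logits)
    # loss.backward(); optimizer.step()

The point being: the expensive part (the teacher) only has to exist once, and anyone who can query it can transfer much of its capability into a cheaper model.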
On a long enough timeframe, open-source models will catch up to the proprietary ones, and inference providers will beat the proprietary companies on price.