Making LeCun report to Wang was the most boneheaded move imaginable. But… I suppose Zuckerberg knows what he wants, which is AI slopware and not truly groundbreaking foundation models.
LLMs cannot deliver on any of the major claims made for them, so competing at the current frontier is a massive waste of resources.
Right now a locally running 8B model with a large context window (10k+ tokens) easily beats Google/OpenAI models on any task you like.
Why would anyone pay for something that can be run on consumer hardware with higher token/second throughput and better performance? What exactly have the billions invested given Google/OpenAI in return? Nothing more than an existential crisis, I'd say.
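For what it's worth, the token/second comparison is easy to run yourself. A minimal sketch, assuming you wrap whatever backend you use (a local llama.cpp/Ollama endpoint or a cloud API) in a callable that returns the generated tokens — the `fake_local_generate` stub below is purely hypothetical and just simulates a response:

```python
import time

def measure_throughput(generate, prompt):
    """Time one generation call and return tokens per second.

    `generate` is any callable taking a prompt and returning a list of
    tokens -- e.g. a thin wrapper around a local or hosted model API.
    """
    start = time.perf_counter()
    tokens = generate(prompt)
    elapsed = time.perf_counter() - start
    return len(tokens) / elapsed

# Hypothetical stand-in for a real model call; swap in your own client.
def fake_local_generate(prompt):
    time.sleep(0.05)           # simulate generation latency
    return prompt.split() * 10  # simulate a multi-token completion

tps = measure_throughput(fake_local_generate, "hello world example prompt")
print(f"{tps:.0f} tokens/sec")
```

Point the same harness at two backends with the same prompt and you have an apples-to-apples throughput number, which is the only honest way to back up (or refute) a claim like the one above.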
Companies don't dishonestly force AI costs into their subscription models when they've got a winning product.