Remember the revolutionary, seemingly inevitable tech that was poised to rewrite how humans thought about transportation? The incredible hype, the secretive meetings to unveil the device? The one that turned out to be the self-balancing scooter known as the Segway?
2. Segways were just ahead of their time: portable, lithium-ion-powered urban personal transportation is getting pretty big now.
The Segway always had a high barrier to entry. ChatGPT, by contrast, currently doesn't even require an account, and everyone already has a Google account.
It is even cheaper to serve an LLM answer than call a web search API!
Zero chance all the users evaporate unless something much better comes along, the tech is banned, etc.
> It is even cheaper to serve an LLM answer than call a web search API
These, uhhhh, these are some rather extraordinary claims. Got some extraordinary evidence to go along with them?
Anecdotally, the locally-run AI software I develop has gotten more than 100x faster in the past year thanks to hardware advancements and Moore's law.
And nothing I've seen about recent GPUs or TPUs, from ANY maker (Nvidia, AMD, Google, Amazon, etc.), says anything about general speedups of 100x. Heck, even if you go across multiple generations of these still very new hardware categories (for example, Amazon's Inferentia/Trainium), the vendors' own claims, which are quite bold, would probably put the most recent generations at best at 10x the first. And as we all know, all vendors exaggerate the performance of their products.
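A quick back-of-envelope check makes the point concrete (illustrative arithmetic only, not benchmark data; the "3 generations" figure is a hypothetical example):

```python
# How much speedup can hardware alone plausibly deliver in one year?

# A Moore's-law-style doubling every ~2 years implies this annual factor:
moore_annual = 2 ** (1 / 2)  # ~1.41x per year

# Even taking a bold vendor claim of 10x across, say, 3 hardware
# generations (hypothetical numbers), the per-generation factor is:
per_generation = 10 ** (1 / 3)  # ~2.15x per generation

print(f"Moore's law, one year: ~{moore_annual:.2f}x")
print(f"10x over 3 generations: ~{per_generation:.2f}x each")
```

Either way, hardware alone gets nowhere near 100x in a single year; the rest of such a speedup would have to come from software (quantization, better kernels, and so on).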